WorldWideScience

Sample records for automated network analysis

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis, such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  2. Automated drawing of network plots in network meta-analysis.

    Science.gov (United States)

    Rücker, Gerta; Schwarzer, Guido

    2016-03-01

In systematic reviews based on network meta-analysis, the network structure should be visualized. Network plots have often been drawn by hand using generic graphical software. A typical way of drawing networks, also implemented in statistical software for network meta-analysis, is a circular representation, often with many crossing lines. We use methods from graph theory to generate network plots in an automated way. We give a number of requirements for graph drawing and present an algorithm that fits prespecified ideal distances between the nodes representing the treatments. The method was implemented in the function netgraph of the R package netmeta and applied to a number of networks from the literature. We show that graph representations with a small number of crossing lines are often preferable to circular representations. PMID:26060934
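
The paper's actual implementation is the R function netgraph in the netmeta package; as a rough Python analogue of the same idea, a stress-minimizing layout such as Kamada-Kawai fits prespecified ideal pairwise distances. In the sketch below, the treatment names, trial counts, and distance rule are invented for illustration.

```python
# Hypothetical analogue of the layout idea (not the authors' R code): fit
# ideal pairwise distances between treatment nodes with a stress layout.
import networkx as nx
import matplotlib.pyplot as plt

comparisons = [("placebo", "A", 4), ("placebo", "B", 2),
               ("A", "B", 3), ("A", "C", 1), ("B", "C", 2)]

G = nx.Graph()
for t1, t2, n_trials in comparisons:
    # Better-studied comparisons get shorter ideal distances.
    G.add_edge(t1, t2, dist=1.0 / n_trials)

# Kamada-Kawai minimizes a stress function over the ideal distances and tends
# to produce far fewer crossing lines than a circular arrangement.
ideal = dict(nx.shortest_path_length(G, weight="dist"))
pos = nx.kamada_kawai_layout(G, dist=ideal)

nx.draw_networkx(G, pos, node_color="lightsteelblue", node_size=900)
plt.axis("off")
plt.show()
```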

  3. Automated condition classification of a reciprocating compressor using time frequency analysis and an artificial neural network

    Science.gov (United States)

    Lin, Yih-Hwang; Wu, Hsien-Chang; Wu, Chung-Yung

    2006-12-01

The purpose of this study is to develop an automated system for condition classification of a reciprocating compressor. Various time-frequency analysis techniques are examined for decomposition of the vibration signals. Because a time-frequency distribution is a 3D data map, data reduction is indispensable for subsequent analysis. The extraction of the system characteristics using three indices, namely the time index, frequency index, and amplitude index, is presented and examined for applicability. A probabilistic neural network is applied for automated condition classification using a combination of the three indices. The study reveals that a proper choice of the index combination and the time-frequency band can provide excellent classification accuracy for the machinery conditions examined in this work.
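
As a loose sketch of the pipeline this record describes (time-frequency decomposition, reduction to three indices, probabilistic-neural-network classification), the Python snippet below uses a spectrogram, centroid-style indices, and a Parzen-window PNN. The index definitions, synthetic signals, and labels are invented stand-ins, not the authors' exact formulations.

```python
# Sketch: spectrogram -> three summary indices -> Parzen-window PNN.
import numpy as np
from scipy.signal import spectrogram

def indices(x, fs):
    """Reduce a time-frequency map to (time, frequency, amplitude) indices."""
    f, t, S = spectrogram(x, fs=fs, nperseg=256)
    p = S / S.sum()                          # normalized energy distribution
    time_idx = (p.sum(axis=0) * t).sum()     # temporal centroid of energy
    freq_idx = (p.sum(axis=1) * f).sum()     # spectral centroid of energy
    amp_idx = S.max()                        # peak time-frequency amplitude
    return np.array([time_idx, freq_idx, amp_idx])

def pnn_classify(train_X, train_y, x, sigma=0.5):
    """PNN decision: the largest Parzen-window class likelihood wins."""
    scores = {c: np.mean(np.exp(-((train_X[train_y == c] - x) ** 2)
                                .sum(axis=1) / (2 * sigma ** 2)))
              for c in np.unique(train_y)}
    return max(scores, key=scores.get)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
healthy = [np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
           for _ in range(10)]
faulty = [np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(t.size)
          for _ in range(10)]
X = np.array([indices(x, fs) for x in healthy + faulty])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # put the three indices on one scale
y = np.array([0] * 10 + [1] * 10)
print(pnn_classify(X, y, X[15]))           # a faulty example -> 1
```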

  4. Network based automation for SMEs

    DEFF Research Database (Denmark)

    Shahabeddini Parizi, Mohammad; Radziwon, Agnieszka

    2016-01-01

The implementation of appropriate automation concepts which increase productivity in Small and Medium Sized Enterprises (SMEs) requires a lot of effort, due to their limited resources. Therefore, it is strongly recommended for small firms to open up to external sources of knowledge, which could be obtained through network interaction. Based on two extreme cases of SMEs representing low-tech industry and an in-depth analysis of their manufacturing facilities, this paper presents how collaboration between firms embedded in a regional ecosystem could result in implementation of new... Moreover, this paper develops and discusses a set of guidelines for systematic productivity improvement within an innovative collaboration in regards to automation processes in SMEs.

  5. Automated minimax design of networks

    DEFF Research Database (Denmark)

    Madsen, Kaj; Schjær-Jacobsen, Hans; Voldby, J

    1975-01-01

A new gradient algorithm for the solution of nonlinear minimax problems has been developed. The algorithm is well suited for automated minimax design of networks and it is very simple to use. It compares favorably with recent minimax and least-pth algorithms. General convergence problems related to minimax design of networks are discussed. Finally, minimax design of equalization networks for reflection-type microwave amplifiers is carried out by means of the proposed algorithm.
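
The 1975 algorithm itself is not reproduced in this record, but the problem it solves, minimizing the maximum of several error functions, can be illustrated with the standard epigraph reformulation: minimize t subject to f_i(x) <= t for all i. The sketch below solves an invented one-pole network-fitting toy with SciPy's SLSQP; the response model and frequency grid are assumptions for illustration only.

```python
# Epigraph reformulation of a minimax design problem (not the paper's method):
#   minimize_x max_i f_i(x)  ==>  minimize_(x,t) t  s.t.  f_i(x) <= t.
import numpy as np
from scipy.optimize import minimize

w = np.linspace(0.1, 1.0, 20)                    # frequency grid (invented)

def residuals(x):
    """Error functions f_i(x): deviation of a toy response from a flat target."""
    model = x[0] / (1.0 + x[1] * w ** 2)
    return np.abs(model - 1.0)

def solve_minimax(x0):
    z0 = np.append(x0, residuals(x0).max())      # variables are (x, t)
    cons = {"type": "ineq", "fun": lambda z: z[-1] - residuals(z[:-1])}
    res = minimize(lambda z: z[-1], z0, constraints=cons, method="SLSQP")
    return res.x[:-1], res.x[-1]

x_opt, max_err = solve_minimax(np.array([1.0, 1.0]))
print(x_opt, max_err)                            # max error driven toward 0
```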

  6. Taiwan Automated Telescope Network

    Directory of Open Access Journals (Sweden)

    Dean-Yi Chou

    2010-01-01

Each telescope can be operated either interactively or fully automatically. In the interactive mode, it can be controlled through the Internet. In the fully automatic mode, the telescope operates with preset parameters without any human intervention, including taking dark frames and flat frames. The network can also be used for studies that require continuous observations for selected objects.

  7. Automated Image Analysis for the Detection of Benthic Crustaceans and Bacterial Mat Coverage Using the VENUS Undersea Cabled Network

    Directory of Open Access Journals (Sweden)

    Jacopo Aguzzi

    2011-11-01

The development and deployment of sensors for undersea cabled observatories is presently biased toward the measurement of habitat variables, while sensor technologies for biological community characterization through species identification and individual counting are less common. The VENUS cabled multisensory network (Vancouver Island, Canada) deploys seafloor camera systems at several sites. Our objective in this study was to implement new automated image analysis protocols for the recognition and counting of benthic decapods (i.e., the galatheid squat lobster, Munida quadrispina), as well as for the evaluation of changes in bacterial mat coverage (i.e., Beggiatoa spp.), using a camera deployed in Saanich Inlet (103 m depth). For the counting of Munida we remotely acquired 100 digital photos at hourly intervals from 2 to 6 December 2009. In the case of bacterial mat coverage estimation, images were taken from 2 to 8 December 2009 at the same time frequency. The automated image analysis protocols for both study cases were created in MATLAB 7.1. Automation for Munida counting combined filtering and background correction (median and top-hat filters) with Euclidean distances (ED) on red-green-blue (RGB) channels. Scale-Invariant Feature Transform (SIFT) features and Fourier descriptors (FD) of tracked objects were then extracted. Animal classification was carried out with morphometric multivariate statistics (i.e., partial least squares discriminant analysis, PLS-DA) on mean RGB value (RGBv) and Fourier descriptor (RGBv+FD) matrices, plus SIFT and ED. The SIFT approach returned the best results: higher percentages of images were correctly classified, and fewer misclassification errors (an animal is present but not detected) occurred. In contrast, RGBv+FD and ED resulted in a high incidence of records being generated for animals that were not present. Bacterial mat coverage was estimated in terms of percent...
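
A rough Python analogue of the Munida detection chain described above (the original protocols were written in MATLAB 7.1): median filtering, top-hat background correction, and Euclidean distance to a reference colour on the RGB channels. The reference colour, filter sizes, and threshold below are invented.

```python
# Hedged sketch of the pre-processing + colour-distance detection step.
import numpy as np
from scipy.ndimage import median_filter, white_tophat

def detect_candidates(rgb, ref_rgb, tophat_size=15, thresh=40.0):
    """Boolean mask of pixels whose corrected colour is close to ref_rgb."""
    # Median filter suppresses impulse noise channel by channel.
    smooth = np.stack([median_filter(rgb[..., c], size=3)
                       for c in range(3)], axis=-1).astype(float)
    # Top-hat removes slowly varying background illumination.
    corrected = np.stack([white_tophat(smooth[..., c], size=tophat_size)
                          for c in range(3)], axis=-1)
    # Euclidean distance in RGB space to the reference animal colour.
    dist = np.linalg.norm(corrected - np.asarray(ref_rgb, float), axis=-1)
    return dist < thresh
```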

  8. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad hoc manner, with no assurance of the quality of the solution with respect...

  9. Technological Developments in Networking, Education and Automation

    CERN Document Server

    Elleithy, Khaled; Iskander, Magued; Kapila, Vikram; Karim, Mohammad A; Mahmood, Ausif

    2010-01-01

    "Technological Developments in Networking, Education and Automation" includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the following areas: Computer Networks: Access Technologies, Medium Access Control, Network architectures and Equipment, Optical Networks and Switching, Telecommunication Technology, and Ultra Wideband Communications. Engineering Education and Online Learning: including development of courses and systems for engineering, technical and liberal studies programs; online laboratories; intelligent

  10. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...

  11. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

Breakers are part of Electric Power Systems' equipment whose reliability strongly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breakers' reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. But this demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, their failures, testing and repair, advanced software developments, and a specific automated information system (AIS). The new AIS, with the AISV logo, was developed at the "Reliability of power equipment" department of AzRDSI of Energy. The main features of AISV are: to provide database security and accuracy; to carry out systematic control of breakers' conformity with operating conditions; to estimate individual reliability values and the characteristics of their change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for its realization.

  12. Performance assessment of the Tactical Network Analysis and Planning System Plus (TNAPS+) automated planning tool for C4I systems

    OpenAIRE

    Ziegenfuss, Paul C.

    1999-01-01

The Joint Staff established the Tactical Network Analysis and Planning System Plus (TNAPS+) as the interim joint communications planning and management system. The Marines Command and Control Systems Course and the Army's Joint Task Force System Planning Course both utilize TNAPS+ to conduct tactical C4I network planning in their course requirements. This thesis is a Naval Postgraduate School C4I curriculum practical application of TNAPS+ in an expeditionary Joint Task Force environment, focu...

  13. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  14. Toward the automation of road networks extraction processes

    Science.gov (United States)

    Leymarie, Frederic; Boichis, Nicolas; Airault, Sylvain; Jamet, Olivier

    1996-12-01

Syseca and IGN are working on various steps in the ongoing march from digital photogrammetry to the semi-automation and ultimately the full automation of data manipulation, i.e., capture and analysis. The immediate goals are to reduce production costs and data availability delays. Within this context, we have tackled the distinctive problem of 'automated road network extraction.' The methodology adopted is to first study semi-automatic solutions, which probably increase the global efficiency of human operators in topographic data capture; in a second step, automatic solutions are designed based upon the gained experience. We report on different (semi-)automatic solutions for the road-following algorithm. One key aspect of our method is to have the stages of 'detection' and 'geometric recovery' cooperate while remaining distinct. 'Detection' is based on a local (texture) analysis of the image, while 'geometric recovery' is concerned with the extraction of 'road objects' from both monocular and stereo information. 'Detection' is a low-level visual process, 'reasoning' directly at the level of image intensities, while the mid-level visual process, 'geometric recovery', uses contextual knowledge about roads, both generic, e.g. parallelism of borders, and specific, e.g. using previously extracted road segments and disparities. We then pursue our 'march' by reporting on steps we are exploring toward full automation. In particular, we have made attempts at tackling the automation of the initialization step, to start searching in a valid direction.
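
To make the road-following idea concrete, here is a minimal, invented Python tracker in the spirit of the 'detection' stage described above: from a seed point and heading it steps forward, testing a few candidate headings for local intensity homogeneity. The step size, tolerance, and homogeneity test are simplifications, not the authors' algorithm.

```python
# Invented road-following sketch: greedy stepping under a homogeneity test.
import numpy as np

def follow_road(img, seed, heading, step=2.0, tol=12.0, max_steps=500):
    """Track a homogeneous ribbon through a grayscale image from a seed."""
    pts = [np.asarray(seed, float)]
    d = np.asarray(heading, float)
    d /= np.linalg.norm(d)
    ref = float(img[int(seed[1]), int(seed[0])])   # reference road intensity
    for _ in range(max_steps):
        best = None
        for angle in (-0.2, 0.0, 0.2):             # straight or slight turns
            c, s = np.cos(angle), np.sin(angle)
            cand = np.array([c * d[0] - s * d[1], s * d[0] + c * d[1]])
            p = pts[-1] + step * cand
            x, y = int(round(p[0])), int(round(p[1]))
            if not (0 <= x < img.shape[1] and 0 <= y < img.shape[0]):
                continue
            score = abs(float(img[y, x]) - ref)    # homogeneity w.r.t. seed
            if score < tol and (best is None or score < best[0]):
                best = (score, p, cand)
        if best is None:
            break                                  # road lost: stop tracking
        _, p, d = best
        pts.append(p)
    return np.array(pts)
```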

  15. Automated activation-analysis system

    International Nuclear Information System (INIS)

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  16. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

...sophisticated energy consumers, it has been possible to improve the DR 'state of the art' with a manageable commitment of technical resources on both the utility and consumer side. Although numerous C & I DR applications of a DRAS infrastructure are still in either prototype or early production phases, these early attempts at automating DR have been notably successful for both utilities and C & I customers. Several factors have strongly contributed to this success and will be discussed below. These successes have motivated utilities and regulators to look closely at how DR programs can be expanded to encompass the remaining (roughly) half of the state's energy load - the light commercial and, in numerical terms, the more important residential customer market. This survey examines technical issues facing the implementation of automated DR in the residential environment. In particular, we will look at the potential role of home automation networks in implementing wide-scale DR systems that communicate directly to individual residences.

  17. Flux-P: Automating Metabolic Flux Analysis

    Directory of Open Access Journals (Sweden)

    Birgitta E. Ebert

    2012-11-01

Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be calculated directly but have to be estimated, for instance via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in this complex analysis, but requires several steps that have to be carried out manually, hence restricting the use of this software for data interpretation to a rather small number of experiments. In this paper, we present Flux-P as an approach to automate and standardize 13C-based metabolic flux analysis, using the Bio-jETI workflow framework. Based exemplarily on the FiatFlux software, it demonstrates how services can be created that carry out the different analysis steps autonomously and how these can subsequently be assembled into software workflows that perform automated, high-throughput intracellular flux analysis of high quality and reproducibility. Besides significant acceleration and standardization of the data analysis, the agile workflow-based realization supports flexible changes of the analysis workflows on the user level, making it easy to perform custom analyses.

  18. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
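
As an illustration of what an identification tree might look like as a data structure (the record does not give AutSEC's actual schema), the sketch below encodes threat-identification rules as a binary tree over properties of data-flow-diagram elements. The element fields and threat names are invented.

```python
# Invented illustration of an identification tree: internal nodes test
# properties of a DFD element, leaves name threats.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class DFDElement:
    kind: str                          # "process", "data_flow", "data_store"
    crosses_trust_boundary: bool = False
    encrypted: bool = False

@dataclass
class IdNode:
    test: Optional[Callable[[DFDElement], bool]] = None
    threat: Optional[str] = None       # set on leaves only
    yes: Optional["IdNode"] = None
    no: Optional["IdNode"] = None

    def identify(self, e: DFDElement) -> List[str]:
        if self.threat is not None:
            return [self.threat]
        branch = self.yes if self.test(e) else self.no
        return branch.identify(e) if branch else []

# Tiny rule: an unencrypted flow crossing a trust boundary risks disclosure.
tree = IdNode(test=lambda e: e.kind == "data_flow" and e.crosses_trust_boundary,
              yes=IdNode(test=lambda e: not e.encrypted,
                         yes=IdNode(threat="information disclosure")))

print(tree.identify(DFDElement("data_flow", crosses_trust_boundary=True)))
# -> ['information disclosure']
```

A mitigation tree would pair each leaf with countermeasures and costs; walking every DFD element through every tree yields the automated threat report.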

  19. Automated pipelines for spectroscopic analysis

    Science.gov (United States)

    Allende Prieto, C.

    2016-09-01

The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on an ever increasing relevance, and the concept now applies to many more areas, from targeting to analysis. In this paper, I provide a quick overview of recent, ongoing, and upcoming spectroscopic surveys, and the strategies adopted in their automated analysis pipelines.

  20. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies...

  1. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning access network infrastructure. GIS data and a set of algorithms are employed to make the planning process more automatic. The method explains ... The method, however, does not fully automate the planning, but it makes the planning process significantly faster. The results and discussion are presented and a conclusion is given in the end.

  2. Cost Analysis of an Automated and Manual Cataloging and Book Processing System.

    Science.gov (United States)

    Druschel, Joselyn

    1981-01-01

    Cost analysis of an automated network system and a manual system of cataloging and book processing indicates a 20 percent savings using automation. Per unit costs based on the average monthly automation rate are used for comparison. Higher manual system costs are attributed to staff costs. (RAA)

  3. Automated quantitative analysis for pneumoconiosis

    Science.gov (United States)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts of the image, such as blood vessels and ribs in the chest X-ray. Fuzzy contrast enhancement is also introduced for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
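
A hypothetical Python rendering of the measurement idea (it does not reproduce the authors' bi-level filter or fuzzy contrast enhancement): an unsharp mask flattens large-scale anatomy, thresholded blobs approximate small rounded opacities, and number density, area density, and equivalent radii follow. All parameters are invented.

```python
# Sketch: unsharp mask -> threshold -> blob statistics for opacities.
import numpy as np
from scipy import ndimage

def opacity_stats(img, blur_sigma=25.0, thresh=10.0):
    """Return (number density, area density, equivalent radii) of bright blobs."""
    background = ndimage.gaussian_filter(img.astype(float), blur_sigma)
    detail = img - background                # unsharp mask: keep small scales
    mask = detail > thresh
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    radii = np.sqrt(areas / np.pi)           # equivalent radius per opacity
    return n / mask.size, mask.mean(), radii
```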

  4. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  5. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state result in a large amount of data per sale site. Sale summaries for an individual sale…

  6. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  7. Automated Pipelines for Spectroscopic Analysis

    CERN Document Server

    Prieto, Carlos Allende

    2016-01-01

The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on an ever increasing relevance, and the concept now applies to many more areas, from targeting to analysis. In this paper, I provide a quick overvie...

  8. An Automated 3d Indoor Topological Navigation Network Modelling

    Science.gov (United States)

    Jamali, A.; Rahman, A. A.; Boguslawski, P.; Gold, C. M.

    2015-10-01

Indoor navigation is important for various applications such as disaster management and safety analysis. In the last decade, indoor environments have been a focus of wide research, including techniques for acquiring indoor data (e.g., terrestrial laser scanning), 3D indoor modelling and 3D indoor navigation models. In this paper, an automated 3D topological indoor network generated from inaccurate 3D building models is proposed. In a normal scenario, 3D indoor navigation network derivation needs accurate 3D models with no errors (e.g., gaps, intersections), and two cells (e.g., rooms, corridors) should touch each other to build their connections. The presented 3D modelling of the indoor navigation network is based on surveyed control points and is less dependent on the 3D geometrical building model. To reduce the time and cost of the indoor building data acquisition process, a Trimble LaserAce 1000 was used as the surveying instrument. The modelling results were validated against an accurate geometry of the indoor building environment acquired using a Trimble M3 total station.
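
A minimal sketch of the end product such a method aims at: a navigation graph whose nodes carry surveyed 3D control-point coordinates and whose edges carry metric lengths, queried with a shortest-path search. The rooms, coordinates, and connections are invented, and networkx stands in for whatever graph machinery the authors used.

```python
# Invented 3D indoor navigation graph built from surveyed control points.
import networkx as nx

control_points = {                 # node -> surveyed (x, y, z) in metres
    "room101": (2.0, 3.5, 0.0), "corridor1": (5.0, 3.5, 0.0),
    "stairA": (9.0, 3.5, 0.0), "room201": (2.0, 3.5, 3.2),
}
connections = [("room101", "corridor1"), ("corridor1", "stairA"),
               ("stairA", "room201")]

G = nx.Graph()
for a, b in connections:
    pa, pb = control_points[a], control_points[b]
    length = sum((u - v) ** 2 for u, v in zip(pa, pb)) ** 0.5
    G.add_edge(a, b, weight=length)   # edge carries metric corridor length

# Route query: shortest metric path between two spaces.
print(nx.shortest_path(G, "room101", "room201", weight="weight"))
```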

  9. Automated neural network-based instrument validation system

    Science.gov (United States)

    Xu, Xiao

    2000-10-01

    In a complex control process, instrument calibration is periodically performed to maintain the instruments within the calibration range, which assures proper control and minimizes down time. Instruments are usually calibrated under out-of-service conditions using manual calibration methods, which may cause incorrect calibration or equipment damage. Continuous in-service calibration monitoring of sensors and instruments will reduce unnecessary instrument calibrations, give operators more confidence in instrument measurements, increase plant efficiency or product quality, and minimize the possibility of equipment damage during unnecessary manual calibrations. In this dissertation, an artificial neural network (ANN)-based instrument calibration verification system is designed to achieve the on-line monitoring and verification goal for scheduling maintenance. Since an ANN is a data-driven model, it can learn the relationships among signals without prior knowledge of the physical model or process, which is usually difficult to establish for the complex non-linear systems. Furthermore, the ANNs provide a noise-reduced estimate of the signal measurement. More importantly, since a neural network learns the relationships among signals, it can give an unfaulted estimate of a faulty signal based on information provided by other unfaulted signals; that is, provide a correct estimate of a faulty signal. This ANN-based instrument verification system is capable of detecting small degradations or drifts occurring in instrumentation, and preclude false control actions or system damage caused by instrument degradation. In this dissertation, an automated scheme of neural network construction is developed. Previously, the neural network structure design required extensive knowledge of neural networks. An automated design methodology was developed so that a network structure can be created without expert interaction. This validation system was designed to monitor process sensors plant
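
A small, hypothetical sketch of the core signal-validation idea described above: learn to estimate one instrument channel from related channels, then flag drift when the residual between measurement and estimate exceeds a limit. scikit-learn's MLPRegressor stands in for the dissertation's automatically constructed network, and the data are synthetic.

```python
# Drift detection via an ANN estimate of one channel from its neighbours.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))                   # four related channels
y = X @ np.array([0.5, -0.2, 0.8, 0.1]) + 0.01 * rng.normal(size=2000)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
limit = 4.0 * np.std(y - model.predict(X))       # drift alarm threshold

def drifting(x_row, y_meas):
    """True if the measurement deviates from the network's estimate."""
    return abs(y_meas - model.predict(x_row.reshape(1, -1))[0]) > limit

print(drifting(X[0], y[0]))                      # healthy reading -> False
print(drifting(X[0], y[0] + 1.0))                # drifted reading -> True
```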

  10. Feasibility Analysis of Crane Automation

    Institute of Scientific and Technical Information of China (English)

    DONG Ming-xiao; MEI Xue-song; JIANG Ge-dong; ZHANG Gui-qing

    2006-01-01

    This paper summarizes the modeling methods, open-loop control and closed-loop control techniques of various forms of cranes, worldwide, and discusses their feasibilities and limitations in engineering. Then the dynamic behaviors of cranes are analyzed. Finally, we propose applied modeling methods and feasible control techniques and demonstrate the feasibilities of crane automation.

  11. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which combined with automation, underpin the emerging concept of the "smart grid". This book is supported by theoretical concepts with real-world applications and MATLAB exercises.

  12. AUTOMATED DEFECT CLASSIFICATION USING AN ARTIFICIAL NEURAL NETWORK

    International Nuclear Information System (INIS)

An automated defect classification algorithm based on an artificial neural network with a multilayer backpropagation structure was utilized. Selected features of flaws were used as input data. In order to train the neural network it is necessary to prepare learning data, i.e., a representative database of defects. Database preparation requires the following steps: image acquisition and pre-processing, image enhancement, defect detection and feature extraction. Real digital radiographs of welded parts of a ship were used for this purpose.

  13. A framework for automated service composition in collaborative networks

    NARCIS (Netherlands)

    H. Afsarmanesh; M. Sargolzaei; M. Shadi

    2012-01-01

This paper proposes a novel framework for automated software service composition that can significantly support and enhance collaboration among enterprises in the service provision industry, such as in tourism, insurance and e-commerce collaborative networks (CNs). Our proposed framework is founded on se...

  14. Automated Bilingual Circulation System Using PC Local Area Networks.

    Science.gov (United States)

    Iskanderani, A. I.; Anwar, M. A.

    1992-01-01

    Describes a local automated bilingual circulation system using personal computers in a local area network that was developed at King Abdulaziz University (Saudi Arabia) for Arabic and English materials. Topics addressed include the system structure, hardware, major features, storage requirements, and costs. (nine references) (LRW)

  15. Building Automation Networks for Smart Grids

    Directory of Open Access Journals (Sweden)

    Peizhong Yi

    2011-01-01

Smart grid, as an intelligent power generation, distribution, and control system, needs various communication systems to meet its requirements. The ability to communicate seamlessly across multiple networks and domains is an open issue which is yet to be adequately addressed in smart grid architectures. In this paper, we present a framework for end-to-end interoperability in home and building area networks within smart grids. 6LoWPAN and the compact application protocol are utilized to facilitate the use of IPv6 and Zigbee application profiles such as Zigbee smart energy for network and application layer interoperability, respectively. A differential service medium access control scheme enables end-to-end connectivity between 802.15.4 and IP networks while providing quality of service guarantees for Zigbee traffic over Wi-Fi. We also address several issues, including interference mitigation, load scheduling, and security, and propose solutions to them.

  16. Automated analysis of 3D echocardiography

    NARCIS (Netherlands)

    Stralen, Marijn van

    2009-01-01

    In this thesis we aim at automating the analysis of 3D echocardiography, mainly targeting the functional analysis of the left ventricle. Manual analysis of these data is cumbersome, time-consuming and is associated with inter-observer and inter-institutional variability. Methods for reconstruction o

  17. Automated Negotiation for Resource Assignment in Wireless Surveillance Sensor Networks

    Directory of Open Access Journals (Sweden)

    Enrique de la Hoz

    2015-11-01

Due to the low cost of CMOS IP-based cameras, wireless surveillance sensor networks have emerged as a new application of sensor networks able to monitor public or private areas or even country borders. Since these networks are bandwidth intensive and the radioelectric spectrum is limited, especially in unlicensed bands, it is mandatory to assign frequency channels in a smart manner. In this work, we propose the application of automated negotiation techniques for frequency assignment. Results show that these techniques are very suitable for the problem, being able to obtain the best solutions among the techniques with which we have compared them.

  18. Automated Negotiation for Resource Assignment in Wireless Surveillance Sensor Networks.

    Science.gov (United States)

    de la Hoz, Enrique; Gimenez-Guzman, Jose Manuel; Marsa-Maestre, Ivan; Orden, David

    2015-01-01

    Due to the low cost of CMOS IP-based cameras, wireless surveillance sensor networks have emerged as a new application of sensor networks able to monitor public or private areas or even country borders. Since these networks are bandwidth intensive and the radioelectric spectrum is limited, especially in unlicensed bands, it is mandatory to assign frequency channels in a smart manner. In this work, we propose the application of automated negotiation techniques for frequency assignment. Results show that these techniques are very suitable for the problem, being able to obtain the best solutions among the techniques with which we have compared them.

  19. NetBench. Automated Network Performance Testing

    CERN Document Server

    Cadeddu, Mattia

    2016-01-01

In order to evaluate the operation of high performance routers, CERN has developed the NetBench software to run benchmarking tests by injecting various traffic patterns and observing the network devices' behaviour in real time. The tool features a modular design with a Python-based console used to inject traffic and collect the results in a database, and a web user...

  20. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  1. An automated inverse analysis system using neural networks and computational mechanics with its application to identification of 3-D crack shape hidden in solid

    International Nuclear Information System (INIS)

This paper describes a new inverse analysis system using hierarchical (multilayer) neural networks and computational mechanics. The present inverse analysis basically consists of the following three subprocesses. First, by parametrically varying system parameters, their corresponding system responses are calculated through computational mechanics simulations, each of which is an ordinary direct analysis; each data pair of system parameters and system responses is called a training pattern. Second, a back-propagation neural network is iteratively trained using a number of training patterns: the system responses are given to the input units of the network, while the system parameters to be identified are given to its output units as the teacher signal. Third, measured system responses are given to the well-trained network, which immediately outputs appropriate system parameters even for untrained patterns. This is an inverse analysis. To demonstrate its practical performance, the present system is applied to identify the locations and shapes of two adjacent dissimilar surface cracks hidden in a pipe with the electric potential drop method. The results clearly show that the present system is very efficient and accurate. (author). 7 refs., 10 figs

  2. Using neural networks to assess flight deck human–automation interaction

    International Nuclear Information System (INIS)

The increased complexity and interconnectivity of flight deck automation has made the prediction of human–automation interaction (HAI) difficult and has resulted in a number of accidents and incidents. There is a need to develop objective and robust methods by which the changes in HAI brought about by the introduction of new automation into the flight deck could be predicted and assessed prior to implementation and without use of extensive simulation. This paper presents a method to model a parametrization of flight deck automation known as HART and link it to HAI consequences using a backpropagation neural network approach. The transformation of the HART into a computational model suitable for modeling as a neural network is described. To test and train the network, data were collected from 40 airline pilots for six HAI consequences based on one scenario family consisting of a baseline and four variants. For a binary classification of HAI consequences, the neural network successfully classified 62–78.5% of cases, depending on the consequence. The results were verified using a decision tree analysis.

  3. Automation of the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    A study is reported of the feasibility of using a multi-jointed general-purpose robot for the automated analysis of moisture, volatile matter, ash and total post-combustion sulfur in coal and coke. The results obtained with an automated system are compared with those of conventional manual methods. The design of the robot hand and the safety measures provided are now both fully satisfactory, and the analytic values obtained exhibit little scatter. It is concluded that the use of this robot system results in a better working environment and in considerable labour saving. Applications to other tasks are under development.

  4. Anomaly detection in an automated safeguards system using neural networks

    International Nuclear Information System (INIS)

    An automated safeguards system must be able to detect an anomalous event, identify the nature of the event, and recommend a corrective action. Neural networks represent a new way of thinking about basic computational mechanisms for intelligent information processing. In this paper, we discuss the issues involved in applying a neural network model to the first step of this process: anomaly detection in materials accounting systems. We extend our previous model to a 3-tank problem and compare different neural network architectures and algorithms. We evaluate the computational difficulties in training neural networks and explore how certain design principles affect the problems. The issues involved in building a neural network architecture include how the information flows, how the network is trained, how the neurons in a network are connected, how the neurons process information, and how the connections between neurons are modified. Our approach is based on the demonstrated ability of neural networks to model complex, nonlinear, real-time processes. By modeling the normal behavior of the processes, we can predict how a system should be behaving and, therefore, detect when an abnormality occurs

  5. Automated Loads Analysis System (ATLAS)

    Science.gov (United States)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling of payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  6. QoS measurement of Zigbee home automation network using various modulation schemes

    Directory of Open Access Journals (Sweden)

Gurjit Kaur

    2011-02-01

Increased demand for implementation of wireless sensor networks in automation practice has resulted in a relatively new wireless standard, Zigbee. The IEEE 802.15.4 standard specifies multiple PHYs for the 868, 915 and 2400 MHz frequency bands. In this paper we present a performance analysis of different modulation schemes (ASK, BPSK and QPSK) at 868, 915 and 2400 MHz, based on their effect on quality of service parameters, using a CBR application in a Zigbee home automation network with static IEEE 802.15.4. The QoS parameters average end-to-end delay, jitter, and throughput are investigated as the performance metrics. The results show that although ASK at 868 MHz gives the highest throughput of 100%, the highest amongst all the modulation schemes, overall ASK at 915 MHz is the modulation scheme best suited to the CBR application of Zigbee home automation, since it produced the lowest jitter value of 0.03 s and the lowest average end-to-end delay of 3.30 s, both favorable conditions for the performance of a CBR application in a Zigbee home automation network.
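
For reference, the three QoS metrics reported above are typically computed from per-packet traces roughly as below; the formulas (mean delay, mean delay variation for jitter, delivered bits over elapsed time for throughput) are standard definitions, not code from the paper, and the sample trace is invented.

```python
# Standard QoS metric definitions applied to a per-packet trace.
import numpy as np

def qos_metrics(t_send, t_recv, size_bits):
    delays = t_recv - t_send                     # per-packet end-to-end delay
    avg_delay = delays.mean()
    jitter = np.abs(np.diff(delays)).mean()      # mean inter-packet variation
    throughput = size_bits.sum() / (t_recv.max() - t_send.min())
    return avg_delay, jitter, throughput

t_send = np.array([0.0, 0.5, 1.0, 1.5])          # seconds
t_recv = t_send + np.array([3.3, 3.4, 3.2, 3.5]) # delays near the cited 3.3 s
print(qos_metrics(t_send, t_recv, np.full(4, 1024 * 8)))
```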

  7. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

The paper describes a methodology for the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method, to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using both numerical models and neural ones. Both approaches are compared from the viewpoint of CPU-time demands and accuracy. In conclusion, methodological recommendations for including neural networks in microwave design are formulated.

  8. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and the glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics-based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme...
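
To make the idea of automated network construction concrete, here is a toy Python sketch in the spirit of the framework described above: each enzyme is a substrate-pattern-to-product rewrite rule applied repeatedly to grow a reaction graph. The glycan shorthand and enzyme rules are invented and far simpler than the paper's machine-readable enzyme definitions.

```python
# Toy rule-based reaction network construction (invented rules and notation).
import networkx as nx

enzymes = {
    "GalT": lambda g: g + "-Gal" if g.endswith("GlcNAc") else None,
    "SiaT": lambda g: g + "-Sia" if g.endswith("Gal") else None,
    "FucT": lambda g: g + "-Fuc" if g.endswith("Gal") else None,
}

def build_network(seeds, max_rounds=5):
    net = nx.DiGraph()
    frontier = set(seeds)
    for _ in range(max_rounds):
        new = set()
        for glycan in frontier:
            for name, rule in enzymes.items():
                product = rule(glycan)
                if product and not net.has_edge(glycan, product):
                    net.add_edge(glycan, product, enzyme=name)
                    new.add(product)
        frontier = new                  # only extend newly created species
    return net

net = build_network({"GlcNAc"})
print(sorted(net.edges(data="enzyme")))
```

Once the network is a graph, the connectivity and pathway queries mentioned in the abstract come essentially for free from graph-theory routines.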

  9. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  10. Proximate analysis by automated thermogravimetry

    Energy Technology Data Exchange (ETDEWEB)

    Elder, J.P.

    1983-05-01

    A study has been made of the use of the Perkin-Elmer thermogravimetric instrument TGS-2, under the control of the System 4 microprocessor for the automatic proximate analysis of solid fossil fuels and related matter. The programs developed are simple to operate, and do not require detailed temperature calibration of the instrumental system. They have been tested with coals of varying rank, biomass samples and Devonian oil shales all of which were of special importance to the State of Kentucky. Precise, accurate data conforming to ASTM specifications were obtained. The simplicity of the technique suggests that it may complement the classical ASTM method and could be used when this latter procedure cannot be employed. However, its adoption as a standardized method must await the development of statistical data resulting from interlaboratory testing on a variety of fossil fuels. (9 refs.)

  11. Survey on Wireless Sensor Network Technologies for Industrial Automation: The Security and Quality of Service Perspectives

    Directory of Open Access Journals (Sweden)

    Delphine Christin

    2010-04-01

Wireless Sensor Networks (WSNs) are gradually adopted in the industrial world due to their advantages over wired networks. In addition to saving cabling costs, WSNs widen the realm of environments feasible for monitoring. They thus add sensing and acting capabilities to objects in the physical world and allow for communication among these objects or with services in the future Internet. However, the acceptance of WSNs by the industrial automation community is impeded by open issues, such as security guarantees and provision of Quality of Service (QoS). To examine both of these perspectives, we select and survey relevant WSN technologies dedicated to industrial automation. We determine QoS requirements and carry out a threat analysis, which act as basis of our evaluation of the current state-of-the-art. According to the results of this evaluation, we identify and discuss open research issues.

  12. Ecological network analysis: network construction

    NARCIS (Netherlands)

    Fath, B.D.; Scharler, U.M.; Ulanowicz, R.E.; Hannon, B.

    2007-01-01

    Ecological network analysis (ENA) is a systems-oriented methodology to analyze within system interactions used to identify holistic properties that are otherwise not evident from the direct observations. Like any analysis technique, the accuracy of the results is as good as the data available, but t

  13. Automated Radiochemical Separation, Analysis, and Sensing

    International Nuclear Information System (INIS)

    Chapter 14 for the 2nd edition of the Handbook of Radioactivity Analysis. The techniques and examples described in this chapter demonstrate that modern fluidic techniques and instrumentation can be used to develop automated radiochemical separation workstations. In many applications, these can be mechanically simple and key parameters can be controlled from software. If desired, many of the fluidic components and solution can be located remotely from the radioactive samples and other hot sample processing zones. There are many issues to address in developing automated radiochemical separation that perform reliably time after time in unattended operation. These are associated primarily with the separation and analytical chemistry aspects of the process. The relevant issues include the selectivity of the separation, decontamination factors, matrix effects, and recoveries from the separation column. In addition, flow rate effects, column lifetimes, carryover from one sample to another, and sample throughput must be considered. Nevertheless, successful approaches for addressing these issues have been developed. Radiochemical analysis is required not only for processing nuclear waste samples in the laboratory, but also for at-site or in situ applications. Monitors for nuclear waste processing operations represent an at-site application where continuous unattended monitoring is required to assure effective process radiochemical separations that produce waste streams that qualify for conversion to stable waste forms. Radionuclide sensors for water monitoring and long term stewardship represent an application where at-site or in situ measurements will be most effective. Automated radiochemical analyzers and sensors have been developed that demonstrate that radiochemical analysis beyond the analytical laboratory is both possible and practical

  14. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

M. Istrate

    2010-12-01

Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often necessary for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exists a variety of approaches to the analysis of natural language texts, but most of them do not provide results that can be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  15. APSAS; an Automated Particle Size Analysis System

    Science.gov (United States)

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX(HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.

  16. Toward the automated generation of genome-scale metabolic networks in the SEED

    Directory of Open Access Journals (Sweden)

    Gould John

    2007-04-01

    Full Text Available Abstract Background Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. Results We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis. We have implemented our tools and database within the SEED, an open-source software environment for comparative
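
    Although the SEED pipeline itself is far more elaborate, the core gap-identification step it automates can be sketched in a few lines: represent each reaction as substrate and product sets, then flag metabolites that are consumed but never produced. Below is a minimal Python illustration under that simplification; the reaction list, metabolite names, and seed set are hypothetical examples, not SEED data.

    ```python
    # Hypothetical toy reaction network: name -> (substrates, products).
    reactions = {
        "pgi": ({"g6p"}, {"f6p"}),
        "pfk": ({"f6p", "atp"}, {"fdp", "adp"}),
        "ald": ({"fdp"}, {"dhap", "g3p"}),
    }

    consumed = set().union(*(subs for subs, _ in reactions.values()))
    produced = set().union(*(prods for _, prods in reactions.values()))
    seed_metabolites = {"g6p", "atp"}   # assumed available from the growth medium

    # Metabolites consumed but never produced (and not supplied externally)
    # are dead ends: candidate gaps for the curator or a gap-filling step.
    gaps = consumed - produced - seed_metabolites
    print(gaps)   # empty set here; a nonempty result flags network gaps
    ```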

  17. Automated Analysis, Classification, and Display of Waveforms

    Science.gov (United States)

    Kwan, Chiman; Xu, Roger; Mayhew, David; Zhang, Frank; Zide, Alan; Bonggren, Jeff

    2004-01-01

    A computer program partly automates the analysis, classification, and display of waveforms represented by digital samples. In the original application for which the program was developed, the raw waveform data to be analyzed by the program are acquired from space-shuttle auxiliary power units (APUs) at a sampling rate of 100 Hz. The program could also be modified for application to other waveforms -- for example, electrocardiograms. The program begins by performing principal-component analysis (PCA) of 50 normal-mode APU waveforms. Each waveform is segmented. A covariance matrix is formed by use of the segmented waveforms. Three eigenvectors corresponding to three principal components are calculated. To generate features, each waveform is then projected onto the eigenvectors. These features are displayed on a three-dimensional diagram, facilitating the visualization of the trend of APU operations.
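
    The PCA feature-generation step described above is compact enough to sketch directly. A minimal NumPy version follows, where the waveform array, the segment length, and the use of exactly three principal components are illustrative assumptions rather than the program's actual parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    waveforms = rng.normal(size=(50, 400))      # 50 normal-mode waveforms, hypothetical length

    # Segment each waveform and form a covariance matrix over the segments.
    seg_len = 100
    segments = waveforms.reshape(-1, seg_len)   # (200, 100) segments
    segments = segments - segments.mean(axis=0) # center before covariance
    cov = np.cov(segments, rowvar=False)        # (100, 100) covariance matrix

    # Three eigenvectors corresponding to the three largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal = eigvecs[:, -3:]                 # (100, 3) principal components

    # Project each segment onto the eigenvectors to generate 3-D features
    # suitable for display on a three-dimensional diagram.
    features = segments @ principal             # (200, 3)
    print(features.shape)
    ```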

  18. An automated confirmatory system for analysis of mammograms.

    Science.gov (United States)

    Peng, W; Mayorga, R V; Hussein, E M A

    2016-03-01

    This paper presents an integrated system for the automatic analysis of mammograms to assist radiologists in confirming their diagnosis in mammography screening. The proposed automated confirmatory system (ACS) can process a digitized mammogram online, and generates a high-quality filtered segmentation of an image for biological interpretation and a texture-feature based diagnosis. We use a series of image pre-processing and segmentation techniques, including 2D median filtering, the seeded region growing (SRG) algorithm, and image contrast enhancement, to remove noise, delete radiopaque artifacts and eliminate the projection of the pectoral muscle from a digitized mammogram. We also develop an entire-image texture-feature based classification method, combining a rough-set approach to extract five fundamental texture features from images with an artificial neural network technique to classify a mammogram as: normal; indicating the presence of a benign lump; or representing a malignant tumor. Here, 222 random images from the Mammographic Image Analysis Society (MIAS) database are used for offline ACS training. Once the system is tuned and trained, it is ready for automated use in the analysis and diagnosis of new mammograms. To test the trained system, a separate set of 100 random images from the MIAS and another set of 100 random images from the independent BancoWeb database are selected. The proposed ACS is shown to be successful in confirming diagnosis of mammograms from the two independent databases. PMID:26742491
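
    Of the pipeline stages listed above, seeded region growing is the most algorithmic; a minimal sketch of the idea follows. The 4-connectivity, the fixed intensity tolerance, and the synthetic image are assumptions for illustration, not the paper's exact formulation.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(img, seed, tol=10.0):
        """Grow a region from a seed pixel, absorbing 4-connected neighbours
        whose intensity stays within tol of the running region mean."""
        h, w = img.shape
        mask = np.zeros((h, w), dtype=bool)
        mask[seed] = True
        queue = deque([seed])
        total, count = float(img[seed]), 1
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                    if abs(img[ny, nx] - total / count) <= tol:
                        mask[ny, nx] = True
                        total += float(img[ny, nx])
                        count += 1
                        queue.append((ny, nx))
        return mask

    img = np.full((64, 64), 200.0)
    img[20:40, 20:40] = 60.0                    # a dark synthetic "mass"
    print(region_grow(img, (30, 30)).sum())     # -> 400 pixels segmented
    ```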

  19. Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies

    Science.gov (United States)

    Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.

    2016-02-01

    Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
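
    As a small numeric illustration of one of the tracked metrics: OSNR degrades through cascaded amplified spans roughly as the reciprocal sum of the per-span OSNRs in linear units. The span values below are made up for the example.

    ```python
    import math

    def cascade_osnr_db(stage_osnrs_db):
        """Combined OSNR (dB) of cascaded stages: 1/OSNR_tot = sum(1/OSNR_i)."""
        inv_total = sum(10 ** (-osnr / 10.0) for osnr in stage_osnrs_db)
        return -10.0 * math.log10(inv_total)

    # Three identical 28 dB spans combine to about 23.2 dB (28 - 10*log10(3)).
    print(cascade_osnr_db([28, 28, 28]))
    ```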

  20. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automate the standard tests applied on stellar clusters to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its unce...

  1. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods.
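
    The modular architecture described above, separate feedforward networks for defect classification and for defect parameter estimation, both fed compressed feature vectors, can be sketched with scikit-learn as a stand-in for the original back-propagation implementation. The feature vectors and defect labels below are synthetic placeholders, not the ORNL eddy-current data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier, MLPRegressor

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 16))                  # compressed eddy-current feature vectors
    defect_type = rng.integers(0, 3, size=300)      # three hypothetical defect classes
    defect_depth = rng.uniform(0.0, 1.0, size=300)  # hypothetical defect parameter

    # Modular architecture: one feedforward network classifies the defect type...
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, defect_type)
    # ...and a separate network estimates a continuous defect parameter.
    reg = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, defect_depth)

    print(clf.predict(X[:3]), reg.predict(X[:3]))
    ```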

  2. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  3. Automated Stellar Classification for Large Surveys with EKF and RBF Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Ling Bai; Ping Guo; Zhan-Yi Hu

    2005-01-01

    An automated classification technique for large-size stellar surveys is proposed. It uses the extended Kalman filter as a feature selector and pre-classifier of the data, and radial basis function neural networks for the classification. Experiments with real data have shown that the correct classification rate can reach as high as 93%, which is quite satisfactory. When different system models are selected for the extended Kalman filter, the classification results are relatively stable. It is shown that for this particular case the result using the extended Kalman filter is better than using principal component analysis.

  4. PERFORMANCE ANALYSIS OF FIELD LAYER COMMUNICATING NETWORK BASED ON POLLING PROTOCOL IN SUBSTATION AUTOMATION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    王峥; 胡敏强; 郑建勇

    2001-01-01

    Considering the particular characteristics of small and medium-sized integrated substation automation systems, this article first analyzes, using basic probability theory, the data generated by the field measuring and controlling units (FMCUs). Each segment of the program running in an FMCU is then discussed in detail, and the constraint that the limited CPU resources of the FMCUs place on the polling cycle is derived. After each part of the polling cycle is determined, expressions are given for the steady-state performance of the field layer communicating network, such as channel efficiency and average transmission delay, as functions of the polling cycle and the number of FMCUs. On this basis, a method for determining the optimum number of FMCUs in such a network is proposed.
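
    The paper derives exact expressions; the flavor of the steady-state analysis can be conveyed with a deliberately simplified linear cycle model. All timing values and the data-arrival probability below are assumptions for illustration.

    ```python
    t_poll = 2e-3   # per-FMCU polling overhead (s), hypothetical
    t_data = 5e-3   # transmission time when an FMCU has data (s), hypothetical
    p_data = 0.4    # probability a polled FMCU has data to send, hypothetical

    for n_fmcu in (4, 8, 16):
        cycle = n_fmcu * (t_poll + p_data * t_data)     # mean polling cycle length
        efficiency = n_fmcu * p_data * t_data / cycle   # fraction of cycle carrying data
        mean_delay = cycle / 2                          # mean wait until the next poll
        print(f"{n_fmcu:2d} FMCUs: cycle={cycle * 1e3:5.1f} ms, "
              f"efficiency={efficiency:.2f}, mean delay={mean_delay * 1e3:4.1f} ms")
    ```

    Under this naive model the channel efficiency is fixed by the per-station overhead ratio, while the average delay grows linearly with the number of FMCUs, which is the trade-off behind the paper's optimum-configuration method.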

  5. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    The methods developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, which is divided into three parts. Part 1 covers the automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This was achieved by writing a simulation program that made it possible to establish the proper selection algorithms and to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16, 32 or 64 grey-level scale, and is processed by a pattern recognition program that isolates the chromosomes and investigates their characteristic features (arm tips, centromere areas), in order to get measurements equivalent to the lengths of the four arms. Part 3 describes a program for automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images left unpaired by the program, which are of special interest for the biologist. (author)

  6. ASteCA: Automated Stellar Cluster Analysis

    Science.gov (United States)

    Perren, G. I.; Vázquez, R. A.; Piatti, A. E.

    2015-04-01

    We present the Automated Stellar Cluster Analysis package (ASteCA), a suite of tools designed to fully automate the standard tests applied on stellar clusters to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its uncertainties. To validate the code we applied it on a large set of over 400 synthetic MASSCLEAN clusters with varying degrees of field star contamination as well as a smaller set of 20 observed Milky Way open clusters (Berkeley 7, Bochum 11, Czernik 26, Czernik 30, Haffner 11, Haffner 19, NGC 133, NGC 2236, NGC 2264, NGC 2324, NGC 2421, NGC 2627, NGC 6231, NGC 6383, NGC 6705, Ruprecht 1, Tombaugh 1, Trumpler 1, Trumpler 5 and Trumpler 14) studied in the literature. The results show that ASteCA is able to recover cluster parameters with an acceptable precision even for those clusters affected by substantial field star contamination. ASteCA is written in Python and is made available as an open source code which can be downloaded ready to be used from its official site.

  7. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more com

  8. Automation literature: A brief review and analysis

    Science.gov (United States)

    Smith, D.; Dieterly, D. L.

    1980-01-01

    Current thought and research positions which may allow for an improved capability to understand the impact of introducing automation to an existing system are established. The orientation was toward the type of studies which may provide some general insight into automation; specifically, the impact of automation on human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected for discussion. The literature is organized along two dimensions: time (pre-1970, post-1970) and type of approach (engineering or behavioral science). The conclusions reached are not definitive, but they do provide initial stepping stones toward a systematic treatment of automation.

  9. Automated analysis and annotation of basketball video

    Science.gov (United States)

    Saur, Drew D.; Tan, Yap-Peng; Kulkarni, Sanjeev R.; Ramadge, Peter J.

    1997-01-01

    Automated analysis and annotation of video sequences are important for digital video libraries, content-based video browsing and data mining projects. A successful video annotation system should provide users with a useful video content summary in a reasonable processing time. Given the wide variety of video genres available today, automatically extracting meaningful video content for annotation still remains hard using currently available techniques. However, a wide range of video has inherent structure, so some prior knowledge about the video content can be exploited to improve our understanding of the high-level video semantic content. In this paper, we develop tools and techniques for analyzing structured video by using the low-level information available directly from MPEG compressed video. Being able to work directly in the video compressed domain can greatly reduce the processing time and enhance storage efficiency. As a testbed, we have developed a basketball annotation system which combines the low-level information extracted from the MPEG stream with the prior knowledge of basketball video structure to provide high-level content analysis, annotation and browsing for events such as wide-angle and close-up views, fast breaks, steals, potential shots, number of possessions and possession times. We expect our approach can also be extended to structured video in other domains.

  10. Automated Detection of Soma Location and Morphology in Neuronal Network Cultures

    OpenAIRE

    Burcin Ozcan; Pooran Negi; Fernanda Laezza; Manos Papadakis; Demetrio Labate

    2015-01-01

    Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons ...

  11. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  12. Request-Driven Schedule Automation for the Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Call, Jared; Mercado, Marisol

    2010-01-01

    The DSN Scheduling Engine (DSE) has been developed to increase the level of automated scheduling support available to users of NASA's Deep Space Network (DSN). We have adopted a request-driven approach to DSN scheduling, in contrast to the activity-oriented approach used up to now. Scheduling requests allow users to declaratively specify patterns and conditions on their DSN service allocations, including timing, resource requirements, gaps, overlaps, time linkages among services, repetition, priorities, and a wide range of additional factors and preferences. The DSE incorporates a model of the key constraints and preferences of the DSN scheduling domain, along with algorithms to expand scheduling requests into valid resource allocations, to resolve schedule conflicts, and to repair unsatisfied requests. We use time-bounded systematic search with constraint relaxation to return nearby solutions if exact ones cannot be found, where the relaxation options and order are under user control. To explore the usability aspects of our approach we have developed a graphical user interface incorporating some crucial features to make it easier to work with complex scheduling requests. Among these are: progressive revelation of relevant detail, immediate propagation and visual feedback from a user's decisions, and a meeting calendar metaphor for repeated patterns of requests. Even as a prototype, the DSE has been deployed and adopted as the initial step in building the operational DSN schedule, thus representing an important initial validation of our overall approach. The DSE is a core element of the DSN Service Scheduling Software (S(sup 3)), a web-based collaborative scheduling system now under development for deployment to all DSN users.

  13. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  14. An Automated Capacitance-Based Fuel Level Monitoring System for Networked Tanks

    Directory of Open Access Journals (Sweden)

    Oke Alice O

    2015-08-01

    Full Text Available Building an effective fuel measuring system has been a great challenge in the Nigerian industry, as oil organizations run into problems ranging from fire outbreaks to oil pilfering and spillage. The use of a meter rule or long rod at most petrol filling stations for assessing the quantity of fuel in a tank is inefficient, stressful, dangerous and almost impossible in a networked environment. This archaic method provides neither a reliable reorder date nor a good inventory. There is therefore a need to automate the system by providing real-time measurement of fuel storage tanks to meet the demand of customers. In this paper, a system was designed to sense the level of fuel in networked tanks using a capacitive sensor controlled by an ATMEGA 328 Arduino microcontroller. The result was reported in both digital and analogue form through radio frequency transmission using XBee and interfaced to a computer system for notification of fuel level and refill operations. This enables consumption control, cost analysis and tax accounting for fuel purchases
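
    The capacitance-to-level conversion at the heart of such a sensor is a simple linear relation for a coaxial probe partially immersed in fuel. A hedged sketch follows; the probe geometry and fuel permittivity are assumed values for illustration, not those of the actual system.

    ```python
    import math

    eps0 = 8.854e-12     # vacuum permittivity (F/m)
    eps_fuel = 2.0       # relative permittivity of fuel, hypothetical
    a, b = 0.005, 0.012  # inner/outer probe radii (m), hypothetical
    length = 1.5         # probe length (m), hypothetical

    k = 2 * math.pi * eps0 / math.log(b / a)   # capacitance per metre, air-filled
    c_empty = k * length
    c_full = k * length * eps_fuel

    def level_from_capacitance(c_measured):
        """Fuel height (m): capacitance varies linearly with fill height."""
        return (c_measured - c_empty) / (c_full - c_empty) * length

    # A reading midway between empty and full corresponds to a half-full probe.
    print(level_from_capacitance(0.5 * (c_empty + c_full)))   # -> 0.75 m
    ```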

  15. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  16. Automated Measurement and Signaling Systems for the Transactional Network

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Brown, Richard; Price, Phillip; Page, Janie; Granderson, Jessica; Riess, David; Czarnecki, Stephen; Ghatikar, Girish; Lanzisera, Steven

    2013-12-31

    The Transactional Network Project is a multi-lab activity funded by the US Department of Energy's Building Technologies Office. The project team included staff from Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory and Oak Ridge National Laboratory. The team designed, prototyped and tested a transactional network (TN) platform to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). PNNL was responsible for the development of the TN platform, with agents for this platform developed by each of the three labs. LBNL contributed applications to measure the whole-building electric load response to various changes in building operations, particularly energy efficiency improvements and demand response events. We also provide a demand response signaling agent and an agent for cost savings analysis. LBNL and PNNL demonstrated actual transactions between packaged rooftop units and the electric grid using the platform and selected agents. This document describes the agents and applications developed by the LBNL team, and associated tests of the applications.

  17. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    Science.gov (United States)

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  19. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  20. Neural Network Based State of Health Diagnostics for an Automated Radioxenon Sampler/Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Paul E.; Kangas, Lars J.; Hayes, James C.; Schrom, Brian T.; Suarez, Reynold; Hubbard, Charles W.; Heimbigner, Tom R.; McIntyre, Justin I.

    2009-05-13

    Artificial neural networks (ANNs) are used to determine the state-of-health (SOH) of the Automated Radioxenon Analyzer/Sampler (ARSA). ARSA is a gas collection and analysis system used for non-proliferation monitoring in detecting radioxenon released during nuclear tests. SOH diagnostics are important for automated, unmanned sensing systems so that remote detection and identification of problems can be made without onsite staff. Both recurrent and feed-forward ANNs are presented. The recurrent ANN is trained to predict sensor values based on current valve states, which control air flow, so that with only valve states the normal SOH sensor values can be predicted. Deviation between modeled value and actual is an indication of a potential problem. The feed-forward ANN acts as a nonlinear version of principal components analysis (PCA) and is trained to replicate the normal SOH sensor values. Because of ARSA’s complexity, this nonlinear PCA is better able to capture the relationships among the sensors than standard linear PCA and is applicable to both sensor validation and recognizing off-normal operating conditions. Both models provide valuable information to detect impending malfunctions before they occur to avoid unscheduled shutdown. Finally, the ability of ANN methods to predict the system state is presented.
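
    The feed-forward "nonlinear PCA" idea, a network trained to reproduce normal sensor values so that a large reconstruction residual flags off-normal operation, can be sketched with a bottleneck autoencoder. The sensor data, bottleneck width, and 3-sigma alarm threshold below are synthetic assumptions, not ARSA parameters.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    normal = rng.normal(size=(500, 12))   # sensor vectors under normal operation (synthetic)

    # Bottleneck autoencoder: the network reproduces its input through a
    # narrow hidden layer, acting as a nonlinear PCA of the sensor space.
    model = MLPRegressor(hidden_layer_sizes=(3,), max_iter=3000).fit(normal, normal)

    # Alarm threshold from the distribution of training residuals (3-sigma rule).
    train_residuals = np.linalg.norm(model.predict(normal) - normal, axis=1)
    threshold = train_residuals.mean() + 3 * train_residuals.std()

    def off_normal(x):
        """True if the sensor vector deviates from the learned normal behavior."""
        residual = np.linalg.norm(model.predict(x.reshape(1, -1)) - x)
        return residual > threshold

    print(off_normal(normal[0]), off_normal(normal[0] + 5.0))  # likely: False True
    ```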

  1. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  2. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.
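
    The front-line measurement itself reduces to finding, per image column, the furthest pixel classified as cell-covered and differencing between frames. A minimal sketch on synthetic binary masks follows; the texture-classification stage that produces the masks is omitted, and the masks and migration direction are assumptions.

    ```python
    import numpy as np

    def leading_edge(mask):
        """Per-column row index of the furthest cell pixel (migration downward)."""
        rows = np.arange(mask.shape[0])[:, None]
        return np.where(mask, rows, -1).max(axis=0)

    # Two synthetic frames: cells advance from row 40 to row 55.
    frame0 = np.zeros((100, 50), dtype=bool)
    frame0[:40] = True
    frame1 = np.zeros((100, 50), dtype=bool)
    frame1[:55] = True

    advance_pixels = leading_edge(frame1) - leading_edge(frame0)
    print(advance_pixels.mean())   # ~15 pixels of migration between frames
    ```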

  3. Automated detection of soma location and morphology in neuronal network cultures.

    Directory of Open Access Journals (Sweden)

    Burcin Ozcan

    Full Text Available Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons on large data sets is required. Existing algorithms are not very efficient when applied to the analysis of confocal image stacks of neuronal cultures. In addition to the usual difficulties associated with the processing of fluorescent images, these types of stacks contain a small number of images so that only a small number of pixels are available along the z-direction and it is challenging to apply conventional 3D filters. The algorithm we present in this paper applies a number of innovative ideas from the theory of directional multiscale representations and involves the following steps: (i) image segmentation based on support vector machines with specially designed multiscale filters; (ii) soma extraction and separation of contiguous somas, using a combination of the level set method and directional multiscale filters. We also present an approach to extract the soma's surface morphology using the 3D shearlet transform. Extensive numerical experiments show that our algorithms are computationally efficient and highly accurate in segmenting the somas and separating contiguous ones. The algorithms presented in this paper will facilitate the development of a high-throughput quantitative platform for the study of neuronal networks for HCS applications.

  4. Feed forward neural networks and genetic algorithms for automated financial time series modelling

    OpenAIRE

    Kingdon, J. C.

    1995-01-01

    This thesis presents an automated system for financial time series modelling. Formal and applied methods are investigated for combining feed-forward Neural Networks and Genetic Algorithms (GAs) into a single adaptive/learning system for automated time series forecasting. Four important research contributions arise from this investigation: i) novel forms of GAs are introduced which are designed to counter the representational bias associated with the conventional Holland GA, ii) an...

  5. AN AUTOMATED TESTING NETWORK SYSTEM FOR PAPER LABORATORY BASED ON CAN BUS

    Institute of Scientific and Technical Information of China (English)

    Xianhui Yi; Dongbo Yan; Huanbin Liu; Jigeng Li

    2004-01-01

    This paper presents an automated testing network system for a paper laboratory based on the CAN bus. The overall architecture, hardware interface and software functions are discussed in detail. Experiments indicate that the system can collect, analyze and store the test results from the various measuring instruments in the paper lab automatically. This simple, reliable, low-cost measuring automation system should find wide application in the paper industry.

  6. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    OpenAIRE

    JangMook Kang; Woojin Lee; Juil Kim

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications us...

  7. Policy-Based Automation of Dynamic and Multipoint Virtual Private Network Simulation on OPNET Modeler

    Directory of Open Access Journals (Sweden)

    Ayoub BAHNASSE

    2014-12-01

    Full Text Available The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, as it requires expert knowledge to properly configure its component technologies. The study of these network architectures in a real environment is almost impossible because it requires a very large amount of equipment; however, this task is feasible in a simulation environment like OPNET Modeler, provided that one masters both the tool and the different architectures of the Dynamic Multipoint Virtual Private Network. Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators; according to our research, no work has dealt with the Dynamic Multipoint Virtual Private Network. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, and a Web-based tool for project management on the same network.

  8. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
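
    GRESS itself is a FORTRAN compiler, but the computer-calculus principle it relies on, propagating derivatives through a computation exactly, can be illustrated with forward-mode automatic differentiation via dual numbers. This is a sketch of the principle, not of the GRESS implementation, and the toy iterative model below is invented.

    ```python
    class Dual:
        """A number carrying a value and a derivative, propagated exactly."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)
        __rmul__ = __mul__

    def model(k):
        # A toy nonlinear, iterative computation standing in for a real model.
        x = Dual(1.0)
        for _ in range(5):
            x = 0.5 * (x + k * x * x)
        return x

    out = model(Dual(0.8, 1.0))   # seed the derivative d/dk = 1
    print(out.val, out.der)       # model response and its sensitivity to k
    ```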

  9. Obtaining informedness in collaborative networks through automated information provisioning

    DEFF Research Database (Denmark)

    Thimm, Heiko; Rasmussen, Karsten Boye

    2013-01-01

    Successful collaboration in business networks calls for well-informed network participants. Members who know about the many aspects of the network are an effective vehicle to successfully resolve conflicts, build a prospering collaboration climate and promote trust within the network. The importance of well-informed network participants has led to the concept of network participant informedness, which is derived from existing theories and concepts for firm informedness. It is possible to support and develop well-informed network participants through a specialised IT-based active information provisioning service. This article presents a corresponding modelling framework and a rule-based approach for the active system capabilities required. Details of a prototype implementation building on concepts from the research area of active databases are also reported.

  10. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and planned future work.

  11. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The prese...

  12. Image analysis and platform development for automated phenotyping in cytomics

    NARCIS (Netherlands)

    Yan, Kuan

    2013-01-01

    This thesis is dedicated to the empirical study of image analysis in HT/HC screening studies. Often a HT/HC screening produces extensive amounts of image data that cannot be manually analyzed. Thus, an automated image analysis solution is a prerequisite for an objective understanding of the raw image data. Compared to general a

  13. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  14. Social network analysis

    NARCIS (Netherlands)

    W. de Nooy

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the si

  15. On-Chip Network Design Automation with Source Routing Switches

    Institute of Scientific and Technical Information of China (English)

    MA Liwei; SUN Yihe

    2007-01-01

    Network-on-chip (NoC) is a new design paradigm for system-on-chip interconnections in the billion-transistor era. Application-specific on-chip network design is essential for NoC success in this new era. This paper presents a class of source routing switches that can be used to efficiently form arbitrary network topologies and that can be optimized for various applications. Hardware description language versions of the networks can be generated automatically for simulation and synthesis. A series of switches and networks has been configured, and their performance, including latency, delay, area, and power, has been analyzed theoretically and experimentally. The results show that this NoC architecture provides a large design space for application-specific on-chip network designs.

  16. Towards self-healing in distribution networks operation: Bipartite graph modelling for automated switching

    Energy Technology Data Exchange (ETDEWEB)

    Kost'alova, Alena; Carvalho, Pedro M.S. [Department of Electrical Engineering and Computers, Instituto Superior Tecnico, UTL, Av. Rovisco Pais, Lisbon (Portugal)

    2011-01-15

    The concept of self-healing has recently been introduced in power systems. The general self-healing framework is complex and includes several aspects of network operation. This paper deals with automated switching in the context of autonomous operation of distribution networks. The paper presents a new network data model that allows effective reconfiguration algorithms to be designed. The model is based on a bipartite graph representation of switching possibilities. The model properties and capabilities are illustrated for simple self-healing algorithms and a small real-world medium-voltage distribution network. (author)
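
    The data model can be pictured with a small bipartite graph: one node set for network sections, one for switches, and edges recording which switch can reconnect which section. Below is a sketch using networkx on a made-up three-section loop, not the paper's test network.

    ```python
    import networkx as nx

    g = nx.Graph()
    sections = ["S1", "S2", "S3"]              # hypothetical network sections
    switches = ["sw12", "sw23", "sw31"]        # hypothetical tie/sectionalizing switches
    g.add_nodes_from(sections, bipartite=0)
    g.add_nodes_from(switches, bipartite=1)
    g.add_edges_from([("S1", "sw12"), ("S2", "sw12"),
                      ("S2", "sw23"), ("S3", "sw23"),
                      ("S3", "sw31"), ("S1", "sw31")])

    # Switching possibilities for restoring a faulted section are simply
    # that section's neighbours in the bipartite graph.
    faulted = "S2"
    print(sorted(g.neighbors(faulted)))        # -> ['sw12', 'sw23']
    ```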

  17. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
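
    As a taste of the queuing results such a text builds on, here is the mean sojourn time of an M/M/1 queue, T = 1/(mu - lambda), which grows without bound as utilisation approaches one. The rates below are arbitrary illustration values.

    ```python
    def mm1_mean_delay(arrival_rate, service_rate):
        """Mean sojourn time of an M/M/1 queue: T = 1 / (mu - lambda)."""
        if arrival_rate >= service_rate:
            raise ValueError("unstable queue: utilisation >= 1")
        return 1.0 / (service_rate - arrival_rate)

    for lam in (0.5, 0.8, 0.95):   # utilisation with unit service rate
        print(f"rho={lam:.2f}: mean delay = {mm1_mean_delay(lam, 1.0):.1f} s")
    ```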

  18. Tri-Band PCB Antenna for Wireless Sensor Network Transceivers in Home Automation Applications

    DEFF Research Database (Denmark)

    Rohde, John; Toftegaard, Thomas Skjødeberg

    2012-01-01

    A novel tri-band antenna design for wireless sensor network devices in home automation applications is proposed. The design is based on a combination of a conventional monopole wire antenna and discrete distributed load impedances. The load impedances are employed to ensure the degrees of freedom...

  19. SensorScheme: Supply chain management automation using Wireless Sensor Networks

    NARCIS (Netherlands)

    Evers, L.; Havinga, P.J.M.; Kuper, J.; Lijding, M.E.M.; Meratnia, N.

    2007-01-01

    The supply chain management business can benefit greatly from automation, as recent developments with RFID technology shows. The use of Wireless Sensor Network technology promises to bring the next leap in efficiency and quality of service. However, current WSN system software does not yet provide t

  20. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high level the network-form game framework (based on Bayes nets and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  1. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analytical measurement, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs

  2. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this context, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected by the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  3. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, switch architectures, and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...
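
    As an illustrative sketch of the discrete-time Markov chain techniques such a text covers (a toy example of my own, assuming a single-server queue with Bernoulli arrivals and services and a finite buffer), the stationary queue-length distribution can be read off the left eigenvector of the transition matrix:

        import numpy as np

        def stationary_distribution(a=0.3, s=0.5, B=20):
            """Toy queue: arrival w.p. a, departure w.p. s, buffer size B."""
            P = np.zeros((B + 1, B + 1))
            up, down = a * (1 - s), s * (1 - a)   # net arrival / net departure
            for n in range(B + 1):
                if n < B:
                    P[n, n + 1] = up
                if n > 0:
                    P[n, n - 1] = down
                P[n, n] = 1.0 - P[n].sum()        # remaining mass: stay put
            w, v = np.linalg.eig(P.T)             # left eigenvectors of P
            pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
            return pi / pi.sum()

        pi = stationary_distribution()
        print("mean queue length:", (np.arange(pi.size) * pi).sum())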

  4. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
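
    The bookkeeping step the Automator replaces can be pictured with a short, hedged sketch: turning a table of per-assembly records into one input file per assembly. The field names and output syntax below are hypothetical placeholders, not the real ORIGAMI input format.

        import csv
        from pathlib import Path

        def write_inputs(csv_path, out_dir="origami_inputs"):
            """One (hypothetical) input file per assembly row in the CSV."""
            Path(out_dir).mkdir(exist_ok=True)
            with open(csv_path, newline="") as fh:
                for row in csv.DictReader(fh):
                    text = "\n".join([
                        f"title \"{row['assembly_id']}\"",        # placeholder syntax
                        f"enrichment {row['enrichment']}",
                        f"heavy_metal_mass {row['hm_mass_t']}",
                        f"history power={row['power_mw']} days={row['days']}",
                    ])
                    (Path(out_dir) / f"{row['assembly_id']}.inp").write_text(text)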

  5. Hypothesis Testing for Automated Community Detection in Networks

    CERN Document Server

    Bickel, Peter J

    2013-01-01

    Community detection in networks is a key exploratory tool with applications in a diverse set of areas, ranging from finding communities in social and biological networks to identifying link farms in the World Wide Web. The problem of finding communities or clusters in a network has received much attention from statistics, physics and computer science. However, most clustering algorithms assume knowledge of the number of clusters k. In this paper we propose to automatically determine k in a graph generated from a Stochastic Blockmodel. Our main contribution is twofold: first, we theoretically establish the limiting distribution of the principal eigenvalue of the suitably centered and scaled adjacency matrix, and use that distribution for our hypothesis test. Second, we use this test to design a recursive bipartitioning algorithm. Using quantifiable classification tasks on real world networks with ground truth, we show that our algorithm outperforms existing probabilistic models for learning overlapping clust...
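
    The flavor of the eigenvalue statistic can be sketched in a few lines of numpy (an illustration of the idea, not the paper's exact test): under a one-block null, the leading eigenvalue of the suitably centered and scaled adjacency matrix stays near the bulk edge (about 2), while planted block structure pushes it far above.

        import numpy as np

        rng = np.random.default_rng(0)

        def top_eig_centered_scaled(A):
            n = len(A)
            p_hat = A[np.triu_indices(n, 1)].mean()        # null edge probability
            M = (A - p_hat) / np.sqrt((n - 1) * p_hat * (1 - p_hat))
            np.fill_diagonal(M, 0.0)
            return np.linalg.eigvalsh(M).max()

        def sbm(n, p_in, p_out):
            """Two equal blocks; p_in == p_out gives the one-block null."""
            A = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    p = p_in if (i < n // 2) == (j < n // 2) else p_out
                    A[i, j] = A[j, i] = float(rng.random() < p)
            return A

        print("null      :", top_eig_centered_scaled(sbm(400, 0.10, 0.10)))  # ~2
        print("two blocks:", top_eig_centered_scaled(sbm(400, 0.15, 0.05)))  # >> 2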

  6. A Neural-Network-Based Semi-Automated Geospatial Classification Tool

    Science.gov (United States)

    Hale, R. G.; Herzfeld, U. C.

    2014-12-01

    North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated classification tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classification to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application performing analysis on airborne videographic data from the surge of the BBGS, an NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy, for over 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis (airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution). The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to
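
    A minimal sketch (my construction, not the authors' code) of the directional experimental variogram used to parameterize an image for the network: for each lag h along a chosen direction, gamma(h) is half the mean squared difference of pixel pairs h apart, and the vector of gamma values becomes the NN input.

        import numpy as np

        def directional_variogram(img, direction="horizontal", max_lag=30):
            img = np.asarray(img, dtype=float)
            gammas = []
            for h in range(1, max_lag + 1):
                if direction == "horizontal":
                    d = img[:, h:] - img[:, :-h]   # pixel pairs h columns apart
                else:
                    d = img[h:, :] - img[:-h, :]   # pixel pairs h rows apart
                gammas.append(0.5 * np.mean(d ** 2))
            return np.array(gammas)                # feature vector for the NN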

  7. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  8. PollyNET: a global network of automated Raman-polarization lidars for continuous aerosol profiling

    Directory of Open Access Journals (Sweden)

    H. Baars

    2015-10-01

    Full Text Available A global vertically resolved aerosol data set covering more than 10 years of observations at more than 20 measurement sites distributed from 63° N to 52° S and 72° W to 124° E has been achieved within the Raman and polarization lidar network PollyNET. This network consists of portable, remote-controlled multiwavelength-polarization-Raman lidars (Polly) for automated and continuous 24/7 observations of clouds and aerosols. PollyNET is an independent, voluntary, and scientific network. All Polly lidars feature a standardized instrument design and apply unified calibration, quality control, and data analysis. The observations are processed in near-real time without manual intervention, and are presented online at http://polly.tropos.de. The paper gives an overview of the observations on four continents and two research vessels obtained with eight Polly systems. The specific aerosol types at these locations (mineral dust, smoke, dust-smoke and other dusty mixtures, urban haze, and volcanic ash) are identified by their Ångström exponent, lidar ratio, and depolarization ratio. The vertical aerosol distribution at the PollyNET locations is discussed on the basis of more than 55 000 automatically retrieved 30 min particle backscatter coefficient profiles at 532 nm. A seasonal analysis of measurements at selected sites revealed typical and extraordinary aerosol conditions as well as seasonal differences. These studies show the potential of PollyNET to support the establishment of a global aerosol climatology that covers the entire troposphere.
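
    The typing step can be pictured with a hedged sketch: a decision rule over the three optical quantities named above. The thresholds here are rough illustrative numbers, not PollyNET's operational criteria.

        def aerosol_type(angstrom_exponent, lidar_ratio_sr, particle_depol_ratio):
            # Illustrative cut-offs only; real classification is more careful.
            if particle_depol_ratio > 0.25:
                return "mineral dust"
            if particle_depol_ratio > 0.10:
                return "dusty mixture"
            if angstrom_exponent > 1.4 and lidar_ratio_sr > 70:
                return "smoke"
            if angstrom_exponent > 1.4:
                return "urban haze"
            return "unclassified"

        print(aerosol_type(0.2, 50, 0.31))   # -> mineral dust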

  9. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla;

    2016-01-01

    PURPOSE: The aim of this study was to develop automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...

  10. Automated SEM-EDS GSR Analysis for Turkish Ammunitions

    International Nuclear Information System (INIS)

    In this work, Automated Scanning Electron Microscopy with Energy Dispersive X-ray Spectrometry (SEM-EDS) was used to characterize Turkish 7.65 and 9 mm cartridge ammunition. All samples were analyzed in a Jeol JSM-5600LV SEM equipped with a BSE detector and a Link ISIS 300 EDS system. A working distance of 20 mm, an accelerating voltage of 20 kV, and gunshot residue software were used in all analyses. The automated search resulted in a high number of analyzed particles containing the elements unique to gunshot residue (GSR) (Pb, Ba, Sb). The obtained data on the definition of characteristic GSR particles were concordant with other studies on this topic

  11. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  12. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
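
    The comparison described reduces to a standard one-way ANOVA across tools; a minimal sketch with toy numbers (not the study's data):

        from scipy import stats

        # Volumes (mm^3) of the same nodules measured by three vendors' tools.
        vendor_a = [310.0, 512.5, 120.2, 980.1]
        vendor_b = [305.4, 520.0, 118.9, 975.3]
        vendor_c = [312.1, 508.8, 121.5, 983.0]

        f_stat, p_value = stats.f_oneway(vendor_a, vendor_b, vendor_c)
        print(f"F = {f_stat:.3f}, p = {p_value:.3f}")  # large p: no significant difference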

  13. AUTOMATED POLICY COMPLIANCE AND CHANGE DETECTION MANAGED SERVICE IN DATA NETWORKS

    Directory of Open Access Journals (Sweden)

    Saeed M. Agbariah

    2013-11-01

    Full Text Available As networks continue to grow in size, speed and complexity, as well as in the diversification of their services, they require many ad-hoc configuration changes. Such changes may lead to potential configuration errors, policy violations, inefficiencies, and vulnerable states. The current network management landscape is in dire need of an automated process to prioritize and manage risk, audit configurations against internal policies or external best practices, and provide centralized reporting for monitoring and regulatory purposes in real time. This paper defines a framework for an automated configuration process with a policy compliance and change detection system, which performs automatic and intelligent network configuration audits by using pre-defined configuration templates and a library of rules that encompass industry standards for various routing and security related guidelines. System administrators and change initiators will have real-time feedback if any of their configuration changes violate any of the policies set for any given device.
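
    The rule-library idea can be sketched as predicates over raw device configuration text; the policy names and patterns below are hypothetical examples, not the paper's rule set.

        import re

        RULES = [
            ("telnet disabled", lambda cfg: "transport input telnet" not in cfg),
            ("SSH v2 enforced", lambda cfg: re.search(r"ip ssh version 2", cfg)),
            ("no public SNMP",  lambda cfg: "community public" not in cfg),
        ]

        def audit(config_text):
            """Return the names of all violated policies for one device."""
            return [name for name, check in RULES if not check(config_text)]

        sample = "ip ssh version 2\nsnmp-server community public RO\n"
        print(audit(sample))   # -> ['no public SNMP']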

  14. Domotics – A Cost Effective Smart Home Automation System Using Wifi as Network Infrastructure

    Directory of Open Access Journals (Sweden)

    Abhinav Talgeri

    2014-08-01

    Full Text Available This paper describes an investigation into the potential for remote controlled operation of home automation (also called Domotics) systems. It considers problems with their implementation, discusses possible solutions through various network technologies and indicates how to optimize the use of such systems. This paper emphasizes the design and prototype implementation of a new home automation system that uses WiFi technology as the network infrastructure connecting its parts. The proposed system is twofold: the first part is the software (web server), which represents the system core that manages, controls, and monitors users' homes. Users and the system administrator can manage the system locally (LAN) or remotely (Internet). The second part is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable: one server can manage many hardware interface modules as long as they exist within WiFi network coverage.

  15. Project Management Phases of a SCADA System for Automation of Electrical Distribution Networks

    Directory of Open Access Journals (Sweden)

    Mohamed Najeh Lakhoua

    2012-03-01

    Full Text Available The aim of this paper is, firstly, to recall the basic concepts of SCADA (Supervisory Control And Data Acquisition) systems and to present the project management phases of SCADA for real-time implementation, and then to show the need for automation in Electricity Distribution Companies (EDC) on their distribution networks and the importance of using computer-based systems towards sustainable development of their services. A proposed computer-based power distribution automation system is then discussed. Finally, some projects of SCADA system implementation in electrical companies around the world are briefly presented.

  16. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  17. INSTRUMENTATION AND CONTROL FOR WIRELESS SENSOR NETWORK FOR AUTOMATED IRRIGATION

    Science.gov (United States)

    An in-field sensor-based irrigation system is of benefit to producers in efficient water management. A distributed wireless sensor network eliminates difficulties to wire sensor stations across the field and reduces maintenance cost. Implementing wireless sensor-based irrigation system is challengin...

  18. The Local Area Network (LAN) and Library Automation.

    Science.gov (United States)

    Farr, Rick C.

    1983-01-01

    Discussion of the use of inexpensive microcomputers in local area information networks (LAN) notes such minicomputer problems as costs, capacity, downtime, and vendor dependence, and the advantages of using LAN in offices and libraries, less costly LAN upgrades, library vendors and LAN systems, and LAN problems. A 28-item bibliography is provided.…

  19. Automated Network Anomaly Detection with Learning, Control and Mitigation

    Science.gov (United States)

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  20. Development of an Intelligent Energy Management Network for Building Automation

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    IV. STRUCTURE OF CONTROL AND MONITORING AND MANAGEMENT: IEMN adopts a three-tier distributed processing architecture for monitoring and control. These are the facility management layer, the BCMS control and monitoring layer, and the ACMC control and management layer. The facility management layer is the bottom layer of the architecture for monitoring and controlling the BACnet network.

  1. Initial Flight Results for an Automated Satellite Beacon Health Monitoring Network

    OpenAIRE

    Young, Anthony; Kitts, Christopher; Neumann, Michael; Mas, Ignacio; Rasay, Mike

    2010-01-01

    Beacon monitoring is an automated satellite health monitoring architecture that combines telemetry analysis, periodic low data rate message broadcasts by a spacecraft, and automated ground reception and data handling in order to implement a cost-effective anomaly detection and notification capability for spacecraft missions. Over the past two decades, this architecture has been explored and prototyped for a range of spacecraft mission classes to include use on NASA deep space probes, military...

  2. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The presented pipeline deals with i) estimation of the mid-sagittal plane, ii) localisation and registration of the corpus callosum, iii) parameterisation and representation of its contour, and iv) means of standardising the traditional reference area measurements.

  3. Fully automated apparatus for the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    Fukumoto, K.; Ishibashi, Y.; Ishii, T.; Maeda, K.; Ogawa, A.; Gotoh, K.

    1985-01-01

    The authors report the development of fully-automated equipment for the proximate analysis of coals, a development undertaken with the twin aims of labour-saving and developing robot applications technology. This system comprises a balance, electric furnaces, a sulfur analyzer, etc., arranged concentrically around a multi-jointed robot which automatically performs all the necessary operations, such as sampling and weighing the materials for analysis, and inserting and removing them from the furnaces. 2 references.

  4. Computer automated movement detection for the analysis of behavior

    OpenAIRE

    Ramazani, Roseanna B.; Harish R Krishnan; BERGESON, SUSAN E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...
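
    The digital-subtraction step being described can be sketched directly in numpy (a toy stand-in for the Perl/ImageMagick pipeline): count the pixels that change between consecutive grayscale frames as a per-interval activity score.

        import numpy as np

        def activity_scores(frames, threshold=25):
            """frames: iterable of equal-sized 2-D uint8 arrays (grayscale)."""
            scores, prev = [], None
            for frame in frames:
                cur = frame.astype(np.int16)           # avoid uint8 wrap-around
                if prev is not None:
                    changed = np.abs(cur - prev) > threshold
                    scores.append(int(changed.sum()))  # moved-pixel count
                prev = cur
            return scores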

  5. Automated haematology analysis to diagnose malaria

    Directory of Open Access Journals (Sweden)

    Grobusch Martin P

    2010-11-01

    Full Text Available Abstract For more than a decade, flow cytometry-based automated haematology analysers have been studied for malaria diagnosis. Although current haematology analysers are not specifically designed to detect malaria-related abnormalities, most studies have found sensitivities that comply with WHO malaria-diagnostic guidelines, i.e. ≥ 95% in samples with > 100 parasites/μl. Establishing a correct and early malaria diagnosis is a prerequisite for adequate treatment and for minimizing adverse outcomes. Expert light microscopy remains the 'gold standard' for malaria diagnosis in most clinical settings. However, it requires an explicit request from clinicians and has variable accuracy. Malaria diagnosis with flow cytometry-based haematology analysers could become an important adjuvant diagnostic tool in the routine laboratory work-up of febrile patients in or returning from malaria-endemic regions. Haematology analysers so far studied for malaria diagnosis are the Cell-Dyn®, Coulter® GEN·S and LH 750, and the Sysmex XE-2100® analysers. For Cell-Dyn analysers, abnormal depolarization events mainly in the lobularity/granularity and other scatter-plots, and various reticulocyte abnormalities have shown overall sensitivities and specificities of 49% to 97% and 61% to 100%, respectively. For the Coulter analysers, a 'malaria factor' using the monocyte and lymphocyte size standard deviations obtained by impedance detection has shown overall sensitivities and specificities of 82% to 98% and 72% to 94%, respectively. For the XE-2100, abnormal patterns in the DIFF, WBC/BASO, and RET-EXT scatter-plots, and pseudoeosinophilia and other abnormal haematological variables have been described, and multivariate diagnostic models have been designed with overall sensitivities and specificities of 86% to 97% and 81% to 98%, respectively. The accuracy for malaria diagnosis may vary according to species, parasite load, immunity and clinical context where the

  6. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  7. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  8. Automating with ROBOCOM. An expert system for complex engineering analysis

    International Nuclear Information System (INIS)

    Nuclear engineering analysis is automated with the help of preprocessors and postprocessors. All the analysis and processing steps are recorded in a form that is reportable and replayable. These recordings serve both as documentation and as robots, for they are capable of performing the analyses they document. Since the processors and robots in ROBOCOM interface with the user in a way that is independent of the analysis program being used, it is now possible to unify input modeling for programs with similar functionality. ROBOCOM will eventually evolve into an encyclopedia of how every nuclear engineering analysis is performed

  9. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Full Text Available Purpose: Improvement of technological processes through the use of technological efficiency analysis can form the basis of their optimization. Informatization and computerization of an ever wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: The appointment of indicators makes it possible to evaluate process efficiency, which can constitute an optimization basis for particular operations. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation – material, operation – machine, operation – human, operation – technological parameters. Findings: From the point of view of quality and correctness of the choice of technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. The results of the technological efficiency analysis prove that the chosen model makes it possible to improve the process continuously through technological analysis; the application of computer assistance makes it possible to automate the process of efficiency analysis and, finally, to achieve controlled improvement of technological processes. Practical implications: Due to the complexity of technological efficiency analysis, an AEPT computer analysis has been created, which yields: operation efficiency indicators (with minimal acceptable values distinguished), values of the efficiency of the applied samples, and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  10. Multispectral tissue analysis and classification towards enabling automated robotic surgery

    Science.gov (United States)

    Triana, Brian; Cha, Jaepyeong; Shademan, Azad; Krieger, Axel; Kang, Jin U.; Kim, Peter C. W.

    2014-02-01

    Accurate optical characterization of different tissue types is an important tool for potentially guiding surgeons and enabling automated robotic surgery. Multispectral imaging and analysis have been used in the literature to detect spectral variations in tissue reflectance that may be visible to the naked eye. Using this technique, hidden structures can be visualized and analyzed for effective tissue classification. Here, we investigated the feasibility of automated tissue classification using multispectral tissue analysis. Broadband reflectance spectra (200-1050 nm) were collected from nine different ex vivo porcine tissues types using an optical fiber-probe based spectrometer system. We created a mathematical model to train and distinguish different tissue types based upon analysis of the observed spectra using total principal component regression (TPCR). Compared to other reported methods, our technique is computationally inexpensive and suitable for real-time implementation. Each of the 92 spectra was cross-referenced against the nine tissue types. Preliminary results show a mean detection rate of 91.3%, with detection rates of 100% and 70.0% (inner and outer kidney), 100% and 100% (inner and outer liver), 100% (outer stomach), and 90.9%, 100%, 70.0%, 85.7% (four different inner stomach areas, respectively). We conclude that automated tissue differentiation using our multispectral tissue analysis method is feasible in multiple ex vivo tissue specimens. Although measurements were performed using ex vivo tissues, these results suggest that real-time, in vivo tissue identification during surgery may be possible.
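
    As a hedged sketch of principal-component-based spectral classification (a nearest-centroid rule in PCA space standing in for the paper's TPCR model):

        import numpy as np
        from sklearn.decomposition import PCA

        def train(spectra, labels, n_components=5):
            """spectra: (n_samples, n_wavelengths); labels: tissue classes."""
            labels = np.asarray(labels)
            pca = PCA(n_components=n_components).fit(spectra)
            scores = pca.transform(spectra)
            centroids = {c: scores[labels == c].mean(axis=0)
                         for c in np.unique(labels)}
            return pca, centroids

        def classify(pca, centroids, spectrum):
            z = pca.transform(np.asarray(spectrum).reshape(1, -1))[0]
            return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))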

  11. Tank Farm Operations Surveillance Automation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MARQUEZ, D.L.

    2000-12-21

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities.

  12. Tank Farm Operations Surveillance Automation Analysis

    International Nuclear Information System (INIS)

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities

  13. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as Quantitative Spectrographic Analysis. Quantitative Spectrographic Analysis is carried out in 3 steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to obtain a graph. It is possible to determine the density of a dark spectral line against the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation, and reporting of results is done by means of a computer (PC) and a computer program. The signal conditioning circuits have the function of delivering TTL (Transistor Transistor Logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
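
    Steps 2 and 3 amount to fitting and inverting a working curve; a minimal sketch with toy standards (assuming a log-log linear working curve):

        import numpy as np

        intensity = np.array([0.12, 0.25, 0.48, 0.95, 1.90])   # line intensities
        conc      = np.array([1.0, 2.0, 4.0, 8.0, 16.0])       # known ppm

        slope, intercept = np.polyfit(np.log10(intensity), np.log10(conc), 1)

        def concentration(measured_intensity):
            """Invert the fitted working curve for an unknown sample."""
            return 10 ** (slope * np.log10(measured_intensity) + intercept)

        print(f"{concentration(0.60):.2f} ppm")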

  14. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  15. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations on solar-type stars has revealed a need for automated analysis tools. The reason for this is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open... are calculated in a consistent way. Here we present a set of automated asteroseismic analysis tools. The main engine of this set of tools is an algorithm for modelling the autocovariance spectra of the stellar acoustic spectra, allowing us to measure not only the frequency of maximum power and the large..., radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations, which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity...

  16. Multifractal analysis of complex networks

    Institute of Scientific and Technical Information of China (English)

    Wang Dan-Ling; Yu Zu-Guo; Anh V

    2012-01-01

    Complex networks have recently attracted much attention in diverse areas of science and technology. Many networks such as the WWW and biological networks are known to display spatial heterogeneity which can be characterized by their fractal dimensions. Multifractal analysis is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is used to calculate the generalized fractal dimensions Dq of some theoretical networks, namely scale-free networks, small world networks, and random networks, and one kind of real network, namely protein-protein interaction networks of different species. Our numerical results indicate the existence of multifractality in scale-free networks and protein-protein interaction networks, while the multifractal behavior is not clear-cut for small world networks and random networks. The possible variation of Dq due to changes in the parameters of the theoretical network models is also discussed.

  17. Automated analysis of Xe-133 pulmonary ventilation (AAPV) in children

    Science.gov (United States)

    Cao, Xinhua; Treves, S. Ted

    2011-03-01

    In this study, an automated analysis of pulmonary ventilation (AAPV) was developed to visualize the ventilation in pediatric lungs using dynamic Xe-133 scintigraphy. AAPV is a software algorithm that converts a dynamic series of Xe-133 images into four functional images: equilibrium, washout halftime, residual, and clearance rate, by analyzing pixel-based activity. Compared to conventional methods of calculating global or regional ventilation parameters, AAPV provides a visual representation of pulmonary ventilation functions.
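
    One of the four functional images can be sketched as a per-pixel fit (my illustration, assuming mono-exponential washout and a fixed frame interval): the washout half-time follows from the slope of log counts versus time.

        import numpy as np

        def halftime_map(frames, dt_seconds=10.0):
            """frames: (T, H, W) array of washout-phase counts."""
            T, H, W = frames.shape
            t = np.arange(T) * dt_seconds
            logc = np.log(np.clip(frames.reshape(T, -1), 1e-3, None))
            slope = np.polyfit(t, logc, 1)[0]          # one slope per pixel
            with np.errstate(divide="ignore"):
                half = np.where(slope < 0, np.log(2) / -slope, np.inf)
            return half.reshape(H, W)                  # seconds, per pixel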

  18. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...

  19. A Method of Automated Nonparametric Content Analysis for Social Science

    OpenAIRE

    Hopkins, Daniel J.; King, Gary

    2010-01-01

    The increasing availability of digitized text presents enormous opportunities for social scientists. Yet hand coding many blogs, speeches, government records, newspapers, or other sources of unstructured text is infeasible. Although computer scientists have methods for automated content analysis, most are optimized to classify individual documents, whereas social scientists instead want generalizations about the population of documents, such as the proportion in a given category. Unfortunatel...

  20. Automated Cardiac Beat Classification Using RBF Neural Networks

    Directory of Open Access Journals (Sweden)

    Ali Khazaee

    2013-04-01

    Full Text Available This paper proposes a four-stage denoising, feature extraction, optimization and classification method for the detection of premature ventricular contractions. In the first stage, we investigate the application of wavelet denoising for noise reduction of multi-channel high resolution ECG signals; the Stationary Wavelet Transform is used. The feature extraction module extracts ten ECG morphological features and one timing interval feature. Then a number of radial basis function (RBF) neural networks with different values of the spread parameter are designed, and their ability to classify three different classes of ECG signals is compared. A Genetic Algorithm is used to find the best values of the RBF parameters. A classification accuracy of 100% for the training dataset and 95.66% for the testing dataset, and an overall detection accuracy of 95.83%, were achieved over seven files from the MIT/BIH arrhythmia database.
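
    A minimal numpy sketch of an RBF network of the kind described (fixed Gaussian centers and a regularized least-squares solve for the output weights; the paper instead tunes the spread with a genetic algorithm):

        import numpy as np

        class RBFNet:
            def __init__(self, centers, spread=1.0, ridge=1e-6):
                self.c = np.asarray(centers, dtype=float)
                self.s, self.ridge = spread, ridge

            def _phi(self, X):
                # Gaussian activations: one column per center.
                d2 = ((np.asarray(X, float)[:, None, :] - self.c[None]) ** 2).sum(-1)
                return np.exp(-d2 / (2 * self.s ** 2))

            def fit(self, X, y):
                P = self._phi(X)
                A = P.T @ P + self.ridge * np.eye(P.shape[1])
                self.w = np.linalg.solve(A, P.T @ np.asarray(y, float))
                return self

            def predict(self, X):
                return self._phi(X) @ self.w   # threshold for class labels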

  1. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  2. Location-Based Self-Adaptive Routing Algorithm for Wireless Sensor Networks in Home Automation

    OpenAIRE

    Hong SeungHo; Li XiaoHui; Fang KangLing

    2011-01-01

    The use of wireless sensor networks in home automation (WSNHA) is attractive due to their characteristics of self-organization, high sensing fidelity, low cost, and potential for rapid deployment. Although the AODVjr routing algorithm in IEEE 802.15.4/ZigBee and other routing algorithms have been designed for wireless sensor networks, not all are suitable for WSNHA. In this paper, we propose a location-based self-adaptive routing algorithm for WSNHA called WSNHA-LBAR. It confines route disco...

  3. Deep convolutional networks for automated detection of posterior-element fractures on spine CT

    Science.gov (United States)

    Roth, Holger R.; Wang, Yinong; Yao, Jianhua; Lu, Le; Burns, Joseph E.; Summers, Ronald M.

    2016-03-01

    Injuries of the spine, and its posterior elements in particular, are a common occurrence in trauma patients, with potentially devastating consequences. Computer-aided detection (CADe) could assist in the detection and classification of spine fractures. Furthermore, CAD could help assess the stability and chronicity of fractures, as well as facilitate research into optimization of treatment paradigms. In this work, we apply deep convolutional networks (ConvNets) for the automated detection of posterior element fractures of the spine. First, the vertebra bodies of the spine with its posterior elements are segmented in spine CT using multi-atlas label fusion. Then, edge maps of the posterior elements are computed. These edge maps serve as candidate regions for predicting a set of probabilities for fractures along the image edges using ConvNets in a 2.5D fashion (three orthogonal patches in axial, coronal and sagittal planes). We explore three different methods for training the ConvNet using 2.5D patches along the edge maps of `positive', i.e. fractured posterior-elements and `negative', i.e. non-fractured elements. An experienced radiologist retrospectively marked the location of 55 displaced posterior-element fractures in 18 trauma patients. We randomly split the data into training and testing cases. In testing, we achieve an area-under-the-curve of 0.857. This corresponds to 71% or 81% sensitivities at 5 or 10 false-positives per patient, respectively. Analysis of our set of trauma patients demonstrates the feasibility of detecting posterior-element fractures in spine CT images using computer vision techniques such as deep convolutional networks.
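
    The 2.5D sampling can be sketched in a few lines (assuming candidate voxels at least `half` voxels from the volume border; boundary handling omitted):

        import numpy as np

        def patches_25d(volume, z, y, x, half=16):
            """volume: (Z, Y, X) CT array; returns 3 orthogonal patches."""
            axial    = volume[z, y - half:y + half, x - half:x + half]
            coronal  = volume[z - half:z + half, y, x - half:x + half]
            sagittal = volume[z - half:z + half, y - half:y + half, x]
            return np.stack([axial, coronal, sagittal])  # ConvNet input channels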

  4. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, the laboratory currently has modern equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides concordant and discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 slides of hematological parameters were analyzed. Automated analysis was performed on last-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the figurative elements of the blood in a universe of 22 parameters. The microscopy was performed by two experts in microscopy simultaneously. Results: The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells 43.70% (n = 250), white blood cells 38.46% (n = 220), and platelet counts 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual, and cannot be explained because they have not been investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, causing a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  5. Design of Networked Home Automation System Based on μCOS-II and AMAZON

    Directory of Open Access Journals (Sweden)

    Liu Jianfeng

    2015-01-01

    Full Text Available In recent years, with the popularity of computers and smart phones and the development of intelligent buildings in the electronics industry, people's requirements for their living environment have gradually changed, and the intelligent home has become a new focus for home buyers. A networked home automation system, which relies on advanced network technology to connect air conditioning, lighting, security, curtains, TV, water heater and other home subsystems into a local area network, becomes a networked control system. μC/OS is a real-time operating system with free open-source code, a compact structure and a preemptive real-time kernel. In this paper, the author focuses on the design of a whole-home controller based on the AMAZON multimedia processor and the μC/OS-II real-time operating system, and achieves remote access connection and control through Ethernet.

  6. Location-Based Self-Adaptive Routing Algorithm for Wireless Sensor Networks in Home Automation

    Directory of Open Access Journals (Sweden)

    Hong SeungHo

    2011-01-01

    Full Text Available The use of wireless sensor networks in home automation (WSNHA) is attractive due to their characteristics of self-organization, high sensing fidelity, low cost, and potential for rapid deployment. Although the AODVjr routing algorithm in IEEE 802.15.4/ZigBee and other routing algorithms have been designed for wireless sensor networks, not all are suitable for WSNHA. In this paper, we propose a location-based self-adaptive routing algorithm for WSNHA called WSNHA-LBAR. It confines route discovery flooding to a cylindrical request zone, which reduces the routing overhead and decreases broadcast storm problems in the MAC layer. It also automatically adjusts the size of the request zone using a self-adaptive algorithm based on Bayes' theorem. This makes WSNHA-LBAR more adaptable to changes of the network state and easier to implement. Simulation results show improved network reliability as well as reduced routing overhead.
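
    The self-adaptive step can be pictured with an illustrative Bayesian sketch (my simplification, not the paper's exact formulas): treat each route discovery inside the current zone as a Bernoulli trial, keep a Beta posterior on its success probability, and resize the request zone from the posterior mean.

        class AdaptiveZone:
            def __init__(self, radius=50.0):
                self.radius = radius
                self.alpha, self.beta = 1.0, 1.0           # Beta(1,1) prior

            def update(self, discovery_succeeded):
                if discovery_succeeded:
                    self.alpha += 1.0
                else:
                    self.beta += 1.0
                p = self.alpha / (self.alpha + self.beta)  # posterior mean
                if p > 0.8:
                    self.radius *= 0.9                     # shrink: flooding rarely needed
                elif p < 0.5:
                    self.radius *= 1.2                     # grow: discoveries keep failing
                return self.radius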

  7. Automated morphological classification of APM galaxies by supervised artificial neural networks

    CERN Document Server

    Naim, A; Sodré, L; Storrie-Lombardi, M C; Naim, A; Lahav, O; Sodre, L; Storrie-Lombardi, M C

    1995-01-01

    We train Artificial Neural Networks to classify galaxies based solely on the morphology of the galaxy images as they appear on blue survey plates. The images are reduced and morphological features such as bulge size and the number of arms are extracted, all in a fully automated manner. The galaxy sample was first classified by 6 independent experts. We use several definitions for the mean type of each galaxy, based on those classifications. We then train and test the network on these features. We find that the rms error of the network classifications, as compared with the mean types of the expert classifications, is 1.8 Revised Hubble Types. This is comparable to the overall rms dispersion between the experts. This result is robust and almost completely independent of the network architecture used.

  8. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  9. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    Directory of Open Access Journals (Sweden)

    JangMook Kang

    2010-09-01

    Full Text Available In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node softwares at a time in a ubiquitous sensor network environment.

  10. Automated construction of node software using attributes in a ubiquitous sensor network environment.

    Science.gov (United States)

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric-the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node softwares at a time in a ubiquitous sensor network environment. PMID:22163678

  11. Network topology analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kalb, Jeffrey L.; Lee, David S.

    2008-01-01

    Emerging high-bandwidth, low-latency network technology has made network-based architectures both feasible and potentially desirable for use in satellite payload architectures. The selection of network topology is a critical component when developing these multi-node or multi-point architectures. This study examines network topologies and their effect on overall network performance. Numerous topologies were reviewed against a number of performance, reliability, and cost metrics. This document identifies a handful of good network topologies for satellite applications and the metrics used to justify them as such. Since often multiple topologies will meet the requirements of the satellite payload architecture under development, the choice of network topology is not easy, and in the end the choice of topology is influenced by both the design characteristics and requirements of the overall system and the experience of the developer.

  12. Automated road network extraction from high spatial resolution multi-spectral imagery

    Science.gov (United States)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. Finally, the road centerline segments are grouped into a
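
    The first stage alone is easy to sketch (spectral k-means over pixels; the cluster count is illustrative, and the road cluster would then be picked by the fuzzy spectral rules described above):

        import numpy as np
        from sklearn.cluster import KMeans

        def segment(image, n_clusters=8):
            """image: (H, W, B) array with B spectral bands."""
            H, W, B = image.shape
            pixels = image.reshape(-1, B).astype(float)
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(pixels)
            return labels.reshape(H, W)   # per-pixel cluster map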

  13. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  14. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, and 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the validity of the solutions for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  15. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Full Text Available Abstract Background Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large numbers of microarray datasets are now available in several databases, and new experiments are constantly being performed. In spite of this fact, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results To address these problems we have developed the automated microarray data analysis (AMDA) software, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps, performing a full data analysis, including image analysis, quality controls, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally a LaTeX document is dynamically generated depending on the performed analysis steps. The generated report contains comments and analysis results as well as references to several files for deeper investigation. Conclusion AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis http://www.genopolis.it/

  16. Attempt of automated space network operations at ETS-VI experimental data relay system

    Science.gov (United States)

    Ishihara, Kiyoomi; Sugawara, Masayuki

    1994-01-01

    The National Space Development Agency of Japan (NASDA) is to perform experimental operations to acquire the technology necessary for future inter-satellite communications via a data relay satellite. This paper overviews the functions of the experimental ground system which NASDA has developed for the Engineering Test Satellite VI (ETS-VI) Data Relay and Tracking Experiment, and introduces the Space Network System Operations Procedure (SNSOP) method with an example of a Ka-band Single Access (KSA) acquisition sequence. To reduce operational load, SNSOP is built around the concept of automated control and monitoring of both the ground terminal and the data relay satellite. To perform acquisition and tracking operations smoothly, the information exchange with user spacecraft controllers is automated by SNSOP functions.

  17. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Directory of Open Access Journals (Sweden)

    Y. Mao

    2014-07-01

    An advanced distributed automated extraction of drainage network model (Adam) was proposed in the study. The Adam model has two features: (1) searching upward from the outlet of the basin instead of sink filling, and (2) dividing sub-basins on a low-resolution DEM, and then extracting the drainage network on sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).
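
    The "search upward from the outlet" idea can be illustrated with a toy sketch (not the Adam code; the D8 direction encoding and the grid are assumptions for illustration): starting at the outlet cell, walk against the D8 flow directions to collect every cell draining to it, with no prior sink filling.

    ```python
    from collections import deque
    import numpy as np

    # D8 codes: code -> (drow, dcol) of the downstream neighbor (assumed encoding)
    D8 = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

    def upstream_cells(flow_dir, outlet):
        """Collect all cells whose D8 flow path reaches `outlet` (BFS against flow)."""
        rows, cols = flow_dir.shape
        basin, queue = {outlet}, deque([outlet])
        while queue:
            r, c = queue.popleft()
            for code, (dr, dc) in enumerate(D8):
                rr, cc = r - dr, c - dc  # cell that would flow into (r, c) via `code`
                if (0 <= rr < rows and 0 <= cc < cols
                        and flow_dir[rr, cc] == code and (rr, cc) not in basin):
                    basin.add((rr, cc))
                    queue.append((rr, cc))
        return basin

    # 3x3 toy grid in which every cell drains toward the bottom-right outlet
    fd = np.array([[3, 3, 4],
                   [2, 3, 4],
                   [2, 2, 3]])
    print(len(upstream_cells(fd, (2, 2))))  # 9: the whole grid drains to the outlet
    ```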

  18. Analysis and simulation of a torque assist automated manual transmission

    Science.gov (United States)

    Galvagno, E.; Velardocchia, M.; Vigliani, A.

    2011-08-01

    The paper presents the kinematic and dynamic analysis of a power-shift automated manual transmission (AMT) characterised by a wet clutch, called the assist clutch (ACL), replacing the fifth-gear synchroniser. This torque assist mechanism becomes a torque transfer path during gearshifts, in order to overcome a typical dynamic problem of AMTs, that is, the interruption of driving force. The mean power contributions during gearshifts are computed for different engine and ACL interventions, allowing useful conclusions to be drawn for developing the control algorithms. The simulation results prove the advantages in terms of gearshift quality and ride comfort of the analysed transmission.

  19. Automated target recognition and tracking using an optical pattern recognition neural network

    Science.gov (United States)

    Chao, Tien-Hsin

    1991-01-01

    The on-going development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that is an integration of an innovative optical parallel processor and a feature extraction based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of differences in scale, rotation, perspective, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature extraction based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.
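
    The shift-invariant matching that the multichannel correlator performs optically has a simple digital analogue, FFT-based cross-correlation; the sketch below (an illustration, not the OPRNN itself) locates a template in a scene by the position of the correlation peak.

    ```python
    import numpy as np

    def correlate_and_locate(scene, template):
        """Return the (row, col) shift that maximizes the cross-correlation."""
        F = np.fft.fft2(scene)
        H = np.fft.fft2(template, s=scene.shape)   # zero-pad template to scene size
        corr = np.fft.ifft2(F * np.conj(H)).real   # circular cross-correlation
        return np.unravel_index(np.argmax(corr), corr.shape)

    scene = np.zeros((128, 128))
    scene[40:48, 70:78] = 1.0                      # a "target" embedded in the scene
    template = np.ones((8, 8))
    print(correlate_and_locate(scene, template))   # peak near (40, 70)
    ```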

  20. Introduction to Social Network Analysis

    Science.gov (United States)

    Zaphiris, Panayiotis; Ang, Chee Siang

    Social network analysis focuses on patterns of relations between and among people, organizations, states, etc. It aims to describe networks of relations as fully as possible, identify prominent patterns in such networks, trace the flow of information through them, and discover what effects these relations and networks have on people and organizations. Social network analysis offers a very promising potential for analyzing human-human interactions in online communities (discussion boards, newsgroups, virtual organizations). This tutorial provides an overview of this analytic technique and demonstrates how it can be used in Human Computer Interaction (HCI) research and practice, focusing especially on Computer Mediated Communication (CMC). This topic acquires particular importance these days, with the increasing popularity of social networking websites (e.g., YouTube, MySpace, MMORPGs) and the research interest in studying them.
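
    As a small illustration of the measures such an analysis typically computes, the sketch below builds a hypothetical "who replied to whom" network for a discussion board (made-up data, not from the tutorial) and asks who receives the most replies and who brokers information flow, using the networkx library.

    ```python
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("ann", "bob"), ("bob", "ann"), ("carol", "ann"),
        ("dave", "ann"), ("dave", "bob"), ("erin", "dave"),
    ])

    print(nx.in_degree_centrality(g))    # who receives the most replies
    print(nx.betweenness_centrality(g))  # who sits on paths between others
    ```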

  1. Tourism Destinations Network Analysis, Social Network Analysis Approach

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Full Text Available The tourism industry is becoming one of the world's largest economic sectors, and is expected to become the world's largest industry by 2020. Previous studies have focused on several aspects of this industry, including sociology, geography, and tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces network analysis techniques and measures aimed at studying the structural characteristics of tourism networks. More specifically, it presents a methodology to analyze a tourism destination network. We apply the methodology to analyze Mazandaran's tourism destination network, one of the most famous tourism areas of Iran.

  2. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities.

    Science.gov (United States)

    Gomez, Carles; Paradells, Josep

    2015-09-10

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  3. Automated system for load flow prediction in power substations using artificial neural networks

    Directory of Open Access Journals (Sweden)

    Arlys Michel Lastre Aleaga

    2015-09-01

    Full Text Available The load flow is of great importance in assisting the process of decision making and planning of generation, distribution and transmission of electricity. Not knowing the values of this indicator, or predicting them inaccurately, hampers decision making and the efficiency of the electricity service, and can cause undesirable situations such as unmet demand, overheating of the components that make up a substation, and incorrect planning of electricity generation and distribution processes. Given the need to predict the load flow of substations in Ecuador, this research proposes the concept for the development of an automated prediction system employing artificial neural networks.
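
    A minimal sketch of such a predictor is given below, using scikit-learn's multilayer perceptron; the input features (hour of day, day of week, ambient temperature) and the synthetic load data are assumptions for illustration, not taken from the paper.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical features: hour of day, day of week, ambient temperature.
    X = rng.uniform([0, 0, 10], [24, 7, 35], size=(500, 3))
    # Synthetic load with a daily cycle plus a temperature effect.
    y = 50 + 20 * np.sin(2 * np.pi * X[:, 0] / 24) + 0.8 * X[:, 2] \
        + rng.normal(0, 2, 500)

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    )
    model.fit(X[:400], y[:400])
    print("held-out R^2:", model.score(X[400:], y[400:]))
    ```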

  4. A New Modular Strategy For Action Sequence Automation Using Neural Networks And Hidden Markov Models

    OpenAIRE

    Mohamed Adel Taher; Mostapha Abdeljawad

    2013-01-01

    In this paper, the authors propose a new hybrid strategy (using artificial neural networks and hidden Markov models) for skill automation. The strategy is based on the concept of an "adaptive desired" that is introduced in the paper. The authors explain how using an adaptive desired can help a system, for which an explicit model is not available or is difficult to obtain, to smartly cope with environmental disturbances without requiring explicit rules specification (as with fuzzy systems).

  5. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities

    Directory of Open Access Journals (Sweden)

    Carles Gomez

    2015-09-01

    Full Text Available Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  6. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities.

    Science.gov (United States)

    Gomez, Carles; Paradells, Josep

    2015-01-01

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost. PMID:26378534

  7. Automated quantitative analysis of ventilation-perfusion lung scintigrams

    International Nuclear Information System (INIS)

    An automated computer analysis of ventilation (Kr-81m) and perfusion (Tc-99m) lung images has been devised that produces a graphical image of the distribution of ventilation and perfusion, and of ventilation-perfusion ratios. The analysis has overcome the following problems: the identification of the midline between two lungs and the lung boundaries, the exclusion of extrapulmonary radioactivity, the superimposition of lung images of different sizes, and the format for presentation of the data. Therefore, lung images of different sizes and shapes may be compared with each other. The analysis has been used to develop normal ranges from 55 volunteers. Comparison of younger and older age groups of men and women shows small but significant differences in the distribution of ventilation and perfusion, but no differences in ventilation-perfusion ratios

  8. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  9. Trends in biomedical informatics: automated topic analysis of JAMIA articles.

    Science.gov (United States)

    Han, Dong; Wang, Shuang; Jiang, Chao; Jiang, Xiaoqian; Kim, Hyeon-Eui; Sun, Jimeng; Ohno-Machado, Lucila

    2015-11-01

    Biomedical Informatics is a growing interdisciplinary field in which research topics and citation trends have been evolving rapidly in recent years. To analyze these data in a fast, reproducible manner, automation of certain processes is needed. JAMIA is a "generalist" journal for biomedical informatics. Its articles reflect the wide range of topics in informatics. In this study, we retrieved Medical Subject Headings (MeSH) terms and citations of JAMIA articles published between 2009 and 2014. We used tensors (i.e., multidimensional arrays) to represent the interaction among topics, time and citations, and applied tensor decomposition to automate the analysis. The trends represented by tensors were then carefully interpreted and the results were compared with previous findings based on manual topic analysis. A list of the most cited JAMIA articles, their topics, and publication trends over recent years is presented. The analyses confirmed previous studies and showed that, from 2012 to 2014, articles related to the MeSH terms Methods, Organization & Administration, and Algorithms increased significantly in both number of publications and citations. Citation trends varied widely by topic, with Natural Language Processing having a large number of citations in particular years, and Medical Record Systems, Computerized remaining a very popular topic in all years.
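
    A minimal sketch of the tensor-based approach appears below; the counts are synthetic, and the use of the tensorly library, the tensor layout (topic x year x citation bucket) and the rank are assumptions for illustration.

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(1)
    # Hypothetical counts: 20 MeSH topics x 6 years x 5 citation buckets.
    tensor = tl.tensor(rng.poisson(3.0, size=(20, 6, 5)).astype(float))

    # CP (PARAFAC) decomposition into 3 rank-one components ("trends").
    weights, factors = parafac(tensor, rank=3, init="random", random_state=1)
    topic_f, year_f, cite_f = factors
    print(topic_f.shape, year_f.shape, cite_f.shape)  # (20, 3) (6, 3) (5, 3)
    # Each component couples a topic profile, a time profile and a citation
    # profile, which is what makes the extracted trends directly interpretable.
    ```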

  10. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period, daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  11. Automated Signature Creator for a Signature Based Intrusion Detection System with Network Attack Detection Capabilities (Pancakes)

    Directory of Open Access Journals (Sweden)

    Frances Bernadette C. De Ocampo

    2015-05-01

    Full Text Available A Signature-based Intrusion Detection System (IDS) helps maintain the integrity of data in a network-controlled environment. Unfortunately, this type of IDS depends on predetermined intrusion patterns that are manually created. If the signature database of the Signature-based IDS is not updated, network attacks simply pass through this type of IDS unnoticed. To avoid this, an Anomaly-based IDS is used to countercheck whether network traffic that is not detected by the Signature-based IDS is truly malicious. In doing so, the Anomaly-based IDS may produce numerous logs of network attacks, some of which may be false positives: it readily flags traffic as an attack simply because the traffic is not part of its baseline. To resolve the problem between these two IDSs, the goal is to correlate the logs of the Anomaly-based IDS with the captured packets in order to determine whether network traffic is genuinely malicious. Under the supervision of a security expert, the detected malicious network traffic is verified as malicious. Using machine learning, the researchers can identify which algorithm best classifies whether a given network traffic is really malicious. Signatures are then created automatically from the detected malicious traffic.
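
    The pipeline can be sketched as follows (a toy illustration with hypothetical flow features and thresholds, not the Pancakes implementation): a classifier trained on expert-labeled flows screens the anomaly-IDS output, and only high-confidence detections become signature entries.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Flow features: [dst_port, packet_count, mean_payload_len] -- assumed schema.
    X_train = np.array([[80, 10, 400], [443, 12, 500], [23, 300, 40], [23, 280, 35]])
    y_train = np.array([0, 0, 1, 1])   # 0 = benign, 1 = malicious (expert-labeled)

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

    def make_signatures(flagged_flows):
        """Emit (dst_port, packet_count) signatures for confidently malicious flows."""
        sigs = set()
        for flow in flagged_flows:
            p_malicious = clf.predict_proba([flow])[0][1]
            if p_malicious > 0.9:      # keep only high-confidence detections
                sigs.add((int(flow[0]), int(flow[1])))
        return sigs

    print(make_signatures([[23, 310, 38], [80, 11, 420]]))
    ```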

  12. Web Services Dependency Networks Analysis

    CERN Document Server

    Cherifi, Chantal; Santucci, Jean-François

    2013-01-01

    Along with a continuously growing number of publicly available Web services (WS), we are witnessing a rapid development in semantic-related web technologies, which has led to the appearance of semantically described WS. In this work, we perform a comparative analysis of the syntactic and semantic approaches used to describe WS, from a complex network perspective. First, we extract syntactic and semantic WS dependency networks from a collection of publicly available WS descriptions. Then, we take advantage of tools from the complex network field to analyze them and determine their topological properties. We show WS dependency networks exhibit some of the typical characteristics observed in real-world networks, such as small world and scale free properties, as well as community structure. By comparing syntactic and semantic networks through their topological properties, we show the introduction of semantics in WS description allows modeling more accurately the dependencies between parameters, which in turn could l...

  13. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik;

    2015-01-01

    Background Antibiotics of the β-lactam group are able to alter the shape of the bacterial cell wall, e.g. filamentation or a spheroplast formation. Early determination of antimicrobial susceptibility may be complicated by filamentation of bacteria as this can be falsely interpreted as growth...... in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...... different resistant profiles and differences in filamentation kinetics were used to study a novel image analysis algorithm to quantify length of bacteria and bacterial filamentation. A total of 12 β-lactam antibiotics or β-lactam–β-lactamase inhibitor combinations were analyzed for their ability to induce...

  14. AutoGate: automating analysis of flow cytometry data.

    Science.gov (United States)

    Meehan, Stephen; Walther, Guenther; Moore, Wayne; Orlova, Darya; Meehan, Connor; Parks, David; Ghosn, Eliver; Philips, Megan; Mitsunaga, Erin; Waters, Jeffrey; Kantor, Aaron; Okamura, Ross; Owumi, Solomon; Yang, Yang; Herzenberg, Leonard A; Herzenberg, Leonore A

    2014-05-01

    Nowadays, one can hardly imagine biology and medicine without flow cytometry to measure CD4 T cell counts in HIV, follow bone marrow transplant patients, characterize leukemias, etc. Similarly, without flow cytometry, there would be a bleak future for stem cell deployment, HIV drug development and full characterization of the cells and cell interactions in the immune system. But while flow instruments have improved markedly, the development of automated tools for processing and analyzing flow data has lagged sorely behind. To address this deficit, we have developed automated flow analysis software technology, provisionally named AutoComp and AutoGate. AutoComp acquires sample and reagent labels from users or flow data files, and uses this information to complete the flow data compensation task. AutoGate replaces the manual subsetting capabilities provided by current analysis packages with newly defined statistical algorithms that automatically and accurately detect, display and delineate subsets in well-labeled and well-recognized formats (histograms, contour and dot plots). Users guide analyses by successively specifying axes (flow parameters) for data subset displays and selecting statistically defined subsets to be used for the next analysis round. Ultimately, this process generates analysis "trees" that can be applied to automatically guide analyses for similar samples. The first AutoComp/AutoGate version is currently in the hands of a small group of users at Stanford, Emory and NIH. When this "early adopter" phase is complete, the authors expect to distribute the software free of charge to .edu, .org and .gov users.

  15. Fully automated diabetic retinopathy screening using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Pourreza, Hamid-Reza; Banaee, Touka

    2015-07-01

    Diabetic retinopathy is the major cause of blindness in the world. It has been shown that early diagnosis can play a major role in prevention of visual loss and blindness. This diagnosis can be made through regular screening and timely treatment. Besides, automation of this process can significantly reduce the work of ophthalmologists and alleviate inter and intra observer variability. This paper provides a fully automated diabetic retinopathy screening system with the ability of retinal image quality assessment. The novelty of the proposed method lies in the use of Morphological Component Analysis (MCA) algorithm to discriminate between normal and pathological retinal structures. To this end, first a pre-screening algorithm is used to assess the quality of retinal images. If the quality of the image is not satisfactory, it is examined by an ophthalmologist and must be recaptured if necessary. Otherwise, the image is processed for diabetic retinopathy detection. In this stage, normal and pathological structures of the retinal image are separated by MCA algorithm. Finally, the normal and abnormal retinal images are distinguished by statistical features of the retinal lesions. Our proposed system achieved 92.01% sensitivity and 95.45% specificity on the Messidor dataset which is a remarkable result in comparison with previous work. PMID:25863517

  16. Automated retinal image analysis for diabetic retinopathy in telemedicine.

    Science.gov (United States)

    Sim, Dawn A; Keane, Pearse A; Tufail, Adnan; Egan, Catherine A; Aiello, Lloyd Paul; Silva, Paolo S

    2015-03-01

    There will be an estimated 552 million persons with diabetes globally by the year 2030. Over half of these individuals will develop diabetic retinopathy, representing a nearly insurmountable burden for providing diabetes eye care. Telemedicine programmes have the capability to distribute quality eye care to virtually any location and address the lack of access to ophthalmic services. In most programmes, there is currently a heavy reliance on specially trained retinal image graders, a resource in short supply worldwide. These factors necessitate an image grading automation process to increase the speed of retinal image evaluation while maintaining accuracy and cost effectiveness. Several automatic retinal image analysis systems designed for use in telemedicine have recently become commercially available. Such systems have the potential to substantially improve the manner by which diabetes eye care is delivered by providing automated real-time evaluation to expedite diagnosis and referral if required. Furthermore, integration with electronic medical records may allow a more accurate prognostication for individual patients and may provide predictive modelling of medical risk factors based on broad population data. PMID:25697773

  17. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools: the Automated Code Translation Tool (ACm), an Ada statement template generator; the Fault Tree Editor (Fm), a graphical fault tree editor; and the Fault Isolator (Fl), an automated software fault tree isolator. These previous works d...

  18. Computational Social Network Analysis

    CERN Document Server

    Hassanien, Aboul-Ella

    2010-01-01

    Presents insight into the social behaviour of animals (including the study of animal tracks and learning by members of the same species). This book provides web-based evidence of social interaction, perceptual learning, information granulation and the behaviour of humans and affinities between web-based social networks.

  19. Malaria: the value of the automated depolarization analysis.

    Science.gov (United States)

    Josephine, F P; Nissapatorn, V

    2005-01-01

    This retrospective and descriptive study was carried out in the University of Malaya Medical Center (UMMC) from January to September, 2004. This study aimed to evaluate the diagnostic utility of the Cell-Dyn 4000 hematology analyzer's depolarization analysis and to determine the sensitivity and specificity of this technique in the context of malaria diagnosis. A total of 889 cases presenting with pyrexia of unknown origin or clinically suspected of malaria were examined. Sixteen of these blood samples were found to be positive; 12 for P. vivax, 3 for P. malariae, and 1 for P. falciparum by peripheral blood smear as the standard technique for parasite detection and species identification. Demographic characteristics showed that the majority of patients were in the age range of 20-57, with a mean of 35.9 +/- 11.4 (SD) years, and were male foreign workers. These 16 positive blood samples were also processed by the Cell-Dyn 4000 analyzer in the normal complete blood count (CBC) operational mode. Malaria parasites produce hemozoin, which depolarizes light, and this allows the automated detection of malaria during routine complete blood count analysis with the Abbott Cell-Dyn CD4000 instrument. The white blood cell (WBC) differential plots of all malaria-positive samples showed abnormal depolarization events in the NEU-EOS and EOS I plots. This was not seen in the negative samples. In 12 patients with P. vivax infection, a cluster pattern in the NEU-EOS and EOS I plots was observed, color-coded green or black. In 3 patients with P. malariae infection, a few random depolarization events in the NEU-EOS and EOS I plots were seen, color-coded green, black or blue. In the patient with P. falciparum infection, the sample was color-coded green with a few random purple depolarizing events in the NEU-EOS and EOS I plots. This study confirms that automated depolarization analysis is a highly sensitive and specific method to diagnose whether or not a patient

  20. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the Catwalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod tests for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the Catwalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  1. Topological analysis of telecommunications networks

    Directory of Open Access Journals (Sweden)

    Milojko V. Jevtović

    2011-01-01

    Full Text Available A topological analysis of the structure of telecommunications networks is a very interesting topic in network research, but also a key issue in their design and planning. Satisfying multiple criteria in terms of the locations of switching nodes as well as their connectivity with respect to the requests for capacity, transmission speed, reliability, availability and cost is the main research objective. There are three ways of presenting the topology of telecommunications networks: the table, matrix or graph method. The table method is suitable for a network with a relatively small number of nodes in relation to the number of links. The matrix method involves the formation of a connection matrix in which its columns present source traffic nodes and its rows are the switching systems that belong to the destination. The method of the topology graph means that the network nodes are connected via directional or unidirectional links. We can thus easily analyze the structural parameters of telecommunications networks. This paper presents the mathematical analysis of the star-, ring-, fully connected loop- and grid (matrix)-shaped topologies as well as the topology based on the shortest path tree. For each of these topologies, the expressions for determining the number of branches, the mean level of reliability, and the mean path and link lengths are given in tables. For the fully connected loop network with five nodes the values of all topological parameters are calculated. Based on the topological parameters, the relationships that represent integral and distributed indicators of reliability are given in this work, as well as the values for the particular network. The main objectives of the topology optimization of telecommunications networks are: achieving minimum complexity, maximum capacity, the shortest path of message transfer, the maximum speed of communication and maximum economy. The performance of telecommunications networks is

  2. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    This paper presents the development of an automated generation of a new burnup chain for reactor analysis applications. The JENDL FP Decay Data File 2011 and Fission Yields Data File 2011 were used as the data sources. The nuclides in the new chain are determined by restrictions of the half-life and cumulative yield of fission products or from a given list. Then, decay modes, branching ratios and fission yields are recalculated taking into account intermediate reactions. The new burnup chain is output according to the format for the SRAC code system. Verification was performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference one with 193 fission products used in SRAC. Further development and applications are being planned with the burnup chain code. (author)

  3. Topology analysis of social networks extracted from literature.

    Science.gov (United States)

    Waumans, Michaël C; Nicodème, Thibaut; Bersini, Hugues

    2015-01-01

    In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story.
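
    A minimal sketch of the network-construction step is shown below; the scene/speaker input format is an assumption (the actual tool derives it from the dialogs). Characters speaking in the same scene are linked, with edge weights counting co-occurrences, so the graph, and hence the story's signature, can be tracked over time.

    ```python
    import itertools
    import networkx as nx

    scenes = [                     # hypothetical output of the dialog analyser
        ["Alice", "Bob"],
        ["Alice", "Bob", "Carol"],
        ["Carol", "Dan"],
    ]

    g = nx.Graph()
    for speakers in scenes:
        for a, b in itertools.combinations(sorted(set(speakers)), 2):
            w = g.get_edge_data(a, b, {"weight": 0})["weight"]
            g.add_edge(a, b, weight=w + 1)

    print(g.edges(data=True))
    print(nx.degree_centrality(g))  # one ingredient of a story's "signature"
    ```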

  4. Topology analysis of social networks extracted from literature.

    Directory of Open Access Journals (Sweden)

    Michaël C Waumans

    Full Text Available In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story.

  5. Topology analysis of social networks extracted from literature.

    Science.gov (United States)

    Waumans, Michaël C; Nicodème, Thibaut; Bersini, Hugues

    2015-01-01

    In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story. PMID:26039072

  6. Automated Large-Scale Shoreline Variability Analysis From Video

    Science.gov (United States)

    Pearre, N. S.

    2006-12-01

    Land-based video has been used to quantify changes in nearshore conditions for over twenty years. By combining the ability to track rapid, short-term shoreline change and changes associated with longer term or seasonal processes, video has proved to be a cost effective and versatile tool for coastal science. Previous video-based studies of shoreline change have typically examined the position of the shoreline along a small number of cross-shore lines as a proxy for the continuous coast. The goal of this study is twofold: (1) to further develop automated shoreline extraction algorithms for continuous shorelines, and (2) to track the evolution of a nourishment project at Rehoboth Beach, DE that was concluded in June 2005. Seven cameras are situated approximately 30 meters above mean sea level and 70 meters from the shoreline. Time exposure and variance images are captured hourly during daylight and transferred to a local processing computer. After correcting for lens distortion and geo-rectifying to a shore-normal coordinate system, the images are merged to form a composite planform image of 6 km of coast. Automated extraction algorithms establish shoreline and breaker positions throughout a tidal cycle on a daily basis. Short and long term variability in the daily shoreline will be characterized using empirical orthogonal function (EOF) analysis. Periodic sediment volume information will be extracted by incorporating the results of monthly ground-based LIDAR surveys and by correlating the hourly shorelines to the corresponding tide level under conditions with minimal wave activity. The Delaware coast in the area downdrift of the nourishment site is intermittently interrupted by short groins. An Even/Odd analysis of the shoreline response around these groins will be performed. The impact of groins on the sediment volume transport along the coast during periods of accretive and erosive conditions will be discussed. [This work is being supported by DNREC and the
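
    The EOF analysis mentioned above is, in essence, a principal component decomposition of the shoreline matrix; below is a minimal sketch on synthetic data (days x alongshore positions, with a seasonal signal) via the SVD.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    days, points = 200, 300
    x = np.linspace(0, 6, points)
    seasonal = np.sin(2 * np.pi * np.arange(days) / 180)[:, None] * np.cos(x)[None, :]
    shoreline = 50 + 5 * seasonal + rng.normal(0, 0.5, (days, points))

    anom = shoreline - shoreline.mean(axis=0)    # remove the mean shoreline
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)

    print(f"variance explained by mode 1: {100 * var_frac[0]:.0f}%")
    eof1 = Vt[0]                  # dominant spatial pattern of variability
    amp1 = U[:, 0] * s[0]         # its amplitude through time
    ```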

  7. galaxieEST: addressing EST identity through automated phylogenetic analysis

    Directory of Open Access Journals (Sweden)

    Larsson Karl-Henrik

    2004-07-01

    Full Text Available Abstract Background Research involving expressed sequence tags (ESTs is intricately coupled to the existence of large, well-annotated sequence repositories. Comparatively complete and satisfactory annotated public sequence libraries are, however, available only for a limited range of organisms, rendering the absence of sequences and gene structure information a tangible problem for those working with taxa lacking an EST or genome sequencing project. Paralogous genes belonging to the same gene family but distinguished by derived characteristics are particularly prone to misidentification and erroneous annotation; high but incomplete levels of sequence similarity are typically difficult to interpret and have formed the basis of many unsubstantiated assumptions of orthology. In these cases, a phylogenetic study of the query sequence together with the most similar sequences in the database may be of great value to the identification process. In order to facilitate this laborious procedure, a project to employ automated phylogenetic analysis in the identification of ESTs was initiated. Results galaxieEST is an open source Perl-CGI script package designed to complement traditional similarity-based identification of EST sequences through employment of automated phylogenetic analysis. It uses a series of BLAST runs as a sieve to retrieve nucleotide and protein sequences for inclusion in neighbour joining and parsimony analyses; the output includes the BLAST output, the results of the phylogenetic analyses, and the corresponding multiple alignments. galaxieEST is available as an on-line web service for identification of fungal ESTs and for download / local installation for use with any organism group at http://galaxie.cgb.ki.se/galaxieEST.html. Conclusions By addressing sequence relatedness in addition to similarity, galaxieEST provides an integrative view on EST origin and identity, which may prove particularly useful in cases where similarity searches

  8. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    14 CFR 1261.413 (Title 14, Aeronautics and Space; National Aeronautics and Space Administration; 2010 edition): Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

  9. Automated grading of left ventricular segmental wall motion by an artificial neural network using color kinesis images

    Directory of Open Access Journals (Sweden)

    L.O. Murta Jr.

    2006-01-01

    Full Text Available The present study describes an auxiliary tool in the diagnosis of left ventricular (LV) segmental wall motion (WM) abnormalities based on color-coded echocardiographic WM images. An artificial neural network (ANN) was developed and validated for grading LV segmental WM using data from color kinesis (CK) images, a technique developed to display the timing and magnitude of global and regional WM in real time. We evaluated 21 normal subjects and 20 patients with LVWM abnormalities revealed by two-dimensional echocardiography. CK images were obtained in two sets of viewing planes. A method was developed to analyze CK images, providing quantitation of fractional area change in each of the 16 LV segments. Two experienced observers analyzed LVWM from two-dimensional images and scored them as: 1) normal, 2) mild hypokinesia, 3) moderate hypokinesia, 4) severe hypokinesia, 5) akinesia, and 6) dyskinesia. Based on expert analysis of 10 normal subjects and 10 patients, we trained a multilayer perceptron ANN using a back-propagation algorithm to provide automated grading of LVWM, and this ANN was then tested in the remaining subjects. Excellent concordance between expert and ANN analysis was shown by ROC curve analysis, with measured area under the curve of 0.975. An excellent correlation was also obtained for global LV segmental WM index by expert and ANN analysis (R² = 0.99). In conclusion, ANN showed high accuracy for automated semi-quantitative grading of WM based on CK images. This technique can be an important aid, improving diagnostic accuracy and reducing inter-observer variability in scoring segmental LVWM.

  10. Automated grading of left ventricular segmental wall motion by an artificial neural network using color kinesis images.

    Science.gov (United States)

    Murta, L O; Ruiz, E E S; Pazin-Filho, A; Schmidt, A; Almeida-Filho, O C; Simões, M V; Marin-Neto, J A; Maciel, B C

    2006-01-01

    The present study describes an auxiliary tool in the diagnosis of left ventricular (LV) segmental wall motion (WM) abnormalities based on color-coded echocardiographic WM images. An artificial neural network (ANN) was developed and validated for grading LV segmental WM using data from color kinesis (CK) images, a technique developed to display the timing and magnitude of global and regional WM in real time. We evaluated 21 normal subjects and 20 patients with LVWM abnormalities revealed by two-dimensional echocardiography. CK images were obtained in two sets of viewing planes. A method was developed to analyze CK images, providing quantitation of fractional area change in each of the 16 LV segments. Two experienced observers analyzed LVWM from two-dimensional images and scored them as: 1) normal, 2) mild hypokinesia, 3) moderate hypokinesia, 4) severe hypokinesia, 5) akinesia, and 6) dyskinesia. Based on expert analysis of 10 normal subjects and 10 patients, we trained a multilayer perceptron ANN using a back-propagation algorithm to provide automated grading of LVWM, and this ANN was then tested in the remaining subjects. Excellent concordance between expert and ANN analysis was shown by ROC curve analysis, with measured area under the curve of 0.975. An excellent correlation was also obtained for global LV segmental WM index by expert and ANN analysis (R2 = 0.99). In conclusion, ANN showed high accuracy for automated semi-quantitative grading of WM based on CK images. This technique can be an important aid, improving diagnostic accuracy and reducing inter-observer variability in scoring segmental LVWM.
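
    A minimal sketch of the grading step appears below, using a back-propagation-trained multilayer perceptron from scikit-learn; the single fractional-area-change feature and the labeling rule are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(3)
    n = 600
    fac = rng.uniform(-0.2, 0.6, size=(n, 1))   # fractional area change (assumed)
    # Toy labeling: lower FAC -> worse grade (1 = normal ... 6 = dyskinesia).
    grade = np.clip(6 - np.digitize(fac[:, 0], [-0.1, 0.0, 0.1, 0.2, 0.3]), 1, 6)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
    clf.fit(fac[:500], grade[:500])
    print("held-out accuracy:", clf.score(fac[500:], grade[500:]))
    ```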

  11. StakeSource: harnessing the power of crowdsourcing and social networks in stakeholder analysis

    OpenAIRE

    Lim, S L; Quercia, D.; Finkelstein, A.

    2010-01-01

    Projects often fail because they overlook stakeholders. Unfortunately, existing stakeholder analysis tools only capture stakeholders' information, relying on experts to manually identify them. StakeSource is a web-based tool that automates stakeholder analysis. It "crowdsources" the stakeholders themselves for recommendations about other stakeholders and aggregates their answers using social network analysis.
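
    The aggregation idea can be sketched as follows (hypothetical recommendation pairs; PageRank is an illustrative choice of measure, not necessarily the one StakeSource uses): recommendations form a directed graph, and a centrality score ranks stakeholders by how strongly the crowd points to them.

    ```python
    import networkx as nx

    recommendations = [            # (recommender, recommended) pairs, assumed input
        ("ana", "ben"), ("carl", "ben"), ("dina", "ben"),
        ("ben", "eve"), ("carl", "eve"), ("eve", "ana"),
    ]

    g = nx.DiGraph(recommendations)
    for stakeholder, score in sorted(nx.pagerank(g).items(), key=lambda kv: -kv[1]):
        print(f"{stakeholder:5s} {score:.3f}")
    ```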

  12. Automated seismic event location by waveform coherence analysis

    OpenAIRE

    Grigoli, Francesco

    2014-01-01

    Automated location of seismic events is a very important task in microseismic monitoring operations as well as for local and regional seismic monitoring. Since microseismic records are generally characterised by low signal-to-noise ratio, such methods are required to be noise robust and sufficiently accurate. Most of the standard automated location routines are based on the automated picking, identification and association of the first arrivals of P and S waves and on the minimization of the re...

  13. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  14. The Development of the Routing Pattern of the Backbone Data Transmission Network for the Automation of the Krasnoyarsk Railway

    Directory of Open Access Journals (Sweden)

    Sergey Victorovich Makarov

    2016-06-01

    Full Text Available The paper deals with the data transmission network of the Krasnoyarsk Railway, its structure, the topology of data transmission and the routing protocol that supports its operation, as well as the specifics of data transmission networking. The combination of railway automation applications and the data transmission network makes up the automation systems, making it possible to improve performance, increase the freight traffic volume, and improve the quality of passenger service. The objective of this paper is to study the existing data transmission network of the Krasnoyarsk Railway and to develop ways of modernizing it, in order to improve the reliability of the network and of the automated systems that use it. It was found that the IS-IS and OSPF routing protocols have many differences, primarily because the IS-IS protocol does not use IP addresses as paths. On the one hand, this makes it possible to use the IS-IS protocol in IPv6 networks, whereas OSPF version 2 does not support IPv6; OSPF version 3 was developed to solve this problem. On the other hand, when IPv4 is used, configuring routing with the IS-IS protocol implies studying a significant volume of information and using unfamiliar configuration methods.

  15. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. This is especially true for aeolian dust deposits, which have a fairly narrow grain size range as a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Automatic image analysis with the Malvern Morphologi G3-ID is a new, rarely applied technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even million) individual particles were automatically recorded in this study from 15 loess and paleosoil samples from the captured high-resolution images. Several size (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy of the optical properties of the material. Intensity values are dependent on the chemical composition and/or thickness of the particles. The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments.

  16. Analysis of neural networks

    CERN Document Server

    Heiden, Uwe

    1980-01-01

    The purpose of this work is a unified and general treatment of activity in neural networks from a mathematical point of view. Possible applications of the theory presented are indicated throughout the text. However, they are not explored in detail for two reasons: first, the universal character of neural activity in nearly all animals requires some type of a general approach; secondly, the mathematical perspicuity would suffer if too many experimental details and empirical peculiarities were interspersed among the mathematical investigation. A guide to many applications is supplied by the references concerning a variety of specific issues. Of course the theory does not aim at covering all individual problems. Moreover there are other approaches to neural network theory (see e.g. Poggio-Torre, 1978) based on the different levels at which the nervous system may be viewed. The theory is a deterministic one reflecting the average behavior of neurons or neuron pools. In this respect the essay is writt...

  17. The space physics analysis network

    Science.gov (United States)

    Green, James L.

    1988-04-01

    The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for space and Earth scientists and has been operational for nearly 7 years. SPAN, and its extension into Europe, utilizes computer-to-computer communications allowing mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used to successfully transfer real-time data to remote researchers for rapid data analysis, but its primary function is for non-real-time applications. One of the major advantages of using SPAN is its spacecraft mission independence. Space science researchers using SPAN are located in universities, industries and government institutions all across the United States and Europe. These researchers work in such fields as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science data bases, mission planning and information systems, and computational facilities for the purposes of facilitating correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, provides facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN has interconnections with several national and international networks such as HEPNET and TEXNET, forming a transparent DECnet network. The combined total number of computers now reachable over these combined networks is about 2000. In addition, SPAN supports full function capabilities over the international public packet switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  18. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  19. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control type applications, as well as making its use more secure.

  1. Automated Identification of Core Regulatory Genes in Human Gene Regulatory Networks.

    Directory of Open Access Journals (Sweden)

    Vipin Narang

    Full Text Available Human gene regulatory networks (GRN) can be difficult to interpret due to a tangle of edges interconnecting thousands of genes. We constructed a general human GRN from extensive transcription factor and microRNA target data obtained from public databases. In a subnetwork of this GRN that is active during estrogen stimulation of MCF-7 breast cancer cells, we benchmarked automated algorithms for identifying core regulatory genes (transcription factors and microRNAs). Among these algorithms, we identified the K-core decomposition, pagerank and betweenness centrality algorithms as the most effective for discovering core regulatory genes in the network, evaluated on the basis of the previously known roles of these genes in MCF-7 biology as well as their ability to explain the up or down expression status of up to 70% of the remaining genes. Finally, we validated the use of the K-core algorithm for organizing the GRN in an easier to interpret layered hierarchy where more influential regulatory genes percolate towards the inner layers. The integrated human gene and miRNA network and software used in this study are provided as supplementary materials (S1 Data) accompanying this manuscript.
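
    As a point of reference, here is a minimal sketch (not the authors' pipeline) of the three ranking approaches the abstract singles out, applied with networkx to a toy directed regulatory network; the gene and miRNA names and edges below are hypothetical placeholders.

        import networkx as nx

        # Toy directed GRN; edges point from regulator to target (hypothetical).
        grn = nx.DiGraph([
            ("ESR1", "MYC"), ("ESR1", "GREB1"), ("ESR1", "miR-21"),
            ("MYC", "CCND1"), ("MYC", "GREB1"), ("miR-21", "PTEN"),
        ])

        # K-core decomposition (on the undirected view): a higher core number
        # means the gene sits deeper in the densely connected part of the network.
        core = nx.core_number(grn.to_undirected())

        # PageRank and betweenness centrality as alternative influence rankings.
        pr = nx.pagerank(grn)
        bc = nx.betweenness_centrality(grn)

        for gene in sorted(grn, key=pr.get, reverse=True):
            print(f"{gene:8s} core={core[gene]} pagerank={pr[gene]:.3f} betweenness={bc[gene]:.3f}")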

  2. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    OpenAIRE

    Răzvan Deaconescu; Marius Sandu-Popa; Adriana Drăghici; Nicolae Tăpus

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we...

  3. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange over up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved automatic key generation over 2 weeks with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  4. Neural Network Approach to Automated Condition Classification of a Check Valve by Acoustic Emission Signals

    International Nuclear Information System (INIS)

    This paper presents new techniques under development for monitoring the health and vibration of the active components in nuclear power plants. The purpose of this study is to develop an automated system for condition classification of a check valve, one of the components used extensively in the safety systems of nuclear power plants. Acoustic emission testing of a check valve under controlled flow loop conditions was performed to detect and evaluate disc movement for valve failures such as wear and leakage due to foreign object interference. It is clearly demonstrated that the evaluation of different failure types such as disc wear and check valve leakage was successful by systematically analyzing the characteristics of various AE parameters. It is also shown that the leak size can be determined with an artificial neural network.

  5. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design ... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  6. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design ... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  7. NEAT: an efficient network enrichment analysis test

    OpenAIRE

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-01-01

    Background: Network enrichment analysis is a powerful method which makes it possible to integrate gene enrichment analysis with the information on relationships between genes that is provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow, and are based on normality assumptions. Results: We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises ...

  8. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    ...networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can i... approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies.

  9. Traffic distribution and network capacity analysis in social opportunistic networks

    OpenAIRE

    Soelistijanto, B; Howarth, MP

    2012-01-01

    Social opportunistic networks are intermittently connected mobile ad hoc networks (ICNs) that exploit human mobility to physically carry messages between disconnected parts of the network. Human mobility thus plays an essential role in the performance of forwarding protocols in the networks, and people's movements are in turn affected by their social interactions with each other. In this paper we present an analysis of the traffic distribution among the nodes of social opportunistic networks ...

  10. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with a high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.

  11. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    40 CFR 13.19 (Protection of Environment), Analysis of costs; automation; prevention of overpayments, delinquencies or defaults: (a) The Administrator may...

  12. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  13. Spectral analysis for automated exploration and sample acquisition

    Science.gov (United States)

    Eberlein, Susan; Yates, Gigi

    1992-05-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.
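
    A schematic sketch of the idea, under assumed band indices and synthetic two-class spectra (the paper's actual mineral classes and band subsets are not reproduced here): each stage of a hierarchical classifier uses only a small subset of spectral bands, keeping the per-stage computation low.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        n_bands = 32
        bands = np.arange(n_bands)

        def toy_spectrum(center):
            # Synthetic spectrum: one Gaussian feature plus noise (illustrative only).
            return np.exp(-0.5 * ((bands - center) / 2.0) ** 2) + rng.normal(0, 0.05, n_bands)

        # Two hypothetical mineral classes with features at different wavelengths.
        X = np.array([toy_spectrum(c) for c in (8, 20) for _ in range(60)])
        y = np.array([0] * 60 + [1] * 60)

        # First stage of the hierarchy: a coarse split using only six bands.
        stage1_bands = [6, 8, 10, 18, 20, 22]
        clf = MLPClassifier(hidden_layer_sizes=(6,), max_iter=3000, random_state=0)
        clf.fit(X[:, stage1_bands], y)
        print("stage-1 training accuracy:", clf.score(X[:, stage1_bands], y))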

  14. Automated SIMS Isotopic Analysis Of Small Dust Particles

    Science.gov (United States)

    Nittler, L.; Alexander, C.; Gyngard, F.; Morgand, A.; Zinner, E. K.

    2009-12-01

    The isotopic compositions of sub-μm to μm sized dust grains are of increasing interest in cosmochemistry, nuclear forensics and terrestrial aerosol research. Because of its high sensitivity and spatial resolution, Secondary Ion Mass Spectrometry (SIMS) is the tool of choice for measuring isotopes in such small samples. Indeed, SIMS has enabled an entirely new sub-field of astronomy: presolar grains in meteorites. In recent years, the development of the Cameca NanoSIMS ion probe has extended the reach of isotopic measurements to particles as small as 100 nm in diameter, a regime where isotopic precision is strongly limited by the total number of atoms in the sample. Many applications require obtaining isotopic data on large numbers of particles, necessitating the development of automated techniques. One such method is isotopic imaging, wherein images of multiple isotopes are acquired, each containing multiple dispersed particles, and image processing is used to determine isotopic ratios for individual particles. This method is powerful, but relatively inefficient for raster-based imaging on the NanoSIMS. Modern computerized control of instrumentation has allowed for another approach, analogous to commercial automated SEM-EDS particle analysis systems, in which images are used solely to locate particles followed by fully automated grain-by-grain analysis. The first such system was developed on the Carnegie Institution’s Cameca ims-6f, and was used to generate large databases of presolar grains. We have recently developed a similar system for the NanoSIMS, whose high sensitivity allows for smaller grains to be analyzed with less sample consumption than is possible with the 6f system. The 6f and NanoSIMS systems are functionally identical: an image of dispersed grains is obtained with sufficient statistical precision for an algorithm to identify the positions of individual particles, the primary ion beam is deflected to each particle in turn and rastered in a small

  15. Application of automated image analysis to coal petrography

    Science.gov (United States)

    Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.

    1982-01-01

    The coal petrologist seeks to determine the petrographic characteristics of organic and inorganic coal constituents and their lateral and vertical variations within a single coal bed or different coal beds of a particular coal field. Definitive descriptions of coal characteristics and coal facies provide the basis for interpretation of depositional environments, diagenetic changes, and burial history and determination of the degree of coalification or metamorphism. Numerous coal core or columnar samples must be studied in detail in order to adequately describe and define coal microlithotypes, lithotypes, and lithologic facies and their variations. The large amount of petrographic information required can be obtained rapidly and quantitatively by use of an automated image-analysis system (AIAS). An AIAS can be used to generate quantitative megascopic and microscopic modal analyses for the lithologic units of an entire columnar section of a coal bed. In our scheme for megascopic analysis, distinctive bands 2 mm or more thick are first demarcated by visual inspection. These bands consist of either nearly pure microlithotypes or lithotypes such as vitrite/vitrain or fusite/fusain, or assemblages of microlithotypes. Megascopic analysis with the aid of the AIAS is next performed to determine volume percentages of vitrite, inertite, minerals, and microlithotype mixtures in bands 0.5 to 2 mm thick. The microlithotype mixtures are analyzed microscopically by use of the AIAS to determine their modal composition in terms of maceral and optically observable mineral components. Megascopic and microscopic data are combined to describe the coal unit quantitatively in terms of (V) for vitrite, (E) for liptite, (I) for inertite or fusite, (M) for mineral components other than iron sulfide, (S) for iron sulfide, and (VEIM) for the composition of the mixed phases (Xi) i = 1,2, etc. in terms of the maceral groups vitrinite V, exinite E, inertinite I, and optically observable mineral

  16. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...

  17. Automated absolute activation analysis with californium-252 sources

    Energy Technology Data Exchange (ETDEWEB)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg ²⁵²Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from ¹⁷O. Detection sensitivities of ≤ 400 ppb for natural uranium and 8 ppb (≤ 0.5 nCi/g) for ²³⁹Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.
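
    The conversion the abstract sketches follows the standard absolute activation equation; the helper below is a hedged, textbook-style illustration (all argument names and values are assumptions, not the Savannah River Laboratory code).

        import math

        AVOGADRO = 6.02214076e23

        def mass_fraction(peak_area, t_irr, t_decay, t_count, half_life,
                          capture_rate_per_atom, efficiency, gamma_intensity,
                          sample_mass_g, molar_mass):
            """Convert a photopeak area to an elemental mass fraction using
            saturation (S), decay (D) and counting (C) factors."""
            lam = math.log(2.0) / half_life
            S = 1.0 - math.exp(-lam * t_irr)            # saturation during irradiation
            D = math.exp(-lam * t_decay)                # decay before counting
            C = (1.0 - math.exp(-lam * t_count)) / lam  # decay during counting
            atoms = peak_area / (capture_rate_per_atom * S * D * C
                                 * efficiency * gamma_intensity)
            return atoms * molar_mass / AVOGADRO / sample_mass_g

        # Illustrative call with made-up numbers (times in seconds).
        print(mass_fraction(1.2e4, 600, 120, 300, 3600, 1e-4,
                            0.05, 0.9, 15.0, 56.0))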

  18. HATSouth: a global network of fully automated identical wide-field telescopes

    CERN Document Server

    Bakos, G Á; Penev, K; Bayliss, D; Jordán, A; Afonso, C; Hartman, J D; Henning, T; Kovács, G; Noyes, R W; Béky, B; Suc, V; Csák, B; Rabus, M; Lázár, J; Papp, I; Sári, P; Conroy, P; Zhou, G; Sackett, P D; Schmidt, B; Mancini, L; Sasselov, D D; Ueltzhoeffer, K

    2012-01-01

    HATSouth is the world's first network of automated and homogeneous telescopes that is capable of year-round 24-hour monitoring of positions over an entire hemisphere of the sky. The primary scientific goal of the network is to discover and characterize a large number of transiting extrasolar planets, reaching out to long periods and down to small planetary radii. HATSouth achieves this by monitoring extended areas on the sky, deriving high precision light curves for a large number of stars, searching for the signature of planetary transits, and confirming planetary candidates with larger telescopes. HATSouth employs 6 telescope units spread over 3 locations with large longitude separation in the southern hemisphere (Las Campanas Observatory, Chile; HESS site, Namibia; Siding Spring Observatory, Australia). Each of the HATSouth units holds four 0.18m diameter f/2.8 focal ratio telescope tubes on a common mount producing an 8.2x8.2 arcdeg field, imaged using four 4Kx4K CCD cameras and Sloan r filters, to give a...

  19. Automated Position System Implementation over Vehicular Ad Hoc Networks in 2-Dimension Space

    Directory of Open Access Journals (Sweden)

    Soumen Kanrar

    2010-01-01

    Full Text Available Problem statement: The real world scenario has changed from wired to wireless connections. Over the years software development has responded to the increasing growth of wireless connectivity by developing network enabled software. The problem arises in the wireless domain due to random packet loss in the transport layer, as well as in the data link layer, for end to end connections. The basic problem considered in this paper is to convert the real world scenario of a "vehicular ad hoc network" into a laboratory oriented problem by using the APS system, and to study the results to achieve better performance in the wireless domain. Approach: We map the real world physical problem into an analytical problem and simulate that analytical problem with respect to the real world scenario by an Automated Position System (APS) for an antenna mounted over a mobile node in 2D space. Results: We quantify the performance and the impact of packet loss and delay, by the bit error rate and throughput, with respect to the real world scenario of a VANET in the MAC layer, data link layer and transport layer. Conclusion: We observe that the directional antenna mounted over the vehicle gives a lower bit error rate in comparison with isotropic and discone antennas.

  20. MORPHY, a program for an automated "atoms in molecules" analysis

    Science.gov (United States)

    Popelier, Paul L. A.

    1996-02-01

    The operating manual for a structured FORTRAN 77 program called MORPHY is presented. This code performs an automated topological analysis of a molecular electron density and its Laplacian. The program is written in a stylistically homogeneous, transparent and modular manner. The input is compact but flexible and allows for multiple jobs in one deck. The output is detailed and has an attractive layout. Critical points in the charge density and its Laplacian can be located in a robust and economic way and are displayed via an external on-line visualisation package. The gradient vector field of the charge density can be traced with great accuracy; planar contour, relief and one-dimensional line plots of many scalar properties can be generated. Non-bonded radii are calculated and analytical expressions for interatomic surfaces are computed (with error estimates) and plotted. MORPHY is interfaced with the AIMPAC suite of programs. The capabilities of the program are illustrated with two test runs and five selected figures.
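
    As a toy illustration of what locating critical points means here (this is not MORPHY code), one can solve grad ρ = 0 for a model density built from two Gaussian "atoms"; by symmetry the solver lands on the bond critical point between them.

        import numpy as np
        from scipy.optimize import fsolve

        A = np.array([0.0, 0.0, 0.0])   # hypothetical nuclear positions (a.u.)
        B = np.array([1.4, 0.0, 0.0])

        def rho(r):
            # Model charge density: sum of two Gaussians centred on the "nuclei".
            return np.exp(-np.sum((r - A) ** 2)) + np.exp(-np.sum((r - B) ** 2))

        def grad_rho(r):
            return (-2.0 * (r - A) * np.exp(-np.sum((r - A) ** 2))
                    - 2.0 * (r - B) * np.exp(-np.sum((r - B) ** 2)))

        # Start the search slightly off the midpoint; the root is the bond CP.
        cp = fsolve(grad_rho, x0=np.array([0.6, 0.01, 0.0]))
        print("critical point:", cp, " rho =", rho(cp))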

  1. Transmission analysis in WDM networks

    DEFF Research Database (Denmark)

    Rasmussen, Christian Jørgen

    1999-01-01

    This thesis describes the development of a computer-based simulator for transmission analysis in optical wavelength division multiplexing networks. A great part of the work concerns fundamental optical network simulator issues. Among these issues are identification of the versatility and user-friendliness demands which such a simulator must meet, development of the "spectral window representation" for representing the optical signals, and finding an effective way of handling the optical signals in the computer memory. One more important issue is the rules for determining the order in which the different component models are invoked during the simulation of a system. A simple set of rules which makes it possible to simulate any network architecture is laid down. The modelling of the nonlinear fibre and the optical receiver is also treated. The work on the fibre concerns the numerical solution...

  2. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.
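
    A minimal sketch of the Delaunay step described above, using scipy on hypothetical source/sink coordinates (this is not SimCCS itself); the unique triangle edges become the candidate pipeline arcs whose costs a routing optimization would then weigh.

        import numpy as np
        from scipy.spatial import Delaunay

        # Hypothetical CO2 source and sink locations (projected coordinates, km).
        points = np.array([[0.0, 0.0], [40.0, 10.0], [20.0, 30.0],
                           [50.0, 40.0], [10.0, 50.0]])
        tri = Delaunay(points)

        # Collect the unique edges of the triangulation as candidate arcs.
        edges = set()
        for simplex in tri.simplices:
            for i in range(3):
                a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
                edges.add((a, b))

        for a, b in sorted(edges):
            print(f"candidate arc {a}-{b}: {np.linalg.norm(points[a] - points[b]):.1f} km")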

  3. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…

  4. Interobserver and Intraobserver Variability in pH-Impedance Analysis between 10 Experts and Automated Analysis

    DEFF Research Database (Denmark)

    Loots, Clara M; van Wijk, Michiel P; Blondeau, Kathleen;

    2011-01-01

    OBJECTIVE: To determine interobserver and intraobserver variability in pH-impedance interpretation between experts and the accuracy of automated analysis (AA). STUDY DESIGN: Ten pediatric 24-hour pH-impedance tracings were analyzed by 10 observers from 7 world groups and with AA. Detection of gastroe... CONCLUSION: Interobserver agreement in combined pH-multichannel intraluminal impedance analysis in experts is moderate; only 42% of GER episodes were detected by the majority of observers. Detection of total GER numbers is more consistent. Considering these poor outcomes, AA seems favorable compared...

  5. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    NARCIS (Netherlands)

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September, 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  6. Bittorrent Swarm Analysis Through Automation and Enhanced Logging

    Directory of Open Access Journals (Sweden)

    Răzvan Deaconescu

    2011-01-01

    Full Text Available Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we present a novel approach that aims to collect, process and analyse large amounts of local peer information in BitTorrent swarms. We classify the information as periodic status information able to be monitored in real time and as verbose logging information to be used for subsequent analysis. We have designed and implemented a retrieval, storage and presentation infrastructure that enables easy analysis of BitTorrent protocol internals. Our approach can be employed both as a comparison tool, as well as a measurement system of how network characteristics and protocol implementation influence the overall BitTorrent swarm performance. We base our approach on a framework that allows easy swarm creation and control for different BitTorrent clients. With the help of a virtualized infrastructure and a client-server software layer we are able to create, command and manage large sized BitTorrent swarms. The framework allows a user to run, schedule, start, stop clients within a swarm and collect information regarding their behavior.

  7. Automated Design and Analysis Tool for CEV Structural and TPS Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  8. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This...

  9. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change when the terms in the network are removed. Our preliminary results indicate that this approach provides good results on the semantic network analyzed in this paper.

  10. Automated Production Flow Line Failure Rate Mathematical Analysis with Probability Theory

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-12-01

    Full Text Available Automated lines have been widely used in industry, especially for mass production and product customization. The productivity of an automated line is a crucial indicator of the output and performance of production. Failure or breakdown of stations or mechanisms commonly occurs in automated lines under real conditions, due to technological and technical problems, and highly affects productivity. The failure rates of automated lines are usually not expressed or analysed in mathematical form. This paper presents a mathematical analysis, based on probability theory, of failure conditions in an automated line. The resulting mathematical expressions for failure rates can be used to produce and forecast the productivity output accurately.
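
    One hedged reading of the kind of expression meant, in code: the relation below is the common cycle-time-plus-expected-downtime form, with made-up numbers, and is not reproduced from the paper itself.

        def productivity(cycle_time, failure_rates, mean_repair_time):
            """Parts per unit time for a line whose stations fail at the given
            rates (failures per cycle) and take mean_repair_time to restore."""
            expected_downtime = sum(failure_rates) * mean_repair_time
            return 1.0 / (cycle_time + expected_downtime)

        # Three stations with different failure rates (illustrative values).
        print(productivity(cycle_time=0.5,
                           failure_rates=[0.010, 0.020, 0.015],
                           mean_repair_time=4.0))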

  11. Hydraulic Modeling: Pipe Network Analysis

    OpenAIRE

    Datwyler, Trevor T.

    2012-01-01

    Water modeling is becoming an increasingly important part of hydraulic engineering. One application of hydraulic modeling is pipe network analysis. Using programmed algorithms to repeatedly solve continuity and energy equations, computer software can greatly reduce the amount of time required to analyze a closed conduit system. Such hydraulic models can become a valuable tool for cities to maintain their water systems and plan for future growth. The Utah Division of Drinking Water regulations...
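
    To make the "repeatedly solve continuity and energy equations" step concrete, here is a hedged single-loop Hardy Cross iteration (a classic pipe-network algorithm; the abstract does not say which method the software in question uses, and the coefficients below are invented).

        def hardy_cross(K, Q, n=1.852, iters=50):
            """K: pipe resistance coefficients; Q: signed initial flow guesses
            around one loop (chosen to satisfy node continuity); n: Hazen-Williams
            exponent. Returns corrected loop flows."""
            for _ in range(iters):
                # Net head loss around the loop and its derivative w.r.t. flow.
                head = sum(k * abs(q) ** (n - 1) * q for k, q in zip(K, Q))
                dhead = sum(n * k * abs(q) ** (n - 1) for k, q in zip(K, Q))
                dq = -head / dhead          # loop flow correction
                Q = [q + dq for q in Q]
            return Q

        print(hardy_cross(K=[2.0, 5.0, 3.0], Q=[1.0, -0.5, 0.4]))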

  12. Automated Quality Control Methods for Sensor Data: A Novel Observatory Approach at the National Ecological Observatory Network

    Science.gov (United States)

    Taylor, J. R.; Loescher, H.

    2013-05-01

    National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing its first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 Tb of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize data range, persistence, and stochasticity for each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel way by calculating their relevant test parameter thresholds. Finally, the implementation of automated quality control is demonstrated with preliminary data from a NEON prototype site. Results focus on novel approaches that advance the quality control techniques historically employed in other networks (DOE-ARM, AmeriFlux, USDA ARS, OK Mesonet) to new state-of-the-art functionality. These automated and semi-automated approaches are also used to inform automated problem tracking to efficiently deploy field staff. Ultimately, NEON will define its own standards for QA/QC and maintenance by building upon these existing frameworks. The overarching philosophy relies on attaining the highest levels of accuracy, precision, and operational
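
    A minimal sketch of the three binary plausibility checks named above, with assumed thresholds (NEON derives the actual test parameters statistically, per sensor and site).

        import numpy as np

        def range_test(x, lo, hi):
            return (x < lo) | (x > hi)            # True marks an implausible value

        def persistence_test(x, window, min_change):
            # Flag stretches where the signal varies less than a working sensor should.
            flags = np.zeros(x.size, dtype=bool)
            for i in range(x.size - window + 1):
                w = x[i:i + window]
                if w.max() - w.min() < min_change:
                    flags[i:i + window] = True
            return flags

        def step_test(x, max_step):
            # Flag implausibly large jumps between consecutive samples.
            flags = np.zeros(x.size, dtype=bool)
            flags[1:] = np.abs(np.diff(x)) > max_step
            return flags

        x = np.array([10.1, 10.2, 10.2, 10.2, 10.2, 25.0, 10.4])
        print(range_test(x, -20.0, 20.0))
        print(persistence_test(x, window=4, min_change=0.05))
        print(step_test(x, max_step=5.0))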

  13. Automated analysis of eclipsing binary lightcurves. I. EBAS -- a new Eclipsing Binary Automated Solver with EBOP

    CERN Document Server

    Tamuz, O; North, P; Mazeh, Tsevi; North, Pierre; Tamuz, Omer

    2006-01-01

    We present a new algorithm -- Eclipsing Binary Automated Solver (EBAS), to analyse lightcurves of eclipsing binaries. The algorithm is designed to analyse large numbers of lightcurves, and is therefore based on the relatively fast EBOP code. To facilitate the search for the best solution, EBAS uses two parameter transformations. Instead of the radii of the two stellar components, EBAS uses the sum of radii and their ratio, while the inclination is transformed into the impact parameter. To replace human visual assessment, we introduce a new 'alarm' goodness-of-fit statistic that takes into account correlation between neighbouring residuals. We perform extensive tests and simulations that show that our algorithm converges well, finds a good set of parameters and provides reasonable error estimation.
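
    The reparameterization is straightforward to sketch; note that the impact-parameter convention below (cos i scaled by the sum of fractional radii) is our assumption for illustration, not necessarily the exact definition used by EBAS.

        import math

        def to_ebas_params(r1, r2, incl_deg):
            """Map (r1, r2, i) to (sum of radii, radius ratio, impact parameter)."""
            r_sum = r1 + r2
            k = r2 / r1
            b = math.cos(math.radians(incl_deg)) / r_sum   # assumed convention
            return r_sum, k, b

        def from_ebas_params(r_sum, k, b):
            r1 = r_sum / (1.0 + k)
            return r1, r_sum - r1, math.degrees(math.acos(b * r_sum))

        print(to_ebas_params(0.12, 0.08, 88.0))
        print(from_ebas_params(*to_ebas_params(0.12, 0.08, 88.0)))  # round-trip check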

  14. Networks and network analysis for defence and security

    CERN Document Server

    Masys, Anthony J

    2014-01-01

    Networks and Network Analysis for Defence and Security discusses relevant theoretical frameworks and applications of network analysis in support of the defence and security domains. This book details real world applications of network analysis to support defence and security. Shocks to regional, national and global systems stemming from natural hazards, acts of armed violence, terrorism and serious and organized crime have significant defence and security implications. Today, nations face an uncertain and complex security landscape in which threats impact/target the physical, social, economic

  15. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume Future Control and Automation- Volume 2 includes best papers from 2012 2nd International Conference on Future Control and Automation (ICFCA 2012) held on July 1-2, 2012, Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of manuscripts considered, which is listed as follows: Mathematical Modeling, Analysis and Computation, Control Engineering, Reliable Networks Design, Vehicular Communications and Networking, Automation and Mechatronics.

  16. Isochronous wireless network for real-time communication in industrial automation

    CERN Document Server

    Trsek, Henning

    2016-01-01

    This dissertation proposes and investigates an isochronous wireless network for industrial control applications with guaranteed latencies and jitter. Based on a requirements analysis of real industrial applications and the characterisation of the wireless channel, the solution approach is developed. It consists of a TDMA-based medium access control, a dynamic resource allocation and the provision of a global time base for the wired and the wireless network. Due to the global time base, the solution approach allows a seamless and synchronous integration into existing wired Real-time Ethernet systems.

  17. A simple and robust method for automated photometric classification of supernovae using neural networks

    CERN Document Server

    Karpenka, N V; Hobson, M P

    2012-01-01

    A method is presented for automated photometric classification of supernovae (SNe) as Type-Ia or non-Ia. A two-step approach is adopted in which: (i) the SN lightcurve flux measurements in each observing filter are fitted separately; and (ii) the fitted function parameters and their associated uncertainties, along with the number of flux measurements, the maximum-likelihood value of the fit and Bayesian evidence for the model, are used as the input feature vector to a classification neural network (NN) that outputs the probability that the SN under consideration is of Type-Ia. The method is trained and tested using data released following the SuperNova Photometric Classification Challenge (SNPCC). We consider several random divisions of the data into training and testing sets: for instance, for our sample D_1 (D_4), a total of 10% (40%) of the data are involved in training the algorithm and the remainder used for blind testing of the resulting classifier; we make no selection cuts. Assigning a canonical thres...
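
    A schematic of the two-step pipeline on synthetic lightcurves; sklearn stands in for the paper's neural network, and the rise/fall flux model is an assumed stand-in for the authors' fitting function.

        import numpy as np
        from scipy.optimize import curve_fit
        from sklearn.neural_network import MLPClassifier

        def lc_model(t, A, t0, tau_r, tau_f):
            # Phenomenological rise/fall lightcurve shape (assumed form).
            return A * np.exp(-(t - t0) / tau_f) / (1.0 + np.exp(-(t - t0) / tau_r))

        rng = np.random.default_rng(0)
        X, y = [], []
        for label in (0, 1):                       # 0 = non-Ia, 1 = Ia (synthetic)
            for _ in range(50):
                t = np.linspace(-20.0, 80.0, 40)
                flux = lc_model(t, 1.0, 0.0, 3.0 + label, 25.0 - 5.0 * label)
                flux += rng.normal(0.0, 0.02, t.size)
                # Step (i): fit each lightcurve; fitted parameters become features.
                p, _ = curve_fit(lc_model, t, flux, p0=(1.0, 0.0, 3.0, 20.0), maxfev=10000)
                X.append(p)
                y.append(label)

        # Step (ii): classify the feature vectors.
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        clf.fit(X, y)
        print("training accuracy:", clf.score(X, y))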

  18. Automated Digital Network System of the US Navy

    Institute of Scientific and Technical Information of China (English)

    陈劲尧; 许卡

    2016-01-01

    Taking the US Navy's Automated Digital Network System (ADNS) as its subject, this paper systematically analyses the role of ADNS in network-centric warfare, its development history, architecture, system composition, logical organization and key characteristics, gives a fairly comprehensive and detailed overview of the ADNS system, and summarizes its main advantages. Since the start of the program in 1997, the technology and capabilities of ADNS have been continuously enhanced, which demonstrates its advanced design concepts and sound architecture and makes it a valuable reference for the research and design of shipboard communication systems in China.

  19. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool developed to analyse the structural model of automated systems in order to identify redundant information, which is then utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  20. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    CERN Document Server

    Deaconescu, Răzvan; Drăghici, Adriana; Tăpus, Nicolae

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we present a novel approach that aims to collect, process and analyse large amounts of local peer information in BitTorrent swarms. We classify the information as periodic status information able to be monitored in real time and as verbose logging information to be used for subsequent analysis. We have designed and implemented a retrieval, storage and presentation infrastructure that enables easy analysis of BitTorrent protocol internals. Our approach can be employed both as a comparison tool, as well as a measurement syste...

  1. Kinetics analysis and automated online screening of aminocarbonylation of aryl halides in flow

    OpenAIRE

    Moore, Jason S.; Smith, Christopher D; Jensen, Klavs F.

    2016-01-01

    Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. It was observed that the reaction was divided into two temperature regimes. An automated system was used t...

  2. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    OpenAIRE

    Dr. Poonam Radadiya; Dr. Nandita Mehta; Dr. Hansa Goswami; Dr. R. N. Gonsai

    2015-01-01

    The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results an...

  3. Structural Analysis of Complex Networks

    CERN Document Server

    Dehmer, Matthias

    2011-01-01

    Filling a gap in literature, this self-contained book presents theoretical and application-oriented results that allow for a structural exploration of complex networks. The work focuses not only on classical graph-theoretic methods, but also demonstrates the usefulness of structural graph theory as a tool for solving interdisciplinary problems. Applications to biology, chemistry, linguistics, and data analysis are emphasized. The book is suitable for a broad, interdisciplinary readership of researchers, practitioners, and graduate students in discrete mathematics, statistics, computer science,

  4. Topological Analysis of Urban Drainage Networks

    Science.gov (United States)

    Yang, Soohyun; Paik, Kyungrock; McGrath, Gavan; Rao, Suresh

    2016-04-01

    Urban drainage networks are an essential component of infrastructure, comprising aggregations of underground pipe networks that carry storm water and domestic waste water for eventual discharge to natural stream networks. Growing urbanization has contributed to the rapid expansion of sewer networks, vastly increasing their complexity and scale. The importance of sewer networks has been well studied from an engineering perspective, including resilient management, optimal design, and the impact of malfunctioning. Yet analyses of urban drainage networks using complex-network approaches are lacking. Urban drainage networks consist of manholes and conduits, which correspond to nodes and edges, analogous to junctions and streams in river networks. Converging water flows in these two networks are driven by elevation gradients. In this sense, engineered urban drainage networks share several attributes of flows in river networks. These similarities between the two directed, converging flow networks serve as the basis for our hypothesis that the functional topology of sewer networks, like that of river networks, is scale-invariant. We analyzed the exceedance probability distribution of upstream area for practical sewer networks in South Korea. We found that the exceedance probability distributions of upstream area follow a power law, implying that the sewer networks exhibit topological self-similarity. The power-law exponents for the sewer networks were similar, and within the range reported from analysis of natural river networks. Thus, in line with our hypothesis, these results suggest that engineered urban drainage networks share functional topological attributes regardless of their structural dissimilarity or different underlying network evolution processes (natural vs. engineered). Implications of these findings for the optimal design of sewer networks and for modeling sewer flows will be discussed.
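
    The exceedance-probability analysis is easy to sketch; the synthetic "upstream areas" below stand in for real sewer data, and the notation P(A >= a) ~ a^(-e) is an assumed convention.

        import numpy as np

        rng = np.random.default_rng(42)
        areas = np.sort(rng.pareto(0.45, 5000) + 1.0)        # synthetic upstream areas
        exceed = 1.0 - np.arange(areas.size) / areas.size    # empirical P(A >= a)

        # Fit the slope of the log-log tail to estimate the power-law exponent.
        tail = areas > np.percentile(areas, 50)
        slope, _ = np.polyfit(np.log(areas[tail]), np.log(exceed[tail]), 1)
        print(f"estimated power-law exponent: {abs(slope):.2f}")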

  5. Quantization of polyphenolic compounds in histological sections of grape berries by automated color image analysis

    Science.gov (United States)

    Clement, Alain; Vigouroux, Bertnand

    2003-04-01

    We present new results in applied color image analysis that demonstrate the significant influence of soil on the localization and appearance of polyphenols in grapes. These results have been obtained with a new unsupervised classification algorithm founded on hierarchical analysis of color histograms. The process is automated thanks to a software platform we developed specifically for color image analysis and its applications.

  6. A novel automated image analysis method for accurate adipocyte quantification

    OpenAIRE

    Osman, Osman S.; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J.; O’Dowd, Jacqueline F; Cawthorne, Michael A.; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable to large scale investigations. Moreover, existing computational methods are not fully automated. We, therefore, developed a novel automatic method to provide accurate measurements of the cross-section...

  7. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  8. Alert management for home healthcare based on home automation analysis.

    Science.gov (United States)

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be contained by offering people autonomy at home by means of information technology. In this paper, we present an original, sensorless alert management solution which performs multimedia and home automation service discrimination and extracts highly regular home activities to serve as sensors for alert management. Results on simulated data, based on a real context, allow us to evaluate our approach before applying it to real data.

  9. Object Type Recognition for Automated Analysis of Protein Subcellular Location

    OpenAIRE

    Zhao, Ting; Velliste, Meel; Boland, Michael V.; Murphy, Robert F.

    2005-01-01

    The new field of location proteomics seeks to provide a comprehensive, objective characterization of the subcellular locations of all proteins expressed in a given cell type. Previous work has demonstrated that automated classifiers can recognize the patterns of all major subcellular organelles and structures in fluorescence microscope images with high accuracy. However, since some proteins may be present in more than one organelle, this paper addresses a more difficult task: recognizing a pa...

  10. Analysis of Defense Language Institute automated student questionnaire data

    OpenAIRE

    Strycharz, Theodore M.

    1996-01-01

    This thesis explores the dimensionality of the Defense Language Institute's (DLI) primary student feedback tool, the Automated Student Questionnaire (ASQ). In addition, a data set from ASQ 2.0 (the newest version) is analyzed for trends in student satisfaction across the sub-scales of sex, pay grade, and Defense Language Proficiency Test (DLPT) results. The method of principal components is used to derive initial factors. Although an interpretation of those factors seems plausible, these are...

  11. Methods of automated cell analysis and their application in radiation biology

    International Nuclear Information System (INIS)

    The present review is concerned with methods for the automated analysis of biological microobjects and covers the two groups into which all automated analysis systems can be divided: flow systems (flow cytometry) and scanning systems (image analysis systems). Particular emphasis is placed on their use in radiobiological studies, namely in the micronucleus test, a cytogenetic assay commonly used at present for monitoring the clastogenic action of ionizing radiation. Examples of the use of the described methods and actual setups in other biomedical research are given. An analysis of the advantages and disadvantages of automated cell analysis methods makes it possible to choose between flow and scanning systems for a particular research application.

  12. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  13. Google matrix analysis of directed networks

    CERN Document Server

    Ermann, Leonardo; Shepelyansky, Dima L

    2014-01-01

    In the past ten years, modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for society. Due to the rapid growth of the World Wide Web and of social and communication networks, new mathematical methods have been invented to characterize the properties of these networks on a more detailed and precise level. Various search engines essentially use such methods. It is highly important to develop new tools to classify and rank the enormous amount of network information in a way adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks, demonstrating its efficiency on various examples including World Wide Web, Wikipedia, software architecture, world trade, social and citation networks, brain neural networks, DNA sequences and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chain...

  14. Google matrix analysis of directed networks

    Science.gov (United States)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
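
    To make the construction concrete, the sketch below builds the Google matrix G = alpha*S + (1 - alpha)/N for a toy directed network, where S is the column-stochastic link matrix with dangling nodes replaced by uniform columns, and extracts the PageRank vector by power iteration. The adjacency matrix and the damping factor alpha = 0.85 are conventional illustrative choices, not data from the review.

      import numpy as np

      def google_matrix(adj, alpha=0.85):
          """G = alpha*S + (1 - alpha)/N; S is column-stochastic, dangling
          nodes (no outgoing links) get uniform columns."""
          A = np.asarray(adj, dtype=float).T  # column j holds outgoing links of node j
          n = A.shape[0]
          out = A.sum(axis=0)
          S = np.where(out > 0, A / np.where(out > 0, out, 1.0), 1.0 / n)
          return alpha * S + (1.0 - alpha) / n

      def pagerank(G, tol=1e-10):
          """Power iteration for the leading eigenvector of G (eigenvalue 1)."""
          p = np.full(G.shape[0], 1.0 / G.shape[0])
          while True:
              p_next = G @ p
              if np.abs(p_next - p).sum() < tol:
                  return p_next
              p = p_next

      # Toy directed network: adj[i][j] = 1 means a link from node i to node j
      adj = [[0, 1, 1, 0],
             [0, 0, 1, 0],
             [1, 0, 0, 0],
             [0, 0, 1, 0]]
      print(np.round(pagerank(google_matrix(adj)), 3))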

  15. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis, with a strong focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis, such as influence detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization, are also presented. Many studies are validated through real social networks such as Twitter. This edit

  16. Network analysis literacy a practical approach to the analysis of networks

    CERN Document Server

    Zweig, Katharina A

    2014-01-01

    Network Analysis Literacy focuses on design principles for network analytics projects. The text enables readers to: pose a defined network analytic question; build a network to answer the question; choose or design the right network analytic methods for a particular purpose, and more.

  17. ONLINE SOCIAL NETWORK INTERNETWORKING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Bassant E. Youssef

    2014-10-01

    Full Text Available Online social networks (OSNs) contain data about users, their relations, interests and daily activities, and the great value of this data results in the ever-growing popularity of OSNs. There are two types of OSN data, semantic and topological. Both can be used to support decision-making processes in many applications such as information diffusion, viral marketing and epidemiology. Online social network analysis (OSNA) research is used to maximize the benefits gained from OSN data. This paper provides a comprehensive study of OSNs and OSNA to provide analysts with the knowledge needed to analyse OSNs. OSN internetworking was found to increase the wealth of the analysed data by depending on more than one OSN as the source of the analysed data. The paper proposes a generic model of an OSN internetworking system that an analyst can rely on. Two different data sources in OSNs were identified in our effort to provide a thorough study of OSNs: OSN user data and OSN platform data. Additionally, we propose a classification of OSN user data according to its analysis models for different data types, to shed some light on the currently used OSNA methodologies. We also highlight the different metrics and parameters that analysts can use to evaluate semantic or topological OSN user data. Further, we present a classification of the other data types and OSN platform data that can be used to compare the capabilities of different OSNs, whether separate or in an OSN internetworking system. To increase analysts' awareness of the available tools they can use, we overview some of the currently publicly available OSN datasets and simulation tools and identify whether they are capable of being used in semantic OSNA, topological OSNA, or both. The overview identifies that only few datasets include both data types (semantic and topological) and that there are few analysis tools that can perform analysis on both data types. Finally, the paper presents a scenario that

  18. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q), or production rate, is one of the important indicators used by industrial engineers to improve a system and its finished-goods output in a production or assembly line. Mathematical and statistical analysis methods need to be applied to the productivity rate to give a visual overview of the failure factors and of possible further improvements within the production line, especially for automated flow lines, which are complicated. A mathematical model of the productivity rate of a linear-arrangement, serial-structure automated flow line with different failure rates and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The variety of failure rates that cause loss of productivity, and the bottleneck machining time, are shown specifically in mathematical figures, and a sustainable solution for productivity improvement of this final-assembly automated flow line is presented.
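
    The abstract does not reproduce the underlying model, so the sketch below uses a common textbook form for serial-line productivity, offered purely as an illustrative assumption: the reciprocal of the bottleneck cycle time, inflated by the expected downtime that each station's failure rate and mean repair time contribute.

      def productivity_rate(t_bottleneck, t_aux, failure_rates, repair_times):
          """Assumed serial-line model:
          Q = 1 / ((t_bottleneck + t_aux) * (1 + sum(lambda_i * t_repair_i)))."""
          downtime = 1.0 + sum(l * r for l, r in zip(failure_rates, repair_times))
          return 1.0 / ((t_bottleneck + t_aux) * downtime)

      # Hypothetical line: bottleneck machining 0.8 min, auxiliary time 0.2 min,
      # four stations with per-cycle failure rates and mean repair times (min)
      q = productivity_rate(0.8, 0.2, [0.01, 0.02, 0.005, 0.01], [5, 3, 8, 4])
      print(f"productivity: {q:.3f} parts/min")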

  19. NOA: a novel Network Ontology Analysis method

    OpenAIRE

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

    Gene ontology analysis has become a popular and important tool in bioinformatics studies, and current ontology analyses are mainly conducted on individual genes or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) meth...

  20. Privacy Analysis in Mobile Social Networks

    DEFF Research Database (Denmark)

    Sapuppo, Antonio

    2012-01-01

    Nowadays, mobile social networks are capable of promoting social networking benefits during physical meetings, in order to leverage interpersonal affinities not only among acquaintances, but also between strangers. Due to their foundation on automated sharing of personal data in the physical surroundings of the user, these networks are subject to crucial privacy threats. Privacy management systems must be capable of accurate selection of data disclosure according to human data sensitivity evaluation. Therefore, it is crucial to research and comprehend an individual's personal information disclosure decisions happening in ordinary human communication. Consequently, in this paper we provide insight into influential factors of human data disclosure decisions, by presenting and analysing results of an empirical investigation comprising two online surveys. We focus on the following influential...

  1. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    Science.gov (United States)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

    OpenStreetMap (OSM) is the largest spatial database of the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality, however they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison. This makes it hard to replicate and extend these methods. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Provided that the user is familiar with the authoritative dataset used, he can adjust the values of the parameters involved thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both a high completeness and spatial accuracy. It has a greater length than the IGN road network, and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. Also, the results confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.
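
    The GRASS modules themselves are not listed in the abstract, but the spatial-accuracy idea they implement, grading an OSM network by the share of its length that falls within a buffer of the reference network, can be sketched with shapely on toy geometries; the 5 m buffer echoes the accuracy class mentioned above, and the coordinates are invented.

      from shapely.geometry import LineString
      from shapely.ops import unary_union

      def buffer_overlap(osm_lines, ref_lines, buffer_m=5.0):
          """Fraction of OSM road length lying within buffer_m of the
          reference network -- a simple spatial-accuracy proxy."""
          ref_zone = unary_union([ln.buffer(buffer_m) for ln in ref_lines])
          osm = unary_union(osm_lines)
          return osm.intersection(ref_zone).length / osm.length

      # Toy example in projected metre coordinates
      osm = [LineString([(0, 0), (100, 2)]), LineString([(100, 2), (200, 30)])]
      ref = [LineString([(0, 0), (100, 0)]), LineString([(100, 0), (200, 0)])]
      print(f"share of OSM length within 5 m of reference: {buffer_overlap(osm, ref):.2%}")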

  2. Automated Live Forensics Analysis for Volatile Data Acquisition

    Directory of Open Access Journals (Sweden)

    Bharath B,

    2015-03-01

    Full Text Available The increase in sophisticated attacks on computers requires the assistance of live forensics to uncover evidence, since traditional forensic methods do not collect volatile data. Volatile data can ease the difficulty of an investigation; in fact, it can provide the investigator with rich information towards solving a case. Here we try to eliminate the complexity involved in the normal process by automating acquisition and analysis, at the same time providing integrity for the evidence data through Python scripting.
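
    The paper's scripts are not included in the abstract; the fragment below only sketches the general approach under stated assumptions: snapshot one class of volatile data (the running process list) with the third-party psutil package and seal the capture with a SHA-256 digest to support evidence integrity.

      import hashlib
      import json
      import time

      import psutil  # third-party; assumed available on the examined host

      def capture_processes():
          """Snapshot volatile process data and hash it for evidence integrity."""
          snapshot = {
              "captured_at": time.time(),
              "processes": [p.info for p in psutil.process_iter(["pid", "name", "username"])],
          }
          blob = json.dumps(snapshot, sort_keys=True, default=str).encode()
          return snapshot, hashlib.sha256(blob).hexdigest()

      data, digest = capture_processes()
      print(f"{len(data['processes'])} processes captured, SHA-256 {digest[:16]}...")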

  3. Application of quantum dots as analytical tools in automated chemical analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, Joao A.C.; Prior, Joao A.V.; Marques, Karine L. [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► Review of quantum dots applications in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to take advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches, and the future outlook of QD applications in chemical analysis.

  4. Network stratification analysis for identifying function-specific network layers.

    Science.gov (United States)

    Zhang, Chuanchao; Wang, Jiguang; Zhang, Chao; Liu, Juan; Xu, Dong; Chen, Luonan

    2016-04-22

    A major challenge of systems biology is to capture the rewiring of biological functions (e.g. signaling pathways) in a molecular network. To address this problem, we proposed a novel computational framework, namely network stratification analysis (NetSA), to stratify the whole biological network into various function-specific network layers corresponding to particular functions (e.g. KEGG pathways), which transforms network analysis from the gene level to the functional level by integrating expression data, the gene/protein network and gene ontology information altogether. The application of NetSA in yeast and its comparison with a traditional network partition both suggest that NetSA can more effectively reveal functional implications of network rewiring and extract significant phenotype-related biological processes. Furthermore, for time-series or stage-wise data, the function-specific network layers obtained by NetSA are also shown to be able to characterize disease progression in a dynamic manner. In particular, when applying NetSA to hepatocellular carcinoma and type 1 diabetes, we can derive functional spectra regarding the progression of the disease, and capture active biological functions (i.e. active pathways) in different disease stages. The additional comparison between NetSA and SPIA illustrates again that NetSA can discover more complete biological functions during disease progression. Overall, NetSA provides a general framework to stratify a network into various layers of function-specific sub-networks, which can not only analyze a biological network on the functional level but also investigate gene rewiring patterns in biological processes. PMID:26879865

  5. Sharing Feelings Online: Studying Emotional Well-Being via Automated Text Analysis of Facebook Posts

    Directory of Open Access Journals (Sweden)

    Michele Settanni

    2015-07-01

    Full Text Available Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users’ Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-reported emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed.

  6. Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts.

    Science.gov (United States)

    Settanni, Michele; Marengo, Davide

    2015-01-01

    Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety, and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users' Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-reported emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed.

  7. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    Science.gov (United States)

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. PMID:25827436

  8. Statistical Analysis of Bus Networks in India

    CERN Document Server

    Chatterjee, Atanu; Ramadurai, Gitakrishnan

    2015-01-01

    Over the past decade the field of network science has established itself as a common ground for the cross-fertilization of exciting interdisciplinary studies, motivating researchers to model almost every physical system as an interacting network consisting of nodes and links. Although public transport networks such as airline and railway networks have been extensively studied, the status of bus networks still remains in obscurity. In developing countries like India, where bus networks play an important role in day-to-day commuting, it is of significant interest to analyze their topological structure and answer some of the basic questions on their evolution, growth, robustness and resiliency. In this paper, we model the bus networks of major Indian cities as graphs in L-space, and evaluate their various statistical properties using concepts from network science. Our analysis reveals a wide spectrum of network topology with the common underlying feature of the small-world property. We observe tha...
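
    Although the record is truncated, the L-space construction it refers to is standard: consecutive stops of each route are linked, and small-world indicators such as clustering and average path length follow directly. The sketch below applies networkx to invented routes.

      import networkx as nx

      def l_space(routes):
          """Build the L-space graph: consecutive stops on a route are linked."""
          g = nx.Graph()
          for stops in routes:
              nx.add_path(g, stops)
          return g

      # Hypothetical bus routes given as ordered stop lists
      routes = [["A", "B", "C", "D"], ["C", "E", "F"], ["B", "E", "D", "G"], ["A", "F", "G"]]
      g = l_space(routes)
      print("average clustering:", round(nx.average_clustering(g), 3))
      print("average shortest path:", round(nx.average_shortest_path_length(g), 3))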

  9. An automated system for whole microscopic image acquisition and analysis.

    Science.gov (United States)

    Bueno, Gloria; Déniz, Oscar; Fernández-Carrobles, María Del Milagro; Vállez, Noelia; Salido, Jesús

    2014-09-01

    The field of anatomic pathology has experienced major changes over the last decade. Virtual microscopy (VM) systems have allowed experts in pathology and other biomedical areas to work in a safer and more collaborative way. VMs are automated systems capable of digitizing microscopic samples that were traditionally examined one by one. The possibility of having digital copies reduces the risk of damaging original samples, and also makes it easier to distribute copies among other pathologists. This article describes the development of an automated high-resolution whole slide imaging (WSI) system tailored to the needs and problems encountered in digital imaging for pathology, from hardware control to the full digitization of samples. The system has been built with an additional monochrome digital camera alongside the default color camera and LED transmitted illumination (RGB). Monochrome cameras are the preferred method of acquisition for fluorescence microscopy. The system is able to digitize correctly and form large high-resolution microscope images for both brightfield and fluorescence. The quality of the digital images has been quantified using three metrics based on sharpness, contrast and focus. It has been evaluated on 150 tissue samples of brain autopsies, prostate biopsies and lung cytologies, at five magnifications: 2.5×, 10×, 20×, 40×, and 63×. The article focuses on the hardware set-up and the acquisition software, although results of the implemented image processing techniques included in the software, applied to the different tissue samples, are also presented.
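
    The article's exact sharpness, contrast and focus metrics are not given in the abstract; a widely used focus score of the same family, shown here purely as an illustration, is the variance of the image Laplacian, which drops sharply for defocused fields.

      import numpy as np
      from scipy import ndimage

      def laplacian_sharpness(image):
          """Variance of the Laplacian: higher values indicate a sharper,
          better-focused field (a common focus metric, not the article's own)."""
          return ndimage.laplace(image.astype(float)).var()

      rng = np.random.default_rng(0)
      sharp = rng.random((256, 256))                     # high-frequency content
      blurred = ndimage.gaussian_filter(sharp, sigma=3)  # simulated defocus
      print(f"sharp: {laplacian_sharpness(sharp):.4f}  blurred: {laplacian_sharpness(blurred):.6f}")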

  10. Analysis by fracture network modelling

    International Nuclear Information System (INIS)

    This report describes the Fracture Network Modelling and Performance Assessment Support performed by Golder Associates Inc. during the Heisei-11 (1999-2000) fiscal year. The primary objective of the Golder Associates work scope during HY-11 was to provide theoretical and review support to the JNC HY-12 performance assessment effort. In addition, Golder Associates provided technical support to JNC for the Aespoe Project. Major efforts for performance assessment support included analysis of PAWorks pathways and software documentation, verification, and performance assessment visualization. Support for the Aespoe project included 'Task 4' predictive modelling of sorbing tracer transport in the TRUE-1 rock block, and integrated hydrogeological and geochemical modelling of Aespoe island for 'Task 5'. Technical information about Golder Associates' HY-11 support to JNC is provided in the appendices to this report. (author)

  11. Multilayer motif analysis of brain networks

    OpenAIRE

    Battiston, Federico; Nicosia, Vincenzo; Chavez, Mario; Latora, Vito

    2016-01-01

    In the last decade network science has shed new light on the anatomical connectivity and on correlations in the activity of different areas of the human brain. The study of brain networks has in fact made it possible to detect the central areas of a neural system, and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on structural and functional networks as separate entities. The recently ...

  12. Analysis of the Features of Network Words

    Institute of Scientific and Technical Information of China (English)

    阳艳萍

    2015-01-01

    The information society is gradually making people's lives enter a digital state, and the popularity of the Internet has led to the unique phenomenon of network words. What impact will the combination of the network and language bring about? This article explores the relation between the phenomenon of network words and their social context from the angle of sociolinguistics, through an analysis of network words and their grammatical features.

  13. Object-oriented database design for the contaminant analysis automation project

    International Nuclear Information System (INIS)

    The Contaminant Analysis Automation project's automated soil analysis laboratory uses an Object-Oriented database for storage of runtime and archive information. Data which is generated by the processing of a sample, and is relevant for verification of the specifics of that process, is retained in the database. The database also contains intermediate results which are generated by one step of the process and used for decision making by later steps. The description of this database reveals design considerations of the objects used to model the behavior of the chemical laboratory and its components

  14. Social network analysis and supply chain management

    Directory of Open Access Journals (Sweden)

    Raúl Rodríguez Rodríguez

    2016-01-01

    Full Text Available This paper deals with social network analysis and how it can be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and results of social network analysis - mainly information and knowledge flows and derived results - with supply chain management objectives and goals. This paper gives an overview of social network analysis, the main social network analysis metrics, and supply chain performance and, finally, it identifies how future frameworks could close the gap and link the results of social network analysis with supply chain management decision-making processes.
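
    The metrics such a framework would draw on are standard; the sketch below computes degree, betweenness and closeness centrality with networkx for an invented supply-chain collaboration network.

      import networkx as nx

      # Hypothetical collaboration ties between supply-chain actors
      g = nx.Graph([("supplier", "plant"), ("plant", "warehouse"),
                    ("warehouse", "retailer"), ("plant", "retailer"),
                    ("supplier", "carrier"), ("carrier", "warehouse")])

      for name, metric in [("degree", nx.degree_centrality(g)),
                           ("betweenness", nx.betweenness_centrality(g)),
                           ("closeness", nx.closeness_centrality(g))]:
          top = max(metric, key=metric.get)
          print(f"{name:12s} top actor: {top} ({metric[top]:.2f})")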

  15. Cross-Disciplinary Detection and Analysis of Network Motifs

    OpenAIRE

    Ngoc Tam L. Tran; Luke DeLuccia; McDonald, Aidan F; Chun-Hsi Huang

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs from undirected and directed networks of several different disciplines, including biological, social, and ecological networks, as well as other networks such as airline, power grid, and co-purchase of political books networks. Our analysis revealed that undirected networks are similar at the basic three and four nod...

  16. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and they are, in general, time- and labour-intensive, prone to human error, and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
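
    AngioTool's implementation is not reproduced here, but its "branching index" can be approximated in a few lines: skeletonize the binary vessel mask and count skeleton pixels with three or more neighbours, per unit of image area. The sketch assumes scikit-image and scipy are available and uses a toy mask.

      import numpy as np
      from scipy import ndimage
      from skimage.morphology import skeletonize

      def branching_index(vessel_mask):
          """Approximate branching index: skeleton pixels with >= 3 neighbours
          per unit area (vessel_mask is a boolean vessel image)."""
          skel = skeletonize(vessel_mask)
          neighbours = ndimage.convolve(skel.astype(int), np.ones((3, 3), int),
                                        mode="constant") - skel
          branch_points = np.logical_and(skel, neighbours >= 3).sum()
          return branch_points / vessel_mask.size

      mask = np.zeros((64, 64), dtype=bool)
      mask[32, 8:56] = True   # horizontal vessel
      mask[8:56, 32] = True   # crossing vertical vessel -> one branch region
      print(f"branching index: {branching_index(mask):.5f}")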

  17. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows the accumulation reaction to be identified and quantified without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  18. 3D Assembly Group Analysis for Cognitive Automation

    Directory of Open Access Journals (Sweden)

    Christian Brecher

    2012-01-01

    Full Text Available A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This information is used to determine the current state of the assembly group and is compared to a CAD model for validation purposes. In order to efficiently resolve erroneous situations, the results are interactively accessible to a human expert. The implications for an industrial application are demonstrated by transferring the developed concepts to an assembly scenario for switch-cabinet systems.

  1. AUTOMATION OF MORPHOMETRIC MEASUREMENTS FOR PLANETARY SURFACE ANALYSIS AND CARTOGRAPHY

    Directory of Open Access Journals (Sweden)

    A. A. Kokhanov

    2016-06-01

    Full Text Available For the automation of measurements of morphometric parameters of surface relief, various tools were developed and integrated into GIS. We have created a tool which calculates statistical characteristics of the surface: the interquartile ranges of heights and slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placement of topographic profiles through the geometric center of a crater. Another tool was developed for the calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.

  2. Automation of Morphometric Measurements for Planetary Surface Analysis and Cartography

    Science.gov (United States)

    Kokhanov, A. A.; Bystrov, A. Y.; Kreslavsky, M. A.; Matveev, E. V.; Karachevtseva, I. P.

    2016-06-01

    For the automation of measurements of morphometric parameters of surface relief, various tools were developed and integrated into GIS. We have created a tool which calculates statistical characteristics of the surface: the interquartile ranges of heights and slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placement of topographic profiles through the geometric center of a crater. Another tool was developed for the calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.
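
    The statistics these tools compute, interquartile ranges of heights and of slopes as roughness measures, are straightforward to reproduce with numpy on a synthetic DTM; the grid, cell size and random-walk terrain below are assumptions for illustration.

      import numpy as np

      def roughness_stats(dtm, cell_size=1.0):
          """Interquartile range of heights and of slopes for a DTM window."""
          iqr_height = np.subtract(*np.percentile(dtm, [75, 25]))
          gy, gx = np.gradient(dtm.astype(float), cell_size)
          slope = np.degrees(np.arctan(np.hypot(gx, gy)))
          iqr_slope = np.subtract(*np.percentile(slope, [75, 25]))
          return iqr_height, iqr_slope

      rng = np.random.default_rng(1)
      dtm = np.cumsum(rng.normal(size=(64, 64)), axis=0)  # synthetic rough terrain
      h, s = roughness_stats(dtm, cell_size=10.0)
      print(f"IQR of heights: {h:.2f} m   IQR of slopes: {s:.2f} deg")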

  3. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  4. Research Prototype: Automated Analysis of Scientific and Engineering Semantics

    Science.gov (United States)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time-tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. The problem is further motivated by the pervasive use of scientific code and high code development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units and physical and mathematical quantities). The procedure also implements programming-language semantics to propagate these properties through the code. This prototype's ability to elucidate code and detect errors will be demonstrated with state-of-the-art scientific codes.

  5. An investigation and comparison on network performance analysis

    OpenAIRE

    2012-01-01

    This thesis is generally about network performance analysis. It contains two parts. The theory part summarizes what network performance is and introduces methods for performing network performance analysis. To answer what network performance is, a study of network services is carried out. Based on that background research, two important network performance metrics, network delay and throughput, should be included in network performance analysis. Among the methods of network a...

  6. Automated reduction and interpretation of multidimensional mass spectra for analysis of complex peptide mixtures

    Science.gov (United States)

    Gambin, Anna; Dutkowski, Janusz; Karczmarski, Jakub; Kluge, Boguslaw; Kowalczyk, Krzysztof; Ostrowski, Jerzy; Poznanski, Jaroslaw; Tiuryn, Jerzy; Bakun, Magda; Dadlez, Michal

    2007-01-01

    Here we develop a fully automated procedure for the analysis of liquid chromatography-mass spectrometry (LC-MS) datasets collected during the analysis of complex peptide mixtures. We present the underlying algorithm and outcomes of several experiments justifying its applicability. The novelty of our approach is to exploit the multidimensional character of the datasets. It is common knowledge that highly complex peptide mixtures can be analyzed by liquid chromatography coupled with mass spectrometry, but we are not aware of any existing automated MS spectra interpretation procedure designed to take into account the multidimensional character of the data. Our work fills this gap by providing an effective algorithm for this task, allowing for automated conversion of raw data to the list of masses of peptides.

  7. Automated analysis for scintigraphic evaluation of gastric emptying using invariant moments.

    Science.gov (United States)

    Abutaleb, A; Delalic, Z J; Ech, R; Siegel, J A

    1989-01-01

    This study introduces a method for automated analysis of the standard solid-meal gastric emptying test. The purpose was to develop a diagnostic tool to characterize abnormalities of solid-phase gastric emptying more reproducibly. The processing of gastric emptying is automated using geometrical moments that are invariant to scaling, rotation, and shift. Twenty subjects were studied. The first step was to obtain images of the stomach using a nuclear gamma camera immediately after the subject had eaten a radiolabeled meal. The second step was to process and analyze the images by a recently developed automated gastric emptying analysis (AGEA) method, which determines the gastric contour and geometrical properties, including such parameters as area, centroid, orientation, and moments of inertia. Statistical tests showed that some of the moments were sensitive to the patient's gastric status (normal versus abnormal). The difference between the normal and abnormal patients became noticeable approximately 1 h after meal ingestion. PMID:18230536
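
    The AGEA method relies on geometric moments invariant to scaling, rotation and shift; one standard way to obtain such features, shown below with plain numpy and not taken from the paper itself, is to compute Hu invariants from normalized central moments of the image.

      import numpy as np

      def hu_moments_12(image):
          """First two Hu invariants from normalized central moments."""
          img = image.astype(float)
          y, x = np.mgrid[:img.shape[0], :img.shape[1]]
          m00 = img.sum()
          xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

          def eta(p, q):  # normalized central moment, invariant to scale and shift
              mu = ((x - xc) ** p * (y - yc) ** q * img).sum()
              return mu / m00 ** (1 + (p + q) / 2)

          h1 = eta(2, 0) + eta(0, 2)
          h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
          return h1, h2  # also invariant to rotation

      # A bright region: the invariants barely change if it is shifted or rotated
      img = np.zeros((64, 64))
      img[20:40, 25:45] = 1.0
      print(hu_moments_12(img))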

  8. 4th International Conference in Network Analysis

    CERN Document Server

    Koldanov, Petr; Pardalos, Panos

    2016-01-01

    The contributions in this volume cover a broad range of topics including maximum cliques, graph coloring, data mining, brain networks, Steiner forest, logistic and supply chain networks. Network algorithms and their applications to market graphs, manufacturing problems, internet networks and social networks are highlighted. The "Fourth International Conference in Network Analysis," held at the Higher School of Economics, Nizhny Novgorod in May 2014, initiated joint research between scientists, engineers and researchers from academia, industry and government; the major results of conference participants have been reviewed and collected in this Work. Researchers and students in mathematics, economics, statistics, computer science and engineering will find this collection a valuable resource filled with the latest research in network analysis.

  9. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis.

    Science.gov (United States)

    Garrison, Kathleen A; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J; Aziz-Zadeh, Lisa S

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant's structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant's non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design.

  10. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.

  11. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  12. Physical explosion analysis in heat exchanger network design

    Science.gov (United States)

    Pasha, M.; Zaini, D.; Shariff, A. M.

    2016-06-01

    The failure of shell and tube heat exchangers is extensively experienced by the chemical process industries. Such failure can cause a loss of production for a long duration. Moreover, loss of containment in a heat exchanger could potentially lead to a credible event such as fire, explosion or toxic release. There is a need to analyse the possible worst-case effects originating from loss of containment of the heat exchanger at the early design stage. A physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through object linking and embedding (OLE) automation for this analysis, with Aspen HYSYS V8.0 used as the simulation platform. A typical heat exchanger network of a steam reforming and shift conversion process is presented as a case study. This analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated more precisely using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.

  13. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is

  14. Automated analysis of small animal PET studies through deformable registration to an atlas

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Daniel F. [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Zaidi, Habib [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University of Groningen, Department of Nuclear Medicine and Molecular Imaging, University Medical Center Groningen, Groningen (Netherlands)

    2012-11-15

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is therefore suitable for automated atlas-guided analysis of small animal PET data.
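
    The two agreement metrics named in the abstract are straightforward to compute once a reference mask and a transformed segmentation are in hand. The following minimal numpy/scipy sketch uses toy 2-D masks, not the Digimouse data:

```python
# Dice overlap and symmetric Hausdorff distance between two binary masks.
# Toy 2-D masks stand in for the atlas and the registered segmentation.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_coefficient(a, b):
    """Dice = 2|A intersect B| / (|A| + |B|) for boolean masks a, b."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

ref = np.zeros((64, 64), bool); ref[10:40, 10:40] = True   # "atlas" organ
seg = np.zeros((64, 64), bool); seg[15:45, 12:42] = True   # registered organ

# Symmetric Hausdorff distance over the voxel coordinates of each mask.
pts_ref, pts_seg = np.argwhere(ref), np.argwhere(seg)
hd = max(directed_hausdorff(pts_ref, pts_seg)[0],
         directed_hausdorff(pts_seg, pts_ref)[0])

print(f"Dice = {dice_coefficient(ref, seg):.3f}, Hausdorff = {hd:.1f} voxels")
```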

  15. An Integrated Solution for both Monitoring and Controlling for Automization Using Wireless Sensor Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    M Gnana Seelan

    2013-02-01

    Full Text Available Temperature monitoring plays a major role in controlling temperature according to varied conditions. This process is common in all critical areas such as data centres, server rooms, grid rooms and other rooms equipped for data communication. Imparting such a process is mandatory for every organization/industry, as most of the critical data reside in the data centre, whose network infrastructure involves various electronic, electrical and mechanical devices for data transmission. These devices depend heavily on environmental factors such as temperature, moisture and humidity, and they also emit heat in the form of thermal energy when operating. To remove this heat, server/data centre rooms are equipped with multiple (distributed) air-conditioning (AC) systems that provide a cooling environment and maintain the temperature level of the room. The proposed paper is a study of the automation of monitoring and controlling temperature as per desired requirements with a WSN network.
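
    At its core, the monitoring-and-control loop described above is threshold (hysteresis) control over distributed sensor readings. A minimal sketch, with hypothetical node IDs, setpoint and deadband:

```python
# Minimal sketch of hysteresis control of AC units driven by distributed
# temperature sensor readings; node IDs and setpoints are hypothetical.

SETPOINT, DEADBAND = 24.0, 1.0   # degrees Celsius

def control_ac(readings, ac_on):
    """Turn AC on above SETPOINT+DEADBAND, off below SETPOINT-DEADBAND."""
    hottest = max(readings.values())
    if hottest > SETPOINT + DEADBAND:
        return True
    if hottest < SETPOINT - DEADBAND:
        return False
    return ac_on                  # inside the deadband: keep current state

readings = {"node-1": 23.8, "node-2": 25.6, "node-3": 24.2}
print(control_ac(readings, ac_on=False))   # True: node-2 is too hot
```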

  16. Automating Mid- and Long-Range Scheduling for the NASA Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel

    2012-01-01

    NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S(sup 3). This system was designed and deployed as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users of the DSN. These users represent not only NASA's deep space missions, but also international partners and ground-based science and calibration users. The initial implementation of S(sup 3) is complete and the system has been operational since July 2011. This paper describes some key aspects of the S(sup 3) system, the challenges of modeling complex scheduling requirements, and the ongoing extension of S(sup 3) to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.

  17. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Full text: Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (the first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means of identifying which infants could benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data which are time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, thus demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
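
    The detection idea, computing the power of a discretized continuous wavelet transform and thresholding it, can be sketched in a few lines. The following self-contained illustration uses a synthetic noisy trace with injected spikes, not the authors' code; the Ricker wavelet, the widths and the threshold are assumptions:

```python
# Discretized CWT spike detection on a synthetic "EEG" trace.
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_power(signal, widths):
    """Convolve with Ricker wavelets at several widths; return power."""
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        wav = ricker(min(10 * int(w), len(signal)), w)
        out[i] = np.convolve(signal, wav, mode="same")
    return out ** 2

# Noise plus two sharp spikes at known positions.
rng = np.random.default_rng(0)
eeg = rng.normal(0, 0.1, 2000)
eeg[500] += 4.0; eeg[1400] += 4.0

power = cwt_power(eeg, widths=np.arange(2, 16))
spikes = np.where(power.max(axis=0) > 0.25 * power.max())[0]
print(spikes)   # indices clustered around the injected spikes (500, 1400)
```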

  18. Scanning probe image wizard: A toolbox for automated scanning probe microscopy data analysis

    Science.gov (United States)

    Stirling, Julian; Woolley, Richard A. J.; Moriarty, Philip

    2013-11-01

    We describe SPIW (scanning probe image wizard), a new image processing toolbox for SPM (scanning probe microscope) images. SPIW can be used to automate many aspects of SPM data analysis, even for images with surface contamination and step edges present. Specialised routines are available for images with atomic or molecular resolution to improve image visualisation and generate statistical data on surface structure.

  19. Chapter 2: Predicting Newcomer Integration in Online Knowledge Communities by Automated Dialog Analysis

    NARCIS (Netherlands)

    Nistor, Nicolae; Dascalu, Mihai; Stavarache, Lucia; Tarnai, Christian; Trausan-Matu, Stefan

    2016-01-01

    Nistor, N., Dascalu, M., Stavarache, L.L., Tarnai, C., & Trausan-Matu, S. (2015). Predicting Newcomer Integration in Online Knowledge Communities by Automated Dialog Analysis. In Y. Li, M. Chang, M. Kravcik, E. Popescu, R. Huang, Kinshuk & N.-S. Chen (Eds.), State-of-the-Art and Future Directions of

  20. Miniaturized Mass-Spectrometry-Based Analysis System for Fully Automated Examination of Conditioned Cell Culture Media

    NARCIS (Netherlands)

    Weber, E.; Pinkse, M.W.H.; Bener-Aksam, E.; Vellekoop, M.J.; Verhaert, P.D.E.M.

    2012-01-01

    We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells in different culture conditions. The developed system is compatible with M

  1. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.;

    2000-01-01

    of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping...

  2. Development of a novel and automated fluorescent immunoassay for the analysis of beta-lactam antibiotics

    NARCIS (Netherlands)

    Benito-Pena, E.; Moreno-Bondi, M.C.; Orellana, G.; Maquieira, K.; Amerongen, van A.

    2005-01-01

    An automated immunosensor for the rapid and sensitive analysis of penicillin type -lactam antibiotics has been developed and optimized. An immunogen was prepared by coupling the common structure of the penicillanic -lactam antibiotics, i.e., 6-aminopenicillanic acid to keyhole limpet hemocyanin. Pol

  3. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack mount the data handling equipment offers a number of benefits such as control of equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable ''cart'' and use of a telephone link to a large computer center.

  4. An Analysis of Intelligent Automation Demands in Taiwanese Firms

    Directory of Open Access Journals (Sweden)

    Ying-Mei Tai

    2016-03-01

    Full Text Available To accurately elucidate the production deployment, process intelligent automation (IA), and production bottlenecks of Taiwanese companies, as well as the status of IA application, this research conducted a structured questionnaire survey of the participants in the IA promotion activities arranged by the Industrial Development Bureau, Ministry of Economic Affairs. A total of 35 valid questionnaires were recovered. Research findings indicated that the majority of participants were large-scale enterprises. These enterprises anticipated adding production bases in Taiwan and China to transform and upgrade their operations or strengthen their influence in the domestic market. The reported degrees of process IA and production bottlenecks were relatively low, which was associated with a tendency toward small volumes of diversified products. The majority of sub-categories of hardware equipment and simulation technologies have reached maturity, and the effective application of these technologies can enhance production efficiency. Intelligent software technologies remain immature and need further development and application. More importantly, they can meet customer values and create new business models, thereby serving the purpose of sustainable development.

  5. Filtering Genes for Cluster and Network Analysis

    Directory of Open Access Journals (Sweden)

    Parkhomenko Elena

    2009-06-01

    Full Text Available Abstract Background Prior to cluster analysis or genetic network analysis it is customary to filter, or remove genes considered to be irrelevant from the set of genes to be analyzed. Often genes whose variation across samples is less than an arbitrary threshold value are deleted. This can improve interpretability and reduce bias. Results This paper introduces modular models for representing network structure in order to study the relative effects of different filtering methods. We show that cluster analysis and principal components are strongly affected by filtering. Filtering methods intended specifically for cluster and network analysis are introduced and compared by simulating modular networks with known statistical properties. To study more realistic situations, we analyze simulated "real" data based on well-characterized E. coli and S. cerevisiae regulatory networks. Conclusion The methods introduced apply very generally, to any similarity matrix describing gene expression. One of the proposed methods, SUMCOV, performed well for all models simulated.
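
    As a concrete illustration of the baseline filtering step the abstract refers to, removing genes whose variation across samples falls below a threshold, here is a minimal numpy sketch on synthetic data (the proposed SUMCOV method itself is not reproduced here):

```python
# Variance-based gene filtering on a synthetic expression matrix.
import numpy as np

def filter_genes(expr, keep_fraction=0.25):
    """Keep the rows (genes) with the highest variance across samples.

    expr: (genes x samples) expression matrix.
    Returns the indices of the retained genes, highest variance first.
    """
    variances = expr.var(axis=1)
    k = max(1, int(keep_fraction * expr.shape[0]))
    return np.argsort(variances)[::-1][:k]

rng = np.random.default_rng(1)
expr = rng.normal(size=(1000, 20))
expr[:50] *= 5.0                    # 50 genes with inflated variance
kept = filter_genes(expr)
print(kept[:10], len(kept))         # high-variance genes dominate the top
```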

  6. Detailed interrogation of trypanosome cell biology via differential organelle staining and automated image analysis

    Directory of Open Access Journals (Sweden)

    Wheeler Richard J

    2012-01-01

    Full Text Available Abstract Background Many trypanosomatid protozoa are important human or animal pathogens. The well defined morphology and precisely choreographed division of trypanosomatid cells makes morphological analysis a powerful tool for analyzing the effect of mutations, chemical insults and changes between lifecycle stages. High-throughput image analysis of micrographs has the potential to accelerate collection of quantitative morphological data. Trypanosomatid cells have two large DNA-containing organelles, the kinetoplast (mitochondrial DNA) and nucleus, which provide useful markers for morphometric analysis; however they need to be accurately identified and often lie in close proximity. This presents a technical challenge. Accurate identification and quantitation of the DNA content of these organelles is a central requirement of any automated analysis method. Results We have developed a technique based on double staining of the DNA with a minor groove binding stain (4',6-diamidino-2-phenylindole, DAPI) and a base pair intercalating stain (propidium iodide, PI, or SYBR green) and color deconvolution. This allows the identification of kinetoplast and nuclear DNA in the micrograph based on whether the organelle has DNA with a more A-T or G-C rich composition. Following unambiguous identification of the kinetoplasts and nuclei the resulting images are amenable to quantitative automated analysis of kinetoplast and nucleus number and DNA content. On this foundation we have developed a demonstrative analysis tool capable of measuring kinetoplast and nucleus DNA content, size and position and cell body shape, length and width automatically. Conclusions Our approach to DNA staining and automated quantitative analysis of trypanosomatid morphology accelerated analysis of trypanosomatid protozoa. We have validated this approach using Leishmania mexicana, Crithidia fasciculata and wild-type and mutant Trypanosoma brucei. Automated analysis of T. brucei

  7. Automated interpretation of ventilation-perfusion lung scintigrams for the diagnosis of pulmonary embolism using artificial neural networks

    International Nuclear Information System (INIS)

    The purpose of this study was to develop a completely automated method for the interpretation of ventilation-perfusion (V-P) lung scintigrams used in the diagnosis of pulmonary embolism. An artificial neural network was trained for the diagnosis of pulmonary embolism using 18 automatically obtained features from each set of V-P scintigrams. The techniques used to process the images included their alignment to templates, the construction of quotient images based on the ventilation and perfusion images, and the calculation of measures describing V-P mismatches in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. Images that could not be properly aligned to the templates were detected and excluded automatically. After exclusion of those V-P scintigrams not properly aligned to the templates, 478 V-P scintigrams remained in a training group of consecutive patients with suspected pulmonary embolism, and a further 87 V-P scintigrams formed a separate test group comprising patients who had undergone pulmonary angiography. The performance of the neural network, measured as the area under the receiver operating characteristic curve, was 0.87 (95% confidence limits 0.82-0.92) in the training group and 0.79 (0.69-0.88) in the test group. It is concluded that a completely automated method can be used for the interpretation of V-P scintigrams. The performance of this method is similar to others previously presented, whereby features were extracted manually. (orig.)

  8. Automated interpretation of ventilation-perfusion lung scintigrams for the diagnosis of pulmonary embolism using artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Holst, H.; Jaerund, A.; Traegil, K.; Evander, E.; Edenbrandt, L. [Department of Clinical Physiology, Lund University, Lund (Sweden); Aastroem, K.; Heyden, A.; Kahl, F.; Sparr, G. [Department of Mathematics, Lund Institute of Technology, Lund (Sweden); Palmer, J. [Department of Radiation Physics, Lund University, Lund (Sweden)

    2000-04-01

    The purpose of this study was to develop a completely automated method for the interpretation of ventilation-perfusion (V-P) lung scintigrams used in the diagnosis of pulmonary embolism. An artificial neural network was trained for the diagnosis of pulmonary embolism using 18 automatically obtained features from each set of V-P scintigrams. The techniques used to process the images included their alignment to templates, the construction of quotient images based on the ventilation and perfusion images, and the calculation of measures describing V-P mismatches in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. Images that could not be properly aligned to the templates were detected and excluded automatically. After exclusion of those V-P scintigrams not properly aligned to the templates, 478 V-P scintigrams remained in a training group of consecutive patients with suspected pulmonary embolism, and a further 87 V-P scintigrams formed a separate test group comprising patients who had undergone pulmonary angiography. The performance of the neural network, measured as the area under the receiver operating characteristic curve, was 0.87 (95% confidence limits 0.82-0.92) in the training group and 0.79 (0.69-0.88) in the test group. It is concluded that a completely automated method can be used for the interpretation of V-P scintigrams. The performance of this method is similar to others previously presented, whereby features were extracted manually. (orig.)
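
    The evaluation scheme described above, training a network on per-study feature vectors and scoring it by the area under the ROC curve, can be sketched with scikit-learn. Everything below is synthetic stand-in data (18 features and 478 studies, mirroring the abstract), not the clinical datasets:

```python
# Train a small neural network on feature vectors and score it by ROC AUC.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(478, 18))                 # stand-in V-P mismatch features
y = (X[:, :3].sum(axis=1) + rng.normal(0, 1, 478) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, net.predict_proba(X_te)[:, 1]))
```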

  9. Automated Inference System for End-To-End Diagnosis of Network Performance Issues in Client-Terminal Devices

    Directory of Open Access Journals (Sweden)

    Chathuranga Widanapathirana

    2012-06-01

    Full Text Available Traditional network diagnosis methods for Client-Terminal Device (CTD) problems tend to be labor-intensive and time-consuming, and contribute to increased customer dissatisfaction. In this paper, we propose an automated solution for rapidly diagnosing the root causes of network performance issues in CTDs. Based on a new intelligent inference technique, we create the Intelligent Automated Client Diagnostic (IACD) system, which relies only on the collection of Transmission Control Protocol (TCP) packet traces. Using soft-margin Support Vector Machine (SVM) classifiers, the system (i) distinguishes link problems from client problems and (ii) identifies characteristics unique to the specific fault to report the root cause. The modular design of the system enables support for new access link and fault types. Experimental evaluation demonstrated the capability of the IACD system to distinguish between faulty and healthy links and to diagnose client faults with 98% accuracy. The system can perform fault diagnosis independent of the user’s specific TCP implementation, enabling diagnosis of a diverse range of client devices.
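
    The classification stage, a soft-margin SVM over features derived from TCP packet traces, can be sketched with scikit-learn as follows. The feature columns and data are hypothetical placeholders, and the parameter C controls the softness of the margin:

```python
# Soft-margin SVM separating "link fault" from "client fault" samples.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
# Hypothetical trace features: retransmission rate, RTT variance,
# duplicate-ACK ratio, etc.
X = rng.normal(size=(400, 6))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.5, 400) > 0).astype(int)

clf = make_pipeline(StandardScaler(),
                    SVC(C=1.0, kernel="rbf"))   # C sets the soft margin
clf.fit(X[:300], y[:300])
print("accuracy:", clf.score(X[300:], y[300:]))
```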

  10. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    Directory of Open Access Journals (Sweden)

    Dr.Poonam Radadiya

    2015-01-01

    Full Text Available The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found to be abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results and to make a presumptive diagnosis. In this article we took 100 samples for a comparative study of RBC histograms obtained by an automated haematology analyzer against peripheral blood smear review. This article discusses some morphological features of dimorphism and the ensuing characteristic changes in their RBC histograms.

  11. Trend Analysis on the Automation of the Notebook PC Production Process

    Directory of Open Access Journals (Sweden)

    Chin-Ching Yeh

    2012-09-01

    Full Text Available Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Taiwan's output accounted for more than 90% of the roughly 200 million notebook PCs in global markets in 2011, meaning that nine out of every ten notebook PCs in the world are manufactured by Taiwanese companies. For an industry with such output value and volume, the degree of automation in production processes is not high. This means either that there is still great room for automating the notebook PC production process, or that the degree of automation of notebook production cannot easily be enhanced. This paper presents an analysis of the situation.

  12. Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks

    Science.gov (United States)

    Johnson, Joseph

    2016-03-01

    We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called ``metanumbers'') support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on converting scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a ``supernet'' of all numerical information supporting new initiatives in AI.
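
    A minimal sketch of the clustering step, under the assumption of a symmetric proximity matrix: build a generator whose columns sum to zero (so that exp(tL) is a continuous Markov transformation) and split the nodes by the sign pattern of the eigenvector belonging to the second-largest eigenvalue, in the style of spectral clustering. The proximity values are hypothetical:

```python
# Proximity matrix -> Markov generator -> eigenvector-based clustering.
import numpy as np

def markov_generator(c):
    """Off-diagonals from the proximity matrix; diagonal makes columns sum to 0."""
    l = c.copy().astype(float)
    np.fill_diagonal(l, 0.0)
    np.fill_diagonal(l, -l.sum(axis=0))
    return l

# Two weakly linked 4-node clusters (hypothetical proximity values).
c = np.zeros((8, 8))
c[:4, :4] = 1.0; c[4:, 4:] = 1.0; c[0, 4] = c[4, 0] = 0.1
np.fill_diagonal(c, 0.0)

gen = markov_generator(c)
vals, vecs = np.linalg.eigh(gen)          # generator is symmetric here
fiedler = vecs[:, np.argsort(vals)[-2]]   # eigenvector of 2nd-largest eigenvalue
print((fiedler > 0).astype(int))          # sign pattern splits the two clusters
```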

  13. Direct evidence of intra- and interhemispheric corticomotor network degeneration in amyotrophic lateral sclerosis: an automated MRI structural connectivity study.

    Science.gov (United States)

    Rose, Stephen; Pannek, Kerstin; Bell, Christopher; Baumann, Fusun; Hutchinson, Nicole; Coulthard, Alan; McCombe, Pamela; Henderson, Robert

    2012-02-01

    Although the pathogenesis of amyotrophic lateral sclerosis (ALS) is uncertain, there is mounting neuroimaging evidence to suggest a mechanism involving the degeneration of multiple white matter (WM) motor and extramotor neural networks. This insight has been achieved, in part, by using MRI Diffusion Tensor Imaging (DTI) and the voxelwise analysis of anisotropy indices, along with DTI tractography to determine which specific motor pathways are involved with ALS pathology. Automated MRI structural connectivity analyses, which probe WM connections linking various functionally discrete cortical regions, have the potential to provide novel information about degenerative processes within multiple white matter (WM) pathways. Our hypothesis is that measures of altered intra- and interhemispheric structural connectivity of the primary motor and somatosensory cortex will provide an improved assessment of corticomotor involvement in ALS. To test this hypothesis, we acquired High Angular Resolution Diffusion Imaging (HARDI) scans along with high resolution structural images (sMRI) on 15 patients with clinical evidence of upper and lower motor neuron involvement, and 20 matched control participants. Whole brain probabilistic tractography was applied to define specific WM pathways connecting discrete corticomotor targets generated from anatomical parcellation of sMRI of the brain. The integrity of these connections was interrogated by comparing the mean fractional anisotropy (FA) derived for each WM pathway. To assist in the interpretation of results, we measured the reproducibility of the FA summary measures over time (6 months) in control participants. We also incorporated into our analysis pipeline the evaluation and replacement of outlier voxels due to head motion and physiological noise. When assessing corticomotor connectivity, we found a significant reduction in mean FA within a number of intra- and interhemispheric motor pathways in ALS patients. The abnormal
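
    For reference, the fractional anisotropy summarized per pathway above is a closed-form function of the three diffusion tensor eigenvalues. A minimal numpy sketch with illustrative eigenvalues, not the study's data:

```python
# FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||
import numpy as np

def fractional_anisotropy(evals):
    """FA from the three eigenvalues of a diffusion tensor."""
    evals = np.asarray(evals, float)
    md = evals.mean()                              # mean diffusivity
    num = np.sqrt(((evals - md) ** 2).sum())
    den = np.sqrt((evals ** 2).sum())
    return np.sqrt(1.5) * num / den

print(fractional_anisotropy([1.7e-3, 0.3e-3, 0.3e-3]))  # ~0.8, coherent WM
print(fractional_anisotropy([1.0e-3, 1.0e-3, 1.0e-3]))  # 0.0, isotropic
```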

  14. Social network analysis and supply chain management

    OpenAIRE

    Raúl Rodríguez Rodríguez

    2016-01-01

    This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and obtained results of social network analysis – mainly information and knowledge flows and derived results – with supply chain manag...

  15. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Science.gov (United States)

    2010-07-01

    ... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically...

  16. Comparison of semi-automated image analysis and manual methods for tissue quantification in pancreatic carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Sims, A.J. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom)]. E-mail: a.j.sims@newcastle.ac.uk; Murray, A. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom); Bennett, M.K. [Department of Histopathology, Newcastle upon Tyne Hospitals NHS Trust, Newcastle upon Tyne (United Kingdom)

    2002-04-21

    Objective measurements of tissue area during histological examination of carcinoma can yield valuable prognostic information. However, such measurements are not made routinely because the current manual approach is time consuming and subject to large statistical sampling error. In this paper, a semi-automated image analysis method for measuring tissue area in histological samples is applied to the measurement of stromal tissue, cell cytoplasm and lumen in samples of pancreatic carcinoma and compared with the standard manual point counting method. Histological samples from 26 cases of pancreatic carcinoma were stained using the sirius red, light-green method. Images from each sample were captured using two magnifications. Image segmentation based on colour cluster analysis was used to subdivide each image into representative colours which were classified manually into one of three tissue components. Area measurements made using this technique were compared to corresponding manual measurements and used to establish the comparative accuracy of the semi-automated image analysis technique, with a quality assurance study to measure the repeatability of the new technique. For both magnifications and for each tissue component, the quality assurance study showed that the semi-automated image analysis algorithm had better repeatability than its manual equivalent. No significant bias was detected between the measurement techniques for any of the comparisons made using the 26 cases of pancreatic carcinoma. The ratio of manual to semi-automatic repeatability errors varied from 2.0 to 3.6. Point counting would need to be increased to be between 400 and 1400 points to achieve the same repeatability as for the semi-automated technique. The results demonstrate that semi-automated image analysis is suitable for measuring tissue fractions in histological samples prepared with coloured stains and is a practical alternative to manual point counting. (author)
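
    The colour-cluster step described above can be approximated with k-means over pixel colours followed by a manual mapping of clusters to tissue classes. The sketch below uses a synthetic image with assumed stain colours, not the study's slides:

```python
# K-means colour clustering of a stained section into tissue classes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
h, w = 60, 60
img = np.zeros((h, w, 3))
img[:, :20] = [0.8, 0.2, 0.2]       # sirius-red-like stroma (assumed colour)
img[:, 20:40] = [0.3, 0.7, 0.3]     # light-green-like cytoplasm
img[:, 40:] = [0.95, 0.95, 0.95]    # lumen / background
img += rng.normal(0, 0.05, img.shape)

# Cluster every pixel colour into 3 representative colours.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    img.reshape(-1, 3))
areas = np.bincount(labels) / labels.size
print("tissue area fractions:", areas)   # ~1/3 each by construction
```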

  17. Comparison of semi-automated image analysis and manual methods for tissue quantification in pancreatic carcinoma

    International Nuclear Information System (INIS)

    Objective measurements of tissue area during histological examination of carcinoma can yield valuable prognostic information. However, such measurements are not made routinely because the current manual approach is time consuming and subject to large statistical sampling error. In this paper, a semi-automated image analysis method for measuring tissue area in histological samples is applied to the measurement of stromal tissue, cell cytoplasm and lumen in samples of pancreatic carcinoma and compared with the standard manual point counting method. Histological samples from 26 cases of pancreatic carcinoma were stained using the sirius red, light-green method. Images from each sample were captured using two magnifications. Image segmentation based on colour cluster analysis was used to subdivide each image into representative colours which were classified manually into one of three tissue components. Area measurements made using this technique were compared to corresponding manual measurements and used to establish the comparative accuracy of the semi-automated image analysis technique, with a quality assurance study to measure the repeatability of the new technique. For both magnifications and for each tissue component, the quality assurance study showed that the semi-automated image analysis algorithm had better repeatability than its manual equivalent. No significant bias was detected between the measurement techniques for any of the comparisons made using the 26 cases of pancreatic carcinoma. The ratio of manual to semi-automatic repeatability errors varied from 2.0 to 3.6. Point counting would need to be increased to be between 400 and 1400 points to achieve the same repeatability as for the semi-automated technique. The results demonstrate that semi-automated image analysis is suitable for measuring tissue fractions in histological samples prepared with coloured stains and is a practical alternative to manual point counting. (author)

  18. Economic Perspectives on Automated Demand Responsive Transportation and Shared Taxi Services - Analytical models and simulations for policy analysis

    OpenAIRE

    Jokinen, Jani-Pekka

    2016-01-01

    The automated demand responsive transportation (DRT) and modern shared taxi services provide shared trips for passengers, adapting dynamically to trip requests by routing a fleet of vehicles operating without any fixed routes or schedules. Compared with traditional public transportation, these new services provide trips without transfers and free passengers from the necessity of using timetables and maps of route networks. Furthermore, automated DRT applies real-time traffic information in vehicle routing...

  19. Automating case reports for the analysis of digital evidence

    OpenAIRE

    Cassidy, Regis H. Friend

    2005-01-01

    The reporting process during computer analysis is critical in the practice of digital forensics. Case reports are used to review the process and results of an investigation and serve multiple purposes. The investigator may refer to these reports to monitor the progress of his analysis throughout the investigation. When acting as an expert witness, the investigator will refer to organized documentation to recall past analysis. A lot of time can elapse between the analysis and the actual testimony.

  20. Network analysis of Zentralblatt MATH data

    OpenAIRE

    Cerinšek, Monika; Batagelj, Vladimir

    2014-01-01

    We analyze the data about works (papers, books) from the time period 1990-2010 that are collected in Zentralblatt MATH database. The data were converted into four 2-mode networks (works $\\times$ authors, works $\\times$ journals, works $\\times$ keywords and works $\\times$ MSCs) and into a partition of works by publication year. The networks were analyzed using Pajek -- a program for analysis and visualization of large networks. We explore the distributions of some properties of works and the c...

  1. Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding.

    Science.gov (United States)

    Cohn, J F; Zlochower, A J; Lien, J; Kanade, T

    1999-01-01

    The face is a rich source of information about human behavior. Available methods for coding facial displays, however, are human-observer dependent, labor intensive, and difficult to standardize. To enable rigorous and efficient quantitative measurement of facial displays, we have developed an automated method of facial display analysis. In this report, we compare the results of this automated system with those of manual FACS (Facial Action Coding System, Ekman & Friesen, 1978a) coding. One hundred university students were videotaped while performing a series of facial displays. The image sequences were coded from videotape by certified FACS coders. Fifteen action units and action unit combinations that occurred a minimum of 25 times were selected for automated analysis. Facial features were automatically tracked in digitized image sequences using a hierarchical algorithm for estimating optical flow. The measurements were normalized for variation in position, orientation, and scale. The image sequences were randomly divided into a training set and a cross-validation set, and discriminant function analyses were conducted on the feature point measurements. In the training set, average agreement with manual FACS coding was 92% or higher for action units in the brow, eye, and mouth regions. In the cross-validation set, average agreement was 91%, 88%, and 81% for action units in the brow, eye, and mouth regions, respectively. Automated face analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.
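
    A close modern analogue of the feature-point tracking step is pyramidal (hierarchical) Lucas-Kanade optical flow as implemented in OpenCV (the opencv-python package is assumed). The sketch below tracks points between two synthetic frames and is an illustration of the technique, not the authors' algorithm:

```python
# Pyramidal Lucas-Kanade feature-point tracking between two frames.
import cv2
import numpy as np

# Two synthetic frames: a bright blob that moves by (+3, +2) pixels.
prev = np.zeros((120, 120), np.uint8); cv2.circle(prev, (40, 40), 6, 255, -1)
curr = np.zeros((120, 120), np.uint8); cv2.circle(curr, (43, 42), 6, 255, -1)

# Pick trackable points in the first frame ...
pts = cv2.goodFeaturesToTrack(prev, maxCorners=20, qualityLevel=0.01,
                              minDistance=5)
# ... and follow them into the next frame with pyramidal Lucas-Kanade.
new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None,
                                                winSize=(15, 15), maxLevel=3)
for p0, p1 in zip(pts[status == 1], new_pts[status == 1]):
    print("moved", p1 - p0)     # approximately (+3, +2) per tracked point
```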

  2. Automated image analysis for space debris identification and astrometric measurements

    Science.gov (United States)

    Piattoni, Jacopo; Ceruti, Alessandro; Piergentili, Fabrizio

    2014-10-01

    Space debris is a challenging problem for human activity in space. Observation campaigns are conducted around the globe to detect and track uncontrolled space objects. One of the main problems in optical observation is obtaining useful information about the debris dynamical state from the images collected. For orbit determination, the most relevant information embedded in optical observation is the precise angular position, which can be evaluated by astrometry procedures, comparing the stars inside the image with star catalogs. This is typically a time-consuming process if done by a human operator, which makes the task impractical when dealing with large amounts of data, on the order of thousands of images per night, generated by routinely conducted observations. An automated procedure is investigated in this paper that is capable of recognizing the debris track inside a picture, calculating the celestial coordinates of the image's center and using this information to compute the debris angular position in the sky. This procedure has been implemented in a software code that does not require human interaction and works without any supplemental information besides the image itself, detecting space objects and solving for their angular position without a priori information. The algorithm for object detection was developed inside the research team. For the star field computation, the astrometry.net software, released under the GPL v2 license, was used. The complete procedure was validated by extensive testing, using images obtained in the observation campaign performed in a joint project between the Italian Space Agency (ASI) and the University of Bologna at the Broglio Space Center, Kenya.

  3. Wireless Networks Technologies in Home Automation Networks

    Institute of Scientific and Technical Information of China (English)

    常东来; 江亿

    2001-01-01

    Home automation networks (HAN) are a new kind of product that has emerged as the information age requires. This paper introduces recent advances in wireless network technologies applied in HAN, and includes an outlook on the development of the future digital home.

  4. Networked Software’ Performance Metrics and Analysis with IBM SVC Config Advisor

    Directory of Open Access Journals (Sweden)

    Mr. Kushal S. Patel

    2014-06-01

    Full Text Available IBM SVC is a virtualization product by IBM which deals with storage virtualization. In IBM SVC, a number of heterogeneous hosts are connected by means of a high-speed communication network. Managing the configuration of these networked components is a difficult task, so automating the configuration check is needed. Hence the IBM SVC Config Advisor tool was developed by IBM. This tool performs remote configuration checks for storage setups including IBM SVC and Storwize products. This paper introduces the IBM SVC Config Advisor tool along with performance statistics. It mainly deals with the collection and analysis of the networked tool's performance statistics, using IBM SVC Config Advisor as the networked tool for analysis. The paper can be useful for analysing software that is highly network-dependent in nature.

  5. Multilayer motif analysis of brain networks

    CERN Document Server

    Battiston, Federico; Chavez, Mario; Latora, Vito

    2016-01-01

    In the last decade network science has shed new light on the anatomical connectivity and on correlations in the activity of different areas of the human brain. The study of brain networks has in fact made it possible to detect the central areas of a neural system, and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on structural and functional networks as separate entities. The recently developed mathematical framework of multi-layer networks makes it possible to perform a multiplex analysis of the human brain in which the structural and functional layers are considered at the same time. In this work we describe how to classify subgraphs in multiplex networks, and we extend motif analysis to networks with many layers. We then extract multi-layer motifs in brain networks of healthy subjects by considering networks with two layers, respectively obtained from diffusion and functional magnetic resonance imaging. Results...

  6. 3rd International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2014-01-01

    This volume compiles the major results of conference participants from the "Third International Conference in Network Analysis" held at the Higher School of Economics, Nizhny Novgorod in May 2013, with the aim of initiating further joint research among different groups. The contributions in this book cover a broad range of topics relevant to the theory and practice of network analysis, including the reliability of complex networks, software, theory, methodology, and applications. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network has brought together researchers and practitioners from numerous fields such as operations research, computer science, transportation, energy, biomedicine, computational neuroscience and social sciences. In addition, new approaches and computer environments such as parallel computing, grid computing, cloud computing, and quantum computing have helped to solve large-scale...

  7. SOCIAL NETWORK ANALYSIS IN AN ONLINE BLOGOSPHERE

    Directory of Open Access Journals (Sweden)

    DR. M. MOHAMED SATHIK

    2011-01-01

    Full Text Available A social network is a social structure that exists among individuals or organizations with similar interests, or on relations such as friendship. Social network analysis is the measurement of relationships between people, organizations and processing entities. In today's scenario, the Internet acts as an interface for people spread across the globe to exchange their ideas. Social network analysis is a Web 2.0 application which facilitates users to interact, respond and express their views. Social network analysis has received great attention as a research area in computer science in the recent past. Blogs or weblogs are like catalogs that are maintained by individuals on a particular topic of interest. In this problem, we analyze blog responses posted by AIDS patients over a period of time as social networks.

  8. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1.) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2.) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex networks analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.

  9. Consistency Analysis of Network Traffic Repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for var...

  10. Automating Mid- and Long-Range Scheduling for NASA's Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Sorensen, Sugi; Tay, Peter; Carruth, Butch; Coffman, Adam; Wallace, Mike

    2012-01-01

    NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S(sup 3). This system is architected as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users who utilize the DSN (representing 37 projects including international partners and ground-based science and calibration users). The initial implementation of S(sup 3) is complete and the system has been operational since July 2011. S(sup 3) has been used for negotiating schedules since April 2011, including the baseline schedules for three launching missions in late 2011. S(sup 3) supports a distributed scheduling model, in which changes can potentially be made by multiple users based on multiple schedule "workspaces" or versions of the schedule. This has led to several challenges in the design of the scheduling database, and of a change proposal workflow that allows users to concur with or to reject proposed schedule changes, and then counter-propose with alternative or additional suggested changes. This paper describes some key aspects of the S(sup 3) system and lessons learned from its operational deployment to date, focusing on the challenges of multi-user collaborative scheduling in a practical and mission-critical setting. We will also describe the ongoing project to extend S(sup 3) to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.

  11. Spectrum-Based and Collaborative Network Topology Analysis and Visualization

    Science.gov (United States)

    Hu, Xianlin

    2013-01-01

    Networks are of significant importance in many application domains, such as World Wide Web and social networks, which often embed rich topological information. Since network topology captures the organization of network nodes and links, studying network topology is very important to network analysis. In this dissertation, we study networks by…

  12. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  13. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    Science.gov (United States)

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection, when hundreds of samples must be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, and all steps of the optimized magnetic bead-based protocol were automated, from endoglycosidase digestion through fluorophore labeling and cleanup, with high-throughput sample processing in 96-well plate format using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid separations compatible with the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences.

  14. Analysis and Testing of Mobile Wireless Networks

    Science.gov (United States)

    Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless subnetworks can increase performance. Wireless network components can be set to different radio-frequency hopping sequences or spreading functions, allowing more than one subnetwork to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used in the design of more robust networks which have multiple layers of wireless data communication paths and provide increased overall throughput.

  15. Equilibrium Analysis for Anycast in WDM Networks

    Institute of Scientific and Technical Information of China (English)

    唐矛宁; 王汉兴

    2005-01-01

    In this paper, the wavelength-routed WDM network was analyzed for the dynamic case where the arrival of anycast requests was modeled by a state-dependent Poisson process. The equilibrium analysis was also given with the UWNC algorithm.

  16. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.;

    2003-01-01

    GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization, statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user.

  17. Extending Stochastic Network Calculus to Loss Analysis

    Directory of Open Access Journals (Sweden)

    Chao Luo

    2013-01-01

    Full Text Available Loss is an important parameter of Quality of Service (QoS. Though stochastic network calculus is a very useful tool for performance evaluation of computer networks, existing studies on stochastic service guarantees mainly focused on the delay and backlog. Some efforts have been made to analyse loss by deterministic network calculus, but there are few results to extend stochastic network calculus for loss analysis. In this paper, we introduce a new parameter named loss factor into stochastic network calculus and then derive the loss bound through the existing arrival curve and service curve via this parameter. We then prove that our result is suitable for the networks with multiple input flows. Simulations show the impact of buffer size, arrival traffic, and service on the loss factor.

  18. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  19. Infrascope: Full-Spectrum Phonocardiography with Automated Signal Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Using digital signal analysis tools, we will generate a repeatable output from the infrascope and compare it to the output of a traditional electrocardiogram, and...

  20. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  1. Implicit media frames: Automated analysis of public debate on artificial sweeteners

    CERN Document Server

    Hellsten, Iina; Leydesdorff, Loet

    2010-01-01

    The framing of issues in the mass media plays a crucial role in the public understanding of science and technology. This article contributes to research concerned with diachronic analysis of media frames by making an analytical distinction between implicit and explicit media frames, and by introducing an automated method for analysing diachronic changes of implicit frames. In particular, we apply a semantic maps method to a case study on the newspaper debate about artificial sweeteners, published in The New York Times (NYT) between 1980 and 2006. Our results show that the analysis of semantic changes enables us to filter out the dynamics of implicit frames, and to detect emerging metaphors in public debates. Theoretically, we discuss the relation between implicit frames in public debates and codification of information in scientific discourses, and suggest further avenues for research interested in the automated analysis of frame changes and trends in public debates.

  2. Forecasting macroeconomic variables using neural network models and three automated model selection techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2016-01-01

    When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. To alleviate the problem, White (2006) presented a solution (QuickNet) that...

  3. Automated development of artificial neural networks for clinical purposes: Application for predicting the outcome of choledocholithiasis surgery.

    Science.gov (United States)

    Vukicevic, Arso M; Stojadinovic, Miroslav; Radovic, Milos; Djordjevic, Milena; Cirkovic, Bojana Andjelkovic; Pejovic, Tomislav; Jovicic, Gordana; Filipovic, Nenad

    2016-08-01

    Among various expert systems (ES), the Artificial Neural Network (ANN) has been shown to be suitable for the diagnosis of concurrent common bile duct stones (CBDS) in patients undergoing elective cholecystectomy. However, their application in practice remains limited, since the development of ANNs is a slow process that requires additional expertise from potential users. The aim of this study was to propose an ES for the automated development of ANNs and validate its performance on the problem of predicting CBDS. Automated development of the ANN was achieved by applying the evolutionary assembling approach, which assumes optimal configuration of the ANN parameters using a genetic algorithm. Automated selection of optimal features for ANN training was performed using a backward sequential feature selection algorithm. The assessment of the developed ANN included the evaluation of predictive ability and clinical utility. For these purposes, we collected data from 303 patients who underwent surgery in the period from 2008 to 2014. The results showed that total bilirubin, alanine aminotransferase, common bile duct diameter, number of stones, size of the smallest calculus, biliary colic, acute cholecystitis and pancreatitis had the best prognostic value for CBDS. Compared to the alternative approaches, the ANN obtained by the proposed ES had better sensitivity and clinical utility, which are considered to be the most important for this particular problem. Besides enabling the development of ANNs with better performance, the proposed ES significantly reduced the complexity of ANN development compared to previous studies that required manual selection of optimal features and/or ANN configuration. Therefore, it is concluded that the proposed ES represents a robust and user-friendly framework that, apart from the prediction of CBDS, could advance and simplify the application of ANNs to a wider range of problems. PMID:27261565
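The two building blocks named in the abstract, backward sequential feature selection and a feed-forward ANN, can be sketched with scikit-learn on synthetic data. The study's clinical dataset and its genetic-algorithm configuration step are not reproduced here; this is only a minimal analogue.

```python
# Backward feature elimination wrapped around a small neural network,
# on synthetic data standing in for the clinical features.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                           random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)

# Backward elimination: start from all features, drop the least useful ones.
selector = SequentialFeatureSelector(ann, n_features_to_select=4,
                                     direction="backward", cv=3)
selector.fit(X, y)
print("selected feature indices:", selector.get_support(indices=True))

score = cross_val_score(ann, selector.transform(X), y, cv=3).mean()
print(f"cross-validated accuracy on selected features: {score:.2f}")
```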

  4. Automative Multi Classifier Framework for Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    R. Edbert Rajan

    2015-04-01

    Full Text Available Medical image processing is the technique used to create images of the human body for medical purposes. Nowadays, medical image processing plays a major role and offers a challenging solution for critical stages in the medical pipeline. Several studies in this area have sought to enhance techniques for medical image processing, yet shortcomings of some advanced technologies leave many aspects in need of further development. An existing study evaluates the efficacy of medical image analysis using level-set shape together with fractal texture and intensity features to discriminate PF (Posterior Fossa) tumor from other tissues in brain images. To advance medical image analysis and disease diagnosis, we devise an automated subjective optimality model for segmentation of images based on different sets of selected features from the unsupervised learning model of extracted features. After segmentation, classification of images is performed. The classification is processed by adapting the multiple classifier framework of previous work, based on the mutual information coefficient of the features selected for the image segmentation procedures. In this study, to enhance the classification strategy, we implement an enhanced multi classifier framework for the analysis of medical images and disease diagnosis. The performance parameters used for the analysis of the proposed enhanced multi classifier framework for medical image analysis are multiple-class intensity, image quality, and time consumption.

  5. Forecasting Macroeconomic Variables using Neural Network Models and Three Automated Model Selection Techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    ...previous studies have indicated. When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. In fact, their parameters are not even globally...

  6. AMAB: Automated measurement and analysis of body motion

    NARCIS (Netherlands)

    Poppe, Ronald; Zee, van der Sophie; Heylen, Dirk K.J.; Taylor, Paul J.

    2014-01-01

    Technologies that measure human nonverbal behavior have existed for some time, and their use in the analysis of social behavior has become more popular following the development of sensor technologies that record full-body movement. However, a standardized methodology to efficiently represent and an...

  7. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Franqueira, Virginia N.L.; Tun, Thein Tan; Wieringa, Roel J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about atta...

  8. Automated analysis of three-dimensional stress echocardiography

    NARCIS (Netherlands)

    K.Y.E. Leung (Esther); M. van Stralen (Marijn); M.G. Danilouchkine (Mikhail); G. van Burken (Gerard); M.L. Geleijnse (Marcel); J.H.C. Reiber (Johan); N. de Jong (Nico); A.F.W. van der Steen (Ton); J.G. Bosch (Johan)

    2011-01-01

    textabstractReal-time three-dimensional (3D) ultrasound imaging has been proposed as an alternative to two-dimensional stress echocardiography for assessing myocardial dysfunction and underlying coronary artery disease. Analysis of 3D stress echocardiography is no simple task and requires considera...

  9. Analysis of the automated systems of planning of spatial constructions

    Directory of Open Access Journals (Sweden)

    М.С. Барабаш

    2004-04-01

    Full Text Available The article addresses the analysis of existing CAD (SAPR) systems and the development of new information technologies for design, based on integrating software packages around a unified information-logical model of the object.

  10. 1st International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2013-01-01

    This volume contains a selection of contributions from the "First International Conference in Network Analysis," held at the University of Florida, Gainesville, on December 14-16, 2011. The remarkable diversity of fields that take advantage of Network Analysis makes the endeavor of gathering up-to-date material in a single compilation a useful, yet very difficult, task. The purpose of this volume is to overcome this difficulty by collecting the major results found by the participants and combining them in one easily accessible compilation. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network is bringing together researchers, practitioners and other scientific communities from numerous fields such as Operations Research, Computer Science, Transportation, Energy, Social Sciences, and more. The contributions not only come from different fields, but also cover a broad range of topics relevant to the...

  11. Visualization and Analysis of Complex Covert Networks

    DEFF Research Database (Denmark)

    Memon, Bisharat

    This report discusses and summarizes the results of my work so far in relation to my Ph.D. project entitled "Visualization and Analysis of Complex Covert Networks". The focus of my research is primarily on the development of methods and supporting tools for visualization and analysis of networked systems that are covert and hence inherently complex. My Ph.D. is positioned within the wider framework of the CrimeFighter project. The framework envisions a number of key knowledge management processes that are involved in the workflow, and the toolbox provides supporting tools to assist human end-users (intelligence analysts) in harvesting, filtering, storing, managing, structuring, mining, analyzing, interpreting, and visualizing data about offensive networks. The methods and tools proposed and discussed in this work can also be applied to the analysis of more generic complex networks.

  12. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Science.gov (United States)

    Mallard, François; Le Bourlot, Vincent; Tully, Thomas

    2013-01-01

    1. Because of recent technological improvements in the performance of computers and digital cameras, the potential of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use remains limited by difficulties in the automation of image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms. It can be used in other applications. 3. The method consists in comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background. This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method has been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms. PMID:23734199
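The core background-subtraction idea translates into a short sketch: estimate the fixed background as the per-pixel median over frames, then count sufficiently large foreground components. The frames, threshold and minimum size below are hypothetical; the authors' ImageJ plugin is considerably more elaborate.

```python
# Median background estimation and connected-component counting.
import numpy as np
from scipy import ndimage

def count_moving_organisms(frames, threshold=30, min_size=5):
    """frames: sequence of grayscale images (H, W) of the same microcosm."""
    stack = np.stack(frames).astype(float)
    background = np.median(stack, axis=0)            # fixed substrate estimate
    counts = []
    for frame in stack:
        foreground = np.abs(frame - background) > threshold
        labels, n = ndimage.label(foreground)        # connected components
        sizes = ndimage.sum(foreground, labels, range(1, n + 1))
        counts.append(int(np.sum(sizes >= min_size)))  # ignore noise specks
    return counts

# Synthetic example: a bright 'organism' moving across a noisy field.
rng = np.random.default_rng(0)
frames = []
for i in range(5):
    img = rng.normal(100, 3, (64, 64))
    img[10 + 5 * i : 14 + 5 * i, 20:24] += 80        # moving particle
    frames.append(img)
print(count_moving_organisms(frames))
```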

  13. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Directory of Open Access Journals (Sweden)

    François Mallard

    Full Text Available 1. Because of recent technological improvements in the performance of computers and digital cameras, the potential of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use remains limited by difficulties in the automation of image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms. It can be used in other applications. 3. The method consists in comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background. This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method has been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms.

  14. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    Full Text Available To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder Transitional Cell Carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon ion laser beams. The fluorescent signal intensity measured by the genetic analyzer determined the product size in terms of base pairs. We found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides that alters the original normal locus size) in all the patients, using fluorescent microsatellite analysis and an automated analyzing system. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, faster and more easily, and with higher scientific accuracy.

  15. Automated Performance Monitoring Data Analysis and Reporting within the Open Source R Environment

    Science.gov (United States)

    Kennel, J.; Tonkin, M. J.; Faught, W.; Lee, A.; Biebesheimer, F.

    2013-12-01

    Environmental scientists encounter quantities of data at a rate that in many cases outpaces our ability to appropriately store, visualize and convey the information. The free software environment, R, provides a framework for efficiently processing, analyzing, depicting and reporting on data from a multitude of formats in the form of traceable and quality-assured data summary reports. Automated data summary reporting leverages document markup languages such as markdown, HTML, or LaTeX using R-scripts capable of completing a variety of simple or sophisticated data processing, analysis and visualization tasks. Automated data summary reports seamlessly integrate analysis into report production with calculation outputs - such as plots, maps and statistics - included alongside report text. Once a site-specific template is set up, including data types, geographic base data and reporting requirements, reports can be (re-)generated trivially as the data evolve. The automated data summary report can be a stand-alone report, or it can be incorporated as an attachment to an interpretive report prepared by a subject-matter expert, thereby providing the technical basis to report on and efficiently evaluate large volumes of data resulting in a concise interpretive report. Hence, the data summary report does not replace the scientist, but relieves them of repetitive data processing tasks, facilitating a greater level of analysis. This is demonstrated using an implementation developed for monthly groundwater data reporting for a multi-constituent contaminated site, highlighting selected analysis techniques that can be easily incorporated in a data summary report.
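The record describes an R workflow; purely as an illustration of the regenerate-on-demand pattern (a script re-renders the report whenever the data evolve), an analogous Python sketch rendering a markdown summary from a monitoring table might look as follows. The file name and column names are invented.

```python
# Regenerable data-summary report: rerun after each sampling round.
import pandas as pd

def render_report(csv_path, out_path="summary.md"):
    df = pd.read_csv(csv_path, parse_dates=["date"])   # hypothetical columns
    lines = ["# Groundwater monitoring summary", ""]
    for well, grp in df.groupby("well"):
        stats = grp["concentration"].describe()
        lines += [f"## Well {well}",
                  f"- samples: {int(stats['count'])}",
                  f"- mean: {stats['mean']:.2f} mg/L",
                  f"- max:  {stats['max']:.2f} mg/L",
                  ""]
    with open(out_path, "w") as fh:
        fh.write("\n".join(lines))

# render_report("monthly_data.csv") regenerates the report without
# any manual editing, mirroring the traceable workflow described above.
```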

  16. Automated analysis of protein subcellular location in time series images

    OpenAIRE

    Hu, Yanhua; Osuna-Highley, Elvira; Hua, Juchang; Nowicki, Theodore Scott; Stolz, Robert; McKayle, Camille; Murphy, Robert F.

    2010-01-01

    Motivation: Image analysis, machine learning and statistical modeling have become well established for the automatic recognition and comparison of the subcellular locations of proteins in microscope images. By using a comprehensive set of features describing static images, major subcellular patterns can be distinguished with near perfect accuracy. We now extend this work to time series images, which contain both spatial and temporal information. The goal is to use temporal features to improve...

  17. Social Networks Analysis in Discovering the Narrative Structure of Literary Fiction

    CERN Document Server

    Jarynowski, Andrzej

    2016-01-01

    In our paper we would like to make a cross-disciplinary leap and use the tools of network theory to understand and explore narrative structure in literary fiction, an approach that is still underestimated. However, the systems in fiction are sensitive to readers' subjectivity, and attention must be paid to different methods of extracting networks. The project aims at investigating the different ways social interactions are read in texts by comparing networks produced by automated algorithms (natural language processing) with those created by surveying more subjective human responses. Conversation networks from fiction have already been extracted by scientists, but the more general framework surrounding these interactions was missing. We propose several NLP methods for detecting interactions and test them against a range of human perceptions. In doing so, we have pointed to some limitations of using network analysis to test literary theory, e.g. interaction, which corresponds to the plot, does not form a climax.
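One of the simpler extraction strategies in this spirit links characters that co-occur within a sliding window of sentences. The sketch below, with an invented character list and text, illustrates that baseline; the paper's NLP detectors are richer and are compared against human readings.

```python
# Character co-occurrence network from a toy text.
import itertools
import networkx as nx

def cooccurrence_network(sentences, characters, window=2):
    g = nx.Graph()
    g.add_nodes_from(characters)
    for i in range(len(sentences)):
        chunk = " ".join(sentences[i:i + window])
        present = [c for c in characters if c in chunk]
        for a, b in itertools.combinations(present, 2):
            w = g.get_edge_data(a, b, {"weight": 0})["weight"]
            g.add_edge(a, b, weight=w + 1)           # accumulate interactions
    return g

sentences = ["Anna met Boris at the station.",
             "Boris avoided Clara all evening.",
             "Clara wrote to Anna the next day."]
g = cooccurrence_network(sentences, ["Anna", "Boris", "Clara"])
print(g.edges(data=True))
```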

  18. Radial basis function (RBF) neural network control for mechanical systems design, analysis and Matlab simulation

    CERN Document Server

    Liu, Jinkun

    2013-01-01

    Radial Basis Function (RBF) Neural Network Control for Mechanical Systems is motivated by the need for systematic design approaches to stable adaptive control system design using neural network approximation-based techniques. The main objectives of the book are to introduce the concrete design methods and MATLAB simulation of stable adaptive RBF neural control strategies. In this book, a broad range of implementable neural network control design methods for mechanical systems are presented, such as robot manipulators, inverted pendulums, single link flexible joint robots, motors, etc. Advanced neural network controller design methods and their stability analysis are explored. The book provides readers with the fundamentals of neural network control system design.   This book is intended for the researchers in the fields of neural adaptive control, mechanical systems, Matlab simulation, engineering design, robotics and automation. Jinkun Liu is a professor at Beijing University of Aeronautics and Astronauti...
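The approximator at the heart of such controllers can be illustrated in a few lines: a Gaussian RBF layer whose output weights are fit by least squares to a toy 1-D target. The book wraps this kind of approximator in adaptive control laws with stability proofs; the centers, width and target function below are arbitrary choices for illustration.

```python
# Gaussian RBF network approximating an unknown 1-D function.
import numpy as np

def rbf_features(x, centers, width):
    # Activations h_j(x) = exp(-(x - c_j)^2 / (2 * width^2))
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) + 0.1 * rng.normal(size=x.size)    # noisy 'plant' response

centers = np.linspace(-3, 3, 15)                     # fixed RBF centers
H = rbf_features(x, centers, width=0.5)
w, *_ = np.linalg.lstsq(H, y, rcond=None)            # output-layer weights

y_hat = H @ w
print(f"max approximation error: {np.max(np.abs(y_hat - y)):.3f}")
```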

  19. Automated analysis of image mammogram for breast cancer diagnosis

    Science.gov (United States)

    Nurhasanah; Sampurno, Joko; Faryuni, Irfana Diah; Ivansyah, Okto

    2016-03-01

    Medical imaging helps doctors diagnose and detect diseases that attack the inside of the body without surgery. A mammogram is a medical image of the inner breast. Diagnosis of breast cancer needs to be done in detail and as soon as possible to determine the next medical treatment. The aim of this work is to increase the objectivity of clinical diagnosis by using fractal analysis. This study applies a fractal method based on 2D Fourier analysis to determine the density of normal and abnormal tissue, and applies a segmentation technique based on the K-Means clustering algorithm to abnormal images to determine organ boundaries and calculate the area of the segmented organ. The results show that the fractal method based on 2D Fourier analysis can be used to distinguish between normal and abnormal breasts, and that the segmentation technique with the K-Means clustering algorithm is able to generate the boundaries of normal and abnormal tissue organs, so that the area of abnormal tissue can be determined.
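The segmentation step can be sketched as K-Means clustering of pixel intensities, taking the brightest cluster as candidate abnormal tissue and measuring its area. The image below is synthetic and the cluster count is an illustrative choice, not the study's configuration.

```python
# K-Means intensity segmentation of a synthetic mammogram-like image.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.normal(0.3, 0.05, (64, 64))
image[20:35, 25:40] = rng.normal(0.8, 0.05, (15, 15))   # bright 'lesion'

pixels = image.reshape(-1, 1)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
labels = km.labels_.reshape(image.shape)

bright = np.argmax(km.cluster_centers_.ravel())          # brightest cluster
mask = labels == bright
print(f"segmented area: {mask.sum()} pixels "
      f"({100 * mask.mean():.1f}% of the image)")
```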

  20. Instrumentation, Field Network and Process Automation for the Cryogenic System of the LHC Test String

    OpenAIRE

    Suraci, A.; Bager, T.; Balle, Ch.; Blanco, E.; Casas, J.; Gomes, P.; Pelletier, S.; Serio, L.; Vauthier, N.

    2001-01-01

    CERN is now setting up String 2, a full-size prototype of a regular cell of the LHC arc. It is composed of two quadrupole magnets, six dipole magnets, and a separate cryogenic distribution line (QRL) for the supply and recovery of the cryogen. An electrical feed box (DFB), with up to 38 High Temperature Superconducting (HTS) leads, powers the magnets. About 700 sensors and actuators are distributed along four Profibus DP and two Profibus PA field buses. The process automation is handled by two contro...

  1. Analysis and applications of complex networks

    Science.gov (United States)

    Bagrow, James Peter

    This thesis is concerned with three main areas of complex networks research. One is developing and testing new methods to find communities, especially methods that do not need knowledge of the entire network. The second is the application of shells and their usage in characterizing and identifying important network properties. Finally, we offer several contributions toward the usage of complex networks as a tool for studying social dynamics. The study of communities, densely interconnected subsets of nodes, is a difficult and important problem. Methods to identify communities are developed which have the rare ability to function with only local knowledge of the network. A new benchmarking and evaluation procedure is introduced to compare the performance of both existing and new local community algorithms. Using shells, we introduce a new matrix structure that allows for quantitative comparison and visualization of networks of all sizes, even extremely large ones. This "portrait" encodes a great deal of information including dimensionality and regularity, and imparts immediate intuition about the network at hand. A distance metric generated by comparing two portraits allows one to test if, e.g., two networks were created by the same underlying generating mechanism. Generalizations to weighted networks are studied, as is applicability to the Graph Isomorphism problem. We introduce the Patron-Artwork model as a new means of generating a distribution of fame or knowledge from an underlying social network, and give a full analysis for a network where all members are neighbors. In addition, the so-called Small World Phenomenon has been studied in the context of social networks, specifically that of Kleinberg navigation. We studied the impact of modifying the underlying Kleinberg lattice by introducing an anisotropy: the lattice is either stretched along one axis or long-distance connections are made more favorable along a preferred direction.

  2. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    Science.gov (United States)

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucous membrane covering the sclera of the eye, with the unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least squares regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method's discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692

  3. An Empirical Study on the Impact of Automation on the Requirements Analysis Process

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lami; Robert W. Ferguson

    2007-01-01

    Requirements analysis is an important phase in a software project. The analysis is often performed in an informal way by specialists who review documents looking for ambiguities, technical inconsistencies and incomplete parts. Automation is still far from being applied in requirements analyses, above all since natural languages are informal and thus difficult to treat automatically. There are only a few tools that can analyse texts. One of them, called QuARS, was developed by the Istituto di Scienza e Tecnologie dell'Informazione and can analyse texts in terms of ambiguity. This paper describes how QuARS was used in a formal empirical experiment to assess the impact, in terms of effectiveness and efficacy, of the automation in the requirements review process of a software company.

  4. A Multi-Wavelength Analysis of Active Regions and Sunspots by Comparison of Automated Detection Algorithms

    OpenAIRE

    Verbeeck, Cis; Higgins, Paul A.; Colak, Tufan; Watson, Fraser T.; Delouille, Veronique; Mampaey, Benjamin; Qahwaji, Rami

    2011-01-01

    Since the Solar Dynamics Observatory (SDO) began recording ~ 1 TB of data per day, there has been an increased need to automatically extract features and events for further analysis. Here we compare the overall detection performance, correlations between extracted properties, and usability for feature tracking of four solar feature-detection algorithms: the Solar Monitor Active Region Tracker (SMART) detects active regions in line-of-sight magnetograms; the Automated Solar Activity Prediction...

  5. SCHUBOT: Machine Learning Tools for the Automated Analysis of Schubert’s Lieder

    OpenAIRE

    Nagler, Dylan Jeremy

    2014-01-01

    This paper compares various methods for automated musical analysis, applying machine learning techniques to gain insight about the Lieder (art songs) of composer Franz Schubert (1797-1828). Known as a rule-breaking, individualistic, and adventurous composer, Schubert produced hundreds of emotionally-charged songs that have challenged music theorists to this day. The algorithms presented in this paper analyze the harmonies, melodies, and texts of these songs. This paper begins with an explor...

  6. Pharmacokinetic analysis of topical tobramycin in equine tears by automated immunoassay

    OpenAIRE

    Czerwinski, Sarah L; Lyon, Andrew W; Skorobohach, Brian; Léguillette, Renaud

    2012-01-01

    Abstract Background Ophthalmic antibiotic therapy in large animals is often used empirically because of the lack of pharmacokinetics studies. The purpose of the study was to determine the pharmacokinetics of topical tobramycin 0.3% ophthalmic solution in the tears of normal horses using an automated immunoassay analysis. Results The mean tobramycin concentrations in the tears at 5, 10, 15, 30 minutes and 1, 2, 4, 6 hours after administration were 759 (±414), 489 (±237), 346 (±227), 147 (±264)...

  7. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
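A stripped-down version of the wavelet idea: treat the coarse wavelet approximation of a traffic feature as "normal" behavior and flag samples with large residuals. This sketch uses PyWavelets on an invented series with a 3-sigma rule; the paper's system combines fifteen features with system identification and is far more complete.

```python
# Wavelet-approximation baseline with residual-based anomaly flags.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(512)
traffic = 100 + 20 * np.sin(2 * np.pi * t / 128) + rng.normal(0, 2, t.size)
traffic[300:305] += 60                     # injected anomaly (e.g., a flood)

# Keep the level-4 approximation, zero out the detail coefficients.
coeffs = pywt.wavedec(traffic, "db4", level=4)
coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
baseline = pywt.waverec(coeffs, "db4")[: traffic.size]

residual = traffic - baseline
flags = np.abs(residual) > 3 * np.std(residual)
print("anomalous sample indices:", np.where(flags)[0])
```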

  8. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.

  9. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models... We assessed the usefulness of the C5.0 Machine Learning Algorithm (MLA) in the classification of computer network traffic and showed that the application-layer payload is not needed to train the C5.0 classifier. We compared various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks, and to create realistic traffic profiles of the selected applications, which can serve as the training data for MLAs.

  10. Identifying and Managing Data Validity Challenges with Automated Data Checks in the AmeriFlux Flux Measurement Network

    Science.gov (United States)

    Poindexter, C.; Pastorello, G.; Papale, D.; Trotta, C.; Ribeca, A.; Canfora, E.; Faybishenko, B.; Samak, T.; Gunter, D.; Hollowgrass, R.; Agarwal, D.

    2014-12-01

    AmeriFlux is a network of sites managed by independent investigators measuring carbon, water and heat fluxes. Individual investigators perform many data validity checks. Network-level data validity checks are also being applied to increase network-wide data consistency. A number of different types of errors occur in flux data, and while corrections have been developed to address some types of errors, other error types can be difficult to detect. To identify errors rapidly and consistently, we have developed automated data validity checks that rely on theoretical limits or relationships for specific measured variables. We present an example of a data validity check that is being developed for the friction velocity u*. The friction velocity is a crucial variable used to identify when low turbulent mixing in the atmospheric boundary layer invalidates eddy covariance measurements of fluxes. It is measured via sonic anemometer and is related to the wind speed WS, the measurement height relative to the canopy height, and the surface roughness, through the log law. Comparing independent measurements of WS and u* can help identify issues related to the sensor but doesn't take into consideration changes in the canopy (e.g. due to leaf emergence). The u* data check proposed relies on recent work comparing multiple methods for determining the aerodynamic roughness length z0 and zero-plane displacement d (Graf, A., A. van de Boer, A. Moene & H. Vereecken, 2014, Boundary-Layer Meteorol., 151, 373-387). These methods, each of which is most robust across a different atmospheric stability range, yield multiple estimates for z0 and d at daily resolution. We use these multiple estimates for z0 and d, as well as half-hourly wind speeds and Obukhov length scales and their uncertainties, to generate a predicted u* and a tolerance around this predicted value. In testing, this check correctly identified as invalid u* data known to be erroneous but did not flag data that could look...
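Stripped of the stability corrections and daily z0/d estimation described above, the core of such a check is the neutral log-law prediction u* = κ·WS / ln((z − d)/z0) with a tolerance band around it. All site parameters and the 20% tolerance below are hypothetical.

```python
# Log-law plausibility check for measured friction velocity u*.
import numpy as np

KAPPA = 0.4          # von Karman constant

def ustar_predicted(ws, z=30.0, d=18.0, z0=1.5):
    # Hypothetical tower height z, displacement d, roughness z0 (meters).
    return KAPPA * ws / np.log((z - d) / z0)

def check_ustar(ws, ustar_measured, tol=0.20):
    pred = ustar_predicted(ws)
    return np.abs(ustar_measured - pred) > tol * pred   # True = suspect

ws = np.array([2.0, 4.0, 6.0])
u_meas = np.array([0.38, 0.78, 2.50])   # last value deliberately bad
print(check_ustar(ws, u_meas))          # expect the third value flagged
```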

  11. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  12. Development of automated preparation system for isotopocule analysis of N2O in various air samples

    Science.gov (United States)

    Toyoda, Sakae; Yoshida, Naohiro

    2016-05-01

    Nitrous oxide (N2O), an increasingly abundant greenhouse gas in the atmosphere, is the most important stratospheric ozone-depleting gas of this century. Natural abundance ratios of isotopocules of N2O, NNO molecules substituted with stable isotopes of nitrogen and oxygen, are a promising index of various sources or production pathways of N2O and of its sink or decomposition pathways. Several automated methods have been reported to improve the analytical precision for the isotopocule ratio of atmospheric N2O and to reduce the labor necessary for complicated sample preparation procedures related to mass spectrometric analysis. However, no method accommodates flask samples with limited volume or pressure. Here we present an automated preconcentration system which offers flexibility with respect to the available gas volume, pressure, and N2O concentration. The shortest processing time for a single analysis of typical atmospheric sample is 40 min. Precision values of isotopocule ratio analysis are < 0.1 ‰ for δ15Nbulk (average abundances of 14N15N16O and 15N14N16O relative to 14N14N16O), < 0.2 ‰ for δ18O (relative abundance of 14N14N18O), and < 0.5 ‰ for site preference (SP; difference between relative abundance of 14N15N16O and 15N14N16O). This precision is comparable to that of other automated systems, but better than that of our previously reported manual measurement system.

  13. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  14. Tool for automated method design in activation analysis

    International Nuclear Information System (INIS)

    A computational approach to the optimization of the adjustable parameters of nuclear activation analysis has been developed for use in comprehensive method design calculations. An estimate of sample composition is used to predict the gamma-ray spectra to be expected for given sets of values of experimental parameters. These spectra are used to evaluate responses such as detection limits and measurement precision for application to optimization by the simplex method. This technique has been successfully implemented for the simultaneous determination of sample size and irradiation, decay and counting times by the optimization of either detection limit or precision. Both single-element and multielement determinations can be designed with the aid of these calculations. The combination of advance prediction and simplex optimization is both flexible and efficient and produces numerical results suitable for use in further computations
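The optimization pattern described, simplex search over adjustable experimental parameters against a predicted response, can be sketched with SciPy's Nelder-Mead implementation. The response model below is a toy stand-in for the spectrum-based detection-limit prediction, with an assumed fixed 8-hour time budget so that an interior optimum exists.

```python
# Simplex (Nelder-Mead) optimization of irradiation/decay times under a
# fixed total time budget; the figure of merit is a toy S/sqrt(B) model.
import numpy as np
from scipy.optimize import minimize

TOTAL = 8.0   # hours, hypothetical budget

def negative_merit(params):
    t_irr, t_decay = params
    t_count = TOTAL - t_irr - t_decay
    if t_irr <= 0 or t_decay <= 0 or t_count <= 0:
        return 1e6                                   # infeasible point
    # Toy model: analyte (2 h half-life) vs short-lived interference (0.5 h).
    signal = (1 - np.exp(-t_irr / 3)) * np.exp(-0.693 * t_decay / 2) * t_count
    background = (5 * np.exp(-0.693 * t_decay / 0.5) + 0.05) * t_count
    return -(signal / np.sqrt(background))           # maximize S/sqrt(B)

res = minimize(negative_merit, x0=[2.0, 1.0], method="Nelder-Mead")
t_irr, t_decay = res.x
print(f"irradiation {t_irr:.2f} h, decay {t_decay:.2f} h, "
      f"counting {TOTAL - t_irr - t_decay:.2f} h")
```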

  15. INVESTIGATION OF NEURAL NETWORK ALGORITHM FOR DETECTION OF NETWORK HOST ANOMALIES IN THE AUTOMATED SEARCH FOR XSS VULNERABILITIES AND SQL INJECTIONS

    Directory of Open Access Journals (Sweden)

    Y. D. Shabalin

    2016-03-01

    Full Text Available A problem of aberrant behavior detection for a network-communicating computer is discussed. A novel approach based on the dynamic response of the computer is introduced. The computer is treated as a multiple-input multiple-output (MIMO) plant. To characterize the dynamic response of the computer to incoming requests, a correlation between the input data rate and the observed output response (outgoing data rate and performance metrics) is used. To distinguish normal and aberrant behavior of the computer, a one-class neural network classifier is used. The general idea of the algorithm is shortly described. The configuration of the network testbed for experiments with real attacks and their detection (the automated search for XSS and SQL injections) is presented. Real XSS and SQL injection attack software was used to model the intrusion scenario. One would expect that aberrant behavior of the server will reveal itself by an instantaneous correlation response significantly different from any of the normal ones. It is evident that the correlation picture of attacks from different malware running, the overriding of the site homepage on the server (so-called defacing), and hardware and software failures will differ from the correlation picture of normal functioning. The intrusion detection algorithm is investigated to estimate false positive and false negative rates in relation to algorithm parameters. The importance of the correlation width and threshold value selection is emphasized. The false positive rate was estimated along the time series of experimental data. Some ideas for enhancing the quality and robustness of the algorithm are mentioned.

  16. Enhancing the Authentication of Bank Cheque Signatures by Implementing Automated System Using Recurrent Neural Network

    CERN Document Server

    Rao, Mukta; Dhaka, Vijaypal Singh

    2010-01-01

    The associative memory feature of the Hopfield-type recurrent neural network is used for pattern storage and pattern authentication. This paper outlines an optimization relaxation approach for signature verification based on the Hopfield neural network (HNN), which is a recurrent network. The standard sample signature of the customer is cross-matched with the one supplied on the cheque. The difference percentage is obtained by counting the differing pixels in the two images. The network topology is built so that each pixel in the difference image is a neuron in the network. Each neuron is characterized by its state, which in turn signifies whether the particular pixel has changed. The network converges to a stable condition based on the energy function derived in experiments. Hopfield's model allows each node to take on two binary state values (changed/unchanged) for each pixel. The performance of the proposed technique is evaluated by applying it to various binary and gray scale images. This paper con...
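A tiny numpy illustration of the Hopfield dynamics the paper relies on: a stored ±1 pattern is recovered from a corrupted copy by iterating the network to a stable state. The "pixels" here are a toy pattern, not signature images, and the synchronous update rule is a simplification.

```python
# Hopfield associative memory: Hebbian storage and iterative recall.
import numpy as np

def train_hopfield(patterns):
    # Hebbian weights; states are +/-1, no self-connections.
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(w, 0.0)
    return w / n

def recall(w, state, steps=10):
    for _ in range(steps):
        state = np.sign(w @ state)
        state[state == 0] = 1
    return state

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
w = train_hopfield(stored)
noisy = stored[0].copy()
noisy[0] = -noisy[0]                      # flip one 'pixel'
print("recovered:", recall(w, noisy))
print("matches stored:", np.array_equal(recall(w, noisy), stored[0]))
```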

  17. A new web-based method for automated analysis of muscle histology

    Directory of Open Access Journals (Sweden)

    Pertl Cordula

    2013-01-01

    Full Text Available Abstract Background Duchenne Muscular Dystrophy is an inherited degenerative neuromuscular disease characterised by rapidly progressive muscle weakness. Currently, curative treatment is not available. Approaches for new treatments that improve muscle strength and quality of life depend on preclinical testing in animal models. The mdx mouse model is the most frequently used animal model for preclinical studies in muscular dystrophy research. Standardised pathology-relevant parameters of dystrophic muscle in mdx mice for histological analysis have been developed in international, collaborative efforts, but automation has not been accessible to most research groups. A standardised and mainly automated quantitative assessment of histopathological parameters in the mdx mouse model is desirable to allow an objective comparison between laboratories. Methods Immunological and histochemical reactions were used to obtain a double staining for fast and slow myosin. Additionally, fluorescence staining of the myofibre membranes allows defining the minimal Feret's diameter. The staining of myonuclei with the fluorescence dye bisbenzimide H was utilised to identify nuclei located internally within myofibres. Relevant structures were extracted from the image as single objects and assigned to different object classes using web-based image analysis (MyoScan). Quantitative and morphometric data were analysed, e.g. the number of nuclei per fibre and the minimal Feret's diameter, in 6-month-old wild-type C57BL/10 mice and mdx mice. Results In the current version of the module "MyoScan", essential parameters for histologic analysis of muscle sections were implemented, including the minimal Feret's diameter of the myofibres and the automated calculation of the percentage of internally nucleated myofibres. Morphometric data obtained in the present study were in good agreement with previously reported data in the literature and with data obtained from manual...

  18. Kinetic analysis of complex metabolic networks

    Energy Technology Data Exchange (ETDEWEB)

    Stephanopoulos, G. [MIT, Cambridge, MA (United States)

    1996-12-31

    A new methodology is presented for the analysis of complex metabolic networks with the goal of metabolite overproduction. The objective is to locate a small number of reaction steps in a network that have maximum impact on network flux amplification and whose rate can also be increased without functional network derangement. This method extends the concepts of Metabolic Control Analysis to groups of reactions and offers the means for calculating group control coefficients as measures of the control exercised by groups of reactions on the overall network fluxes and intracellular metabolite pools. It is further demonstrated that the optimal strategy for the effective increase of network fluxes, while maintaining an uninterrupted supply of intermediate metabolites, is through the coordinated amplification of multiple (as opposed to a single) reaction steps. Satisfying this requirement invokes the concept of the concentration control coefficient, which emerges as a critical parameter in the identification of feasible enzymatic modifications with maximal impact on the network flux. A case study of aromatic amino acid production is provided to illustrate these concepts.

  19. Automation of Axisymmetric Drop Shape Analysis Using Digital Image Processing

    Science.gov (United States)

    Cheng, Philip Wing Ping

    The Axisymmetric Drop Shape Analysis - Profile (ADSA-P) technique, as initiated by Rotenberg, is a user-oriented scheme to determine liquid-fluid interfacial tensions and contact angles from the shape of axisymmetric menisci, i.e., from sessile as well as pendant drops. The ADSA-P program requires as input several coordinate points along the drop profile, the value of the density difference between the bulk phases, and gravity. The solution yields interfacial tension and contact angle. Although the ADSA-P technique was in principle complete, it was found to be of very limited practical use. The major difficulty with the method is the need for very precise coordinate points along the drop profile, which, up to now, could not be obtained readily. In the past, the coordinate points along the drop profile were obtained by manual digitization of photographs or negatives. From manual digitization data, the surface tension values obtained had an average error of +/-5% when compared with literature values. Another problem with the ADSA-P technique was that the computer program failed to converge for the case of very elongated pendant drops. To acquire the drop profile coordinates automatically, a technique which utilizes recent developments in digital image acquisition and analysis was developed. In order to determine the drop profile coordinates as precisely as possible, the errors due to optical distortions were eliminated. In addition, determination of drop profile coordinates to pixel and sub-pixel resolution was developed. It was found that high precision could be obtained through the use of sub-pixel resolution and a spline fitting method. The results obtained using the automatic digitization technique in conjunction with ADSA-P not only compared well with the conventional methods, but also outstripped the precision of conventional methods considerably. To solve the convergence problem of very elongated pendant drops, it was found that the reason for the...

  20. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    Energy Technology Data Exchange (ETDEWEB)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Filleron, Thomas; Graff, Pierre [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Jonca, Frédéric [Department of Urology, Clinique Ambroise Paré, Toulouse (France); Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard [Department of Urology and Andrology, CHU Rangueil, Toulouse (France); Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France)

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analysis researched independent predictors for bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  1. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  2. Applying centrality measures to impact analysis: A coauthorship network analysis

    CERN Document Server

    Yan, Erjia

    2010-01-01

    Many studies on coauthorship networks focus on network topology and network statistical mechanics. This article takes a different approach by studying micro-level network properties, with the aim of applying centrality measures to impact analysis. Using coauthorship data from 16 journals in the field of library and information science (LIS) with a time span of twenty years (1988-2007), we construct an evolving coauthorship network and calculate four centrality measures (closeness, betweenness, degree and PageRank) for authors in this network. We find that the four centrality measures are significantly correlated with citation counts. We also discuss the usability of centrality measures in author ranking, and suggest that centrality measures can be useful indicators for impact analysis.
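The measurement itself is easy to reproduce in miniature: compute the four centrality scores on a coauthorship graph and rank-correlate them with citation counts. The toy graph and citation numbers below are invented; the article works with twenty years of LIS coauthorship data.

```python
# Centrality measures vs. citation counts on a toy coauthorship graph.
import networkx as nx
from scipy.stats import spearmanr

g = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
              ("D", "E"), ("E", "F"), ("C", "E")])
citations = {"A": 12, "B": 8, "C": 40, "D": 15, "E": 25, "F": 3}

measures = {
    "degree": nx.degree_centrality(g),
    "closeness": nx.closeness_centrality(g),
    "betweenness": nx.betweenness_centrality(g),
    "pagerank": nx.pagerank(g),
}
authors = sorted(g.nodes)
for name, scores in measures.items():
    rho, p = spearmanr([scores[a] for a in authors],
                       [citations[a] for a in authors])
    print(f"{name:12s} rho = {rho:+.2f} (p = {p:.2f})")
```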

  3. Crawling Facebook for Social Network Analysis Purposes

    OpenAIRE

    Catanese, Salvatore A.; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo; Provetti, Alessandro

    2011-01-01

    We describe our work in the collection and analysis of massive data describing the connections between participants to online social networks. Alternative approaches to social network data collection are defined and evaluated in practice, against the popular Facebook Web site. Thanks to our ad-hoc, privacy-compliant crawlers, two large samples, comprising millions of connections, have been collected; the data is anonymous and organized as an undirected graph. We describe a set of tools that w...

  4. Social network analysis of study environment

    OpenAIRE

    Blaženka Divjak; Petra Peharda

    2010-01-01

    Student working environment influences student learning and achievement level. In this respect, social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out whether students' academic performance influences their position in different students' social networks. Further, there is a need to identify other predictors of this position. In the process of problem solving we use the Social Network Analysi...

  5. Density functional and neural network analysis

    DEFF Research Database (Denmark)

    Jalkanen, K. J.; Suhai, S.; Bohr, Henrik

    1997-01-01

    ...dichroism (VCD) intensities. The large changes due to hydration in the structures, the relative stability of conformers, and the VA and VCD spectra observed experimentally are reproduced by the DFT calculations. Furthermore, a neural network was constructed for reproducing the inverse scattering data (inferring the structural coordinates from spectroscopic data) that the DFT method could produce. Finally, the neural network performances are used to monitor a sensitivity or dependence analysis of the importance of secondary structures.

  6. Performance Analysis of Asynchronous Multicarrier Wireless Networks

    OpenAIRE

    Lin, Xingqin; Jiang, Libin; Andrews, Jeffrey G.

    2014-01-01

    This paper develops a novel analytical framework for asynchronous wireless networks deploying multicarrier transmission. Nodes in the network have different notions of timing, so from the viewpoint of a typical receiver, the received signals from different transmitters are asynchronous, leading to a loss of orthogonality between subcarriers. We first develop a detailed link-level analysis based on OFDM, based on which we propose a tractable system-level signal-to-interference-plus-noise ratio...

  7. Multilayer Analysis and Visualization of Networks

    CERN Document Server

    De Domenico, Manlio; Arenas, Alex

    2014-01-01

    Multilayer relationships among and information about biological entities must be accompanied by the means to analyze, visualize, and obtain insights from such data. We report a methodology and a collection of algorithms for the analysis of multilayer networks in our new open-source software (muxViz). We demonstrate the ability of muxViz to analyze and interactively visualize multilayer data using empirical genetic and neuronal networks.

  8. ADGEN: a system for automated sensitivity analysis of predictive models

    International Nuclear Information System (INIS)

    A system that can automatically enhance computer codes with a sensitivity calculation capability is presented. With this new system, named ADGEN, rapid and cost-effective calculation of sensitivities can be performed in any FORTRAN code for all input data or parameters. The resulting sensitivities can be used in performance assessment studies related to licensing or interactions with the public to systematically and quantitatively prove the relative importance of each of the system parameters in calculating the final performance results. A general procedure calling for the systematic use of sensitivities in assessment studies is presented. The procedure can be used in modelling and model validation studies to avoid "over modelling," in site characterization planning to avoid "over collection of data," and in performance assessment to determine the uncertainties on the final calculated results. The added capability to formally perform the inverse problem, i.e., to determine the input data or parameters on which to focus additional research or analysis effort in order to improve the uncertainty of the final results, is also discussed.

  9. Automated validation of patient safety clinical incident classification: macro analysis.

    Science.gov (United States)

    Gupta, Jaiprakash; Patrick, Jon

    2013-01-01

    Patient safety is the buzzword in healthcare. The Incident Information Management System (IIMS) is electronic software that stores clinical mishap narratives in places where patients are treated. It is estimated that in one state alone over one million electronic text documents are available in IIMS. In this paper we investigate the data density available in the fields entered to notify an incident and the validity of the built-in classification used by clinicians to categorise the incidents. Waikato Environment for Knowledge Analysis (WEKA) software was used to test the classes. Four statistical classifiers based on the J48, Naïve Bayes (NB), Naïve Bayes Multinomial (NBM) and Support Vector Machine using radial basis function (SVM_RBF) algorithms were used to validate the classes. The data pool was 10,000 clinical incidents drawn from 7 hospitals in one state in Australia. In the first part of the study 1000 clinical incidents were selected to determine the type and number of fields worth investigating, and in the second part another 5448 clinical incidents were randomly selected to validate 13 clinical incident types. Results show 74.6% of the cells were empty and only 23 fields had content over 70% of the time. The percentage of correctly classified instances for the four algorithms ranged from 42% to 49% using the categorical dataset, from 65% to 77% using the free-text dataset, and from 72% to 79% using both datasets. The kappa statistic ranged from 0.36 to 0.4 for categorical data, from 0.61 to 0.74 for free-text, and from 0.67 to 0.77 for both datasets. Similar increases in performance across the 3 experiments were noted for true positive rate, precision, F-measure and area under the curve (AUC) of receiver operating characteristic (ROC) scores. The study demonstrates only 14 of 73 fields in IIMS have data that is usable for machine learning experiments. Irrespective of the type of algorithm used, performance was better when all datasets were used. Classifier NBM showed best performance. We think the
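
    For readers who want to reproduce the flavour of this validation outside WEKA, the sketch below trains one of the four classifier types (multinomial naive Bayes) on free-text narratives and reports accuracy and Cohen's kappa using scikit-learn; the narratives and labels are invented for illustration, not drawn from the IIMS data.

      # Text classification with accuracy and kappa, mirroring the study's metrics.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.metrics import accuracy_score, cohen_kappa_score
      from sklearn.model_selection import train_test_split

      texts = ["patient fell out of bed", "wrong drug dose administered",
               "patient slipped in bathroom", "medication given twice"]
      labels = ["fall", "medication", "fall", "medication"]

      X_train, X_test, y_train, y_test = train_test_split(
          texts, labels, test_size=0.5, random_state=0, stratify=labels)

      vec = TfidfVectorizer()
      clf = MultinomialNB().fit(vec.fit_transform(X_train), y_train)
      pred = clf.predict(vec.transform(X_test))

      print("accuracy:", accuracy_score(y_test, pred))
      print("kappa:   ", cohen_kappa_score(y_test, pred))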

  10. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
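
    The agreement statistic used here has a simple closed form: for two raters coding the same epochs, PABAK = 2*p_o - 1, where p_o is the observed proportion of agreement. A minimal sketch with made-up epoch codings:

      # PABAK for binary per-epoch codings from two raters (illustrative data).
      def pabak(rater_a, rater_b):
          agree = sum(a == b for a, b in zip(rater_a, rater_b))
          p_o = agree / len(rater_a)
          return 2.0 * p_o - 1.0

      manual    = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]   # movement coded per 10-s epoch
      automated = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
      print(f"PABAK = {pabak(manual, automated):.2f}")   # 0.80 here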

  11. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    Science.gov (United States)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and a long processing time. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) we proposed can reduce the processing time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS including an integrated microfluidic device, automated stage and electrical relay for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  12. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.

  13. Automated cell colony counting and analysis using the circular Hough image transform algorithm (CHiTA)

    Energy Technology Data Exchange (ETDEWEB)

    Bewes, J M; Suchowerska, N; McKenzie, D R [School of Physics, University of Sydney, Sydney, NSW (Australia)], E-mail: jbewes@physics.usyd.edu.au

    2008-11-07

    We present an automated cell colony counting method that is flexible, robust and capable of providing more in-depth clonogenic analysis than existing manual and automated approaches. The full form of the Hough transform without approximation has been implemented, for the first time. Improvements in computing speed have facilitated this approach. Colony identification was achieved by pre-processing the raw images of the colonies in situ in the flask, including images of the flask edges, by erosion, dilation and Gaussian smoothing processes. Colony edges were then identified by intensity gradient field discrimination. Our technique eliminates the need for specialized hardware for image capture and enables the use of a standard desktop scanner for distortion-free image acquisition. Additional parameters evaluated included regional colony counts, average colony area, nearest neighbour distances and radial distribution. This spatial and qualitative information extends the utility of the clonogenic assay, allowing analysis of spatially-variant cytotoxic effects. To test the automated system, two flask types and three cell lines with different morphology, cell size and plating density were examined. A novel Monte Carlo method of simulating cell colony images, as well as manual counting, were used to quantify algorithm accuracy. The method was able to identify colonies with unusual morphology, to successfully resolve merged colonies and to correctly count colonies adjacent to flask edges.
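
    The circular Hough transform at the heart of this method is available in common image libraries; the sketch below (an illustration, not the authors' CHiTA implementation) detects circular colony outlines in a grayscale flask scan with scikit-image. The file name and radius range are assumptions.

      # Edge detection followed by a circular Hough transform, as in the
      # colony-identification step described above.
      import numpy as np
      from skimage import io, color, feature, transform

      image = color.rgb2gray(io.imread("flask_scan.png"))   # hypothetical scan
      edges = feature.canny(image, sigma=2.0)               # intensity-gradient edges

      radii = np.arange(5, 30, 2)                           # candidate colony radii (px)
      hough = transform.hough_circle(edges, radii)
      accums, cx, cy, r = transform.hough_circle_peaks(hough, radii,
                                                       total_num_peaks=50)

      for x, y, radius in zip(cx, cy, r):
          print(f"colony at ({x}, {y}), radius {radius} px")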

  14. Automated cell colony counting and analysis using the circular Hough image transform algorithm (CHiTA)

    Science.gov (United States)

    Bewes, J. M.; Suchowerska, N.; McKenzie, D. R.

    2008-11-01

    We present an automated cell colony counting method that is flexible, robust and capable of providing more in-depth clonogenic analysis than existing manual and automated approaches. The full form of the Hough transform without approximation has been implemented, for the first time. Improvements in computing speed have facilitated this approach. Colony identification was achieved by pre-processing the raw images of the colonies in situ in the flask, including images of the flask edges, by erosion, dilation and Gaussian smoothing processes. Colony edges were then identified by intensity gradient field discrimination. Our technique eliminates the need for specialized hardware for image capture and enables the use of a standard desktop scanner for distortion-free image acquisition. Additional parameters evaluated included regional colony counts, average colony area, nearest neighbour distances and radial distribution. This spatial and qualitative information extends the utility of the clonogenic assay, allowing analysis of spatially-variant cytotoxic effects. To test the automated system, two flask types and three cell lines with different morphology, cell size and plating density were examined. A novel Monte Carlo method of simulating cell colony images, as well as manual counting, were used to quantify algorithm accuracy. The method was able to identify colonies with unusual morphology, to successfully resolve merged colonies and to correctly count colonies adjacent to flask edges.

  15. Towards the Procedure Automation of Full Stochastic Spectral Based Fatigue Analysis

    Directory of Open Access Journals (Sweden)

    Khurram Shehzad

    2013-05-01

    Full Text Available Fatigue is one of the most significant failure modes for marine structures such as ships and offshore platforms. Among the numerous methods for fatigue life estimation, the spectral method is considered the most reliable owing to its ability to cater for different sea states as well as their probabilities of occurrence. However, the spectral simulation procedure itself is quite complex and numerically intensive owing to various critical technical details. The present research study is focused on the application and automation of the spectral fatigue analysis procedure for ship structures using ANSYS software with the 3D linear seakeeping code AQWA. ANSYS Parametric Design Language (APDL) macros are created and subsequently implemented to automate the workflow of the simulation process, reducing the time spent on non-value-added repetitive activity. A MATLAB program based on the direct calculation procedure of spectral fatigue is developed to calculate total fatigue damage. The automation procedure is employed to predict the fatigue life of a ship structural detail using wave scatter data for the North Atlantic and worldwide trade. The current work will provide a system for efficient implementation of the stochastic spectral fatigue analysis procedure for ship structures.
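
    The damage summation such a procedure ultimately evaluates can be sketched compactly, assuming a narrow-band response, Rayleigh-distributed stress ranges, an S-N curve S^k * N = A and Miner's rule (a simplification of the full stochastic procedure; all numbers below are illustrative, not the paper's):

      # Narrow-band spectral fatigue damage summed over sea states.
      import math

      A, k = 1.0e12, 3.0                     # assumed S-N curve parameters
      T_design = 20 * 365.25 * 24 * 3600     # 20-year exposure, seconds

      # (m0 [MPa^2], m2 [MPa^2/s^2], probability) per sea state -- illustrative
      sea_states = [(25.0, 4.0, 0.6), (100.0, 9.0, 0.3), (400.0, 16.0, 0.1)]

      damage = 0.0
      for m0, m2, p in sea_states:
          nu0 = math.sqrt(m2 / m0) / (2 * math.pi)   # zero up-crossing rate
          # Rayleigh stress ranges: E[S^k] = (2*sqrt(2*m0))^k * Gamma(1 + k/2)
          damage += (p * nu0 * T_design / A
                     * (2 * math.sqrt(2 * m0))**k * math.gamma(1 + k / 2))

      print(f"total Miner damage: {damage:.3f} "
            f"(life = {T_design / damage / 3.15576e7:.1f} years)")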

  16. RootGraph: a graphic optimization tool for automated image analysis of plant roots.

    Science.gov (United States)

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J

    2015-11-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process, is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions.

  17. Automated DNA extraction of single dog hairs without roots for mitochondrial DNA analysis.

    Science.gov (United States)

    Bekaert, Bram; Larmuseau, Maarten H D; Vanhove, Maarten P M; Opdekamp, Anouschka; Decorte, Ronny

    2012-03-01

    Dogs are intensely integrated in human social life and their shed hairs can play a major role in forensic investigations. The overall aim of this study was to validate a semi-automated extraction method for mitochondrial DNA analysis of telogenic dog hairs. Extracted DNA was amplified with a 95% success rate from 43 samples using two new experimental designs in which the mitochondrial control region was amplified as a single large (± 1260 bp) amplicon or as two individual amplicons (HV1 and HV2; ± 650 and 350 bp) with tailed-primers. The results prove that the extraction of dog hair mitochondrial DNA can easily be automated to provide sufficient DNA yield for the amplification of a forensically useful long mitochondrial DNA fragment or alternatively two short fragments with minimal loss of sequence in case of degraded samples.

  18. Automated IR determination of petroleum products in water based on sequential injection analysis.

    Science.gov (United States)

    Falkova, Marina; Vakh, Christina; Shishov, Andrey; Zubakina, Ekaterina; Moskvin, Aleksey; Moskvin, Leonid; Bulatov, Andrey

    2016-02-01

    A simple and easily performed automated method for the IR determination of petroleum products (PP) in water using extraction-chromatographic cartridges has been developed. The method comprises two stages: on-site extraction of PP during sampling using extraction-chromatographic cartridges, and subsequent determination of the extracted PP by sequential injection analysis (SIA) with IR detection. The appropriate experimental conditions for extraction of PP dissolved in water and for the automated SIA procedure were investigated. The calibration plot constructed using the developed procedure was linear in the range of 3-200 μg L(-1). The limit of detection (LOD), calculated from a blank test based on 3σ, was 1 μg L(-1). The sample volume was 1 L. The system throughput was found to be 12 h(-1). PMID:26653498
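
    The reported figures follow standard calibration arithmetic: a least-squares line over the 3-200 μg L(-1) standards and LOD = 3σ(blank)/slope. A minimal sketch with invented signal values:

      # Linear calibration fit and 3-sigma limit of detection.
      import numpy as np

      conc   = np.array([3, 10, 50, 100, 200], dtype=float)   # ug/L standards
      signal = np.array([0.9, 3.1, 15.2, 30.5, 60.4])         # IR response (assumed)

      slope, intercept = np.polyfit(conc, signal, 1)
      sigma_blank = 0.10            # std dev of repeated blank measurements (assumed)
      lod = 3 * sigma_blank / slope

      print(f"slope = {slope:.4f}, LOD = {lod:.2f} ug/L")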

  19. Automated Bifurcation Analysis for Nonlinear Elliptic Partial Difference Equations on Graphs

    CERN Document Server

    Neuberger, John M; Swift, James W

    2010-01-01

    We seek solutions $u \in \mathbb{R}^n$ to the semilinear elliptic partial difference equation $-Lu + f_s(u) = 0$, where $L$ is the matrix corresponding to the Laplacian operator on a graph $G$ and $f_s$ is a one-parameter family of nonlinear functions. This article combines the ideas introduced by the authors in two papers: a) "Nonlinear Elliptic Partial Difference Equations on Graphs" (J. Experimental Mathematics, 2006), which introduces analytical and numerical techniques for solving such equations, and b) "Symmetry and Automated Branch Following for a Semilinear Elliptic PDE on a Fractal Region", wherein we present some of our recent advances concerning symmetry, bifurcation, and automation. We apply the symmetry analysis found in the SIAM paper to arbitrary graphs in order to obtain better initial guesses for Newton's method, create informative graphics, and explore the underlying variational structure. We use two modified implementations of the gradient Newton-Galerkin algorithm (GNGA, Neuberger and Swift) ...
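
    A minimal sketch of the Newton iteration such solvers are built around, for $-Lu + f_s(u) = 0$ on a 4-cycle graph with the common choice $f_s(u) = su + u^3$; the graph, parameter value and symmetry-informed initial guess are assumptions, and this is not the authors' GNGA code.

      # Newton's method for -L u + f_s(u) = 0 on a small graph.
      import numpy as np

      # Laplacian of the 4-cycle graph
      A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
      L = np.diag(A.sum(axis=1)) - A

      s = -2.0
      f  = lambda u: s * u + u**3
      fp = lambda u: s + 3 * u**2              # derivative, for the Jacobian

      u = np.array([2.0, -2.0, 2.0, -2.0])     # symmetry-informed initial guess
      for _ in range(20):
          F = -L @ u + f(u)
          J = -L + np.diag(fp(u))
          step = np.linalg.solve(J, F)
          u -= step
          if np.linalg.norm(step) < 1e-12:
              break

      # converges to the antisymmetric solution with |u_i| = sqrt(4 - s)
      print("u =", np.round(u, 6), " residual =", np.linalg.norm(-L @ u + f(u)))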

  20. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The algorithms are designed as a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total costs of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. This new formulation has more variables and constraints, but a much tighter linear relaxation than the original, and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
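
    The shortest-path view that motivates the reformulation is easiest to see in the uncapacitated single-item case: node t is "start of period t with no stock", and an arc (t, j) is one replenishment in period t covering demands through period j-1. A dynamic-programming sketch with invented data (a simplification of the capacitated multi-echelon model in the paper):

      # Wagner-Whitin style shortest path for single-item lot sizing.
      import math

      d = [40, 30, 50, 20]      # cash demand per period (illustrative)
      K, h = 100.0, 0.5         # fixed cost per replenishment, holding cost/unit/period
      n = len(d)

      def arc_cost(t, j):
          # replenish in period t to serve periods t..j-1
          return K + sum(h * (k - t) * d[k] for k in range(t, j))

      best = [math.inf] * (n + 1)   # shortest-path labels
      best[0] = 0.0
      for t in range(n):
          for j in range(t + 1, n + 1):
              best[j] = min(best[j], best[t] + arc_cost(t, j))

      print(f"minimum total cost: {best[n]:.1f}")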

  1. Micro-force compensation in automated micro-object positioning using adaptive neural networks

    Science.gov (United States)

    Shahini, M.; Melek, W. W.; Yeow, J. T. W.

    2009-09-01

    This paper proposes a novel approach for controlled pushing of a micro-sized object along a desired path. Challenges associated with this control task due to the presence of dominating micro-forces are carefully studied and a solution based on the application of artificial neural networks is introduced. A nonlinear controller is proposed for controlled pushing of micro-objects which guarantees the stability of the closed-loop system in the Lyapunov sense. An experimental setup is designed to validate the performance of the proposed controller. Results suggest that artificial neural networks present a promising tool for the design of adaptive controllers to accurately manipulate objects at the microscopic scale.

  2. An Automated Planning Model for RoF Heterogeneous Wireless Networks

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Bergheim, Hans; Ragnarsson, Ólafur;

    2010-01-01

    The number of users in wireless WANs is increasing like never before, and at the same time users' bandwidth demands are growing. The structure of third-generation wireless WANs makes it expensive for wireless ISPs to meet these demands. The FUTON architecture is a RoF heterogeneous wireless...... network architecture under development that will be cheaper to deploy and operate. This paper shows a method to plan an implementation of this architecture. The planning is done as automatically as possible, covering radio planning, fiber planning and network dimensioning. The outcome of the paper is a planning...

  3. NAPS: Network Analysis of Protein Structures.

    Science.gov (United States)

    Chakrabarty, Broto; Parekh, Nita

    2016-07-01

    Traditionally, protein structures have been analysed by their secondary structure architecture and fold arrangement. An alternative approach that has shown promise is modelling proteins as a network of non-covalent interactions between amino acid residues. The network representation of proteins provides a systems approach to topological analysis of complex three-dimensional structures, irrespective of secondary structure and fold type, and provides insights into the structure-function relationship. We have developed a web server for network-based analysis of protein structures, NAPS, that facilitates quantitative and qualitative (visual) analysis of residue-residue interactions in single chains, protein complexes, modelled protein structures and trajectories (e.g., from molecular dynamics simulations). The user can specify the atom type for network construction, the distance range (in Å) and the minimal amino acid separation along the sequence. NAPS allows users to select node(s) and their neighbourhood based on centrality measures, physicochemical properties of amino acids or clusters of well-connected residues (k-cliques) for further analysis. Visual analysis of interacting domains and protein chains, and shortest path lengths between pairs of residues, are additional features that aid in functional analysis. NAPS supports various analyses and visualization views for identifying functional residues, providing insight into mechanisms of protein folding, domain-domain and protein-protein interactions for understanding communication within and between proteins. URL: http://bioinf.iiit.ac.in/NAPS/. PMID:27151201
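
    The underlying construction is straightforward to reproduce: connect residues whose chosen atoms fall within a distance cutoff, then rank nodes by a centrality measure. The sketch below (toy coordinates and cutoff, not the NAPS server code) uses networkx:

      # Residue interaction network from C-alpha distances, plus betweenness.
      import numpy as np
      import networkx as nx

      # toy C-alpha coordinates for 6 "residues"
      coords = np.array([[0.0, 0, 0], [3.8, 0, 0], [7.6, 0, 0],
                         [7.6, 3.8, 0], [3.8, 3.8, 0], [0.0, 3.8, 0]])
      cutoff = 5.0   # Angstrom upper distance threshold (assumed)

      G = nx.Graph()
      G.add_nodes_from(range(len(coords)))
      for i in range(len(coords)):
          for j in range(i + 1, len(coords)):
              if np.linalg.norm(coords[i] - coords[j]) <= cutoff:
                  G.add_edge(i, j)

      for node, c in sorted(nx.betweenness_centrality(G).items(),
                            key=lambda kv: -kv[1]):
          print(f"residue {node}: betweenness = {c:.3f}")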

  4. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  5. Social network analysis of study environment

    Directory of Open Access Journals (Sweden)

    Blaženka Divjak

    2010-06-01

    Full Text Available The student working environment influences student learning and achievement level. In this respect, the social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out if students' academic performance influences their position in different student social networks. Further, there is a need to identify other predictors of this position. In the process of problem solving we use Social Network Analysis (SNA), based on data we collected from the students at the Faculty of Organization and Informatics, University of Zagreb. There are two data samples: the basic sample (N=27) and the extended sample (N=52). We collected data on socio-demographic position, academic performance, learning and motivation styles, student status (full-time/part-time), and attitudes towards individual and team work as well as informal cooperation. Afterwards, five different networks (exchange of learning materials, teamwork, informal communication, basic and aggregated social network) were constructed. These networks were analyzed with different metrics, the most important being betweenness, closeness and degree centrality. The main results are, firstly, that the position in a social network cannot be forecast by academic success alone and, secondly, that part-time students tend to form separate groups that are poorly connected with full-time students. In general, the position of a student in the social networks of the study environment can influence student learning as well as her/his future employability, and is therefore worth investigating.

  6. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.;

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as wel...

  7. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.;

    1998-01-01

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as wel...

  8. AGAPE (Automated Genome Analysis PipelinE) for pan-genome analysis of Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Giltae Song

    Full Text Available The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we newly sequenced or re-sequenced the genomes of 25 strains that are commonly used in the yeast research community using advanced sequencing technology at high quality. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community.

  9. AGAPE (Automated Genome Analysis PipelinE) for pan-genome analysis of Saccharomyces cerevisiae.

    Science.gov (United States)

    Song, Giltae; Dickins, Benjamin J A; Demeter, Janos; Engel, Stacia; Gallagher, Jennifer; Choe, Kisurb; Dunn, Barbara; Snyder, Michael; Cherry, J Michael

    2015-01-01

    The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we newly sequenced or re-sequenced the genomes of 25 strains that are commonly used in the yeast research community using advanced sequencing technology at high quality. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community.

  10. Network analysis of eight industrial symbiosis systems

    Science.gov (United States)

    Zhang, Yan; Zheng, Hongmei; Shi, Han; Yu, Xiangyi; Liu, Gengyuan; Su, Meirong; Li, Yating; Chai, Yingying

    2016-06-01

    Industrial symbiosis is the quintessential characteristic of an eco-industrial park. To divide parks into different types, previous studies mostly focused on qualitative judgments, and failed to use metrics to conduct quantitative research on the internal structural or functional characteristics of a park. To analyze a park's structural attributes, a range of metrics from network analysis have been applied, but few researchers have compared two or more symbioses using multiple metrics. In this study, we used two metrics (density and network degree centralization) to compare the degrees of completeness and dependence of eight diverse but representative industrial symbiosis networks. Through the combination of the two metrics, we divided the networks into three types: weak completeness, and two forms of strong completeness, namely "anchor tenant" mutualism and "equality-oriented" mutualism. The results showed that the networks with a weak degree of completeness were sparse and had few connections among nodes; for "anchor tenant" mutualism, the degree of completeness was relatively high, but the affiliated members were too dependent on core members; and the members in "equality-oriented" mutualism had equal roles, with diverse and flexible symbiotic paths. These results revealed some of the systems' internal structure and how different structures influenced the exchanges of materials, energy, and knowledge among members of a system, thereby providing insights into threats that may destabilize the network. Based on this analysis, we provide examples of the advantages and effectiveness of recent improvement projects in a typical Chinese eco-industrial park (Shandong Lubei).
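
    Both metrics are easy to compute on an exchange network. The sketch below builds a toy "anchor tenant" style symbiosis network and reports density, 2m/(n(n-1)), and Freeman degree centralization, sum(c_max - c_i)/((n-1)(n-2)); the firms and links are invented, not taken from the eight studied parks.

      # Density and degree centralization of a toy symbiosis network.
      import networkx as nx

      G = nx.Graph([("power_plant", "cement"), ("power_plant", "gypsum"),
                    ("power_plant", "fish_farm"), ("refinery", "power_plant"),
                    ("refinery", "plasterboard")])

      n = G.number_of_nodes()
      density = nx.density(G)                       # 2m / (n(n-1))

      degrees = dict(G.degree())
      c_max = max(degrees.values())
      centralization = sum(c_max - c for c in degrees.values()) / ((n - 1) * (n - 2))

      # high centralization + moderate density signals an "anchor tenant" structure
      print(f"density = {density:.2f}, degree centralization = {centralization:.2f}")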

  11. Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems

    Science.gov (United States)

    Vakil, Sanjay S.; Hansman, R. John

    2000-01-01

    Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. Three solution
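
    Cyclomatic complexity counts linearly independent paths through a graph as V(G) = E - N + 2P, with E edges, N nodes and P connected components. Applied below to a toy mode-transition graph; the mode names are illustrative stand-ins, not the paper's HAR models.

      # Cyclomatic complexity of a small autoflight mode-transition graph.
      import networkx as nx

      transitions = [("ALT_HOLD", "VS"), ("VS", "ALT_CAPTURE"),
                     ("ALT_CAPTURE", "ALT_HOLD"), ("VS", "FLCH"),
                     ("FLCH", "ALT_CAPTURE"), ("ALT_HOLD", "FLCH")]

      G = nx.DiGraph(transitions)
      E, N = G.number_of_edges(), G.number_of_nodes()
      P = nx.number_weakly_connected_components(G)

      print(f"V(G) = E - N + 2P = {E} - {N} + {2 * P} = {E - N + 2 * P}")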

  12. Reproducibility of In Vivo Corneal Confocal Microscopy Using an Automated Analysis Program for Detection of Diabetic Sensorimotor Polyneuropathy.

    Directory of Open Access Journals (Sweden)

    Ilia Ostrovski

    Full Text Available In vivo Corneal Confocal Microscopy (IVCCM) is a validated, non-invasive test for diabetic sensorimotor polyneuropathy (DSP) detection, but its utility is limited by the image analysis time and expertise required. We aimed to determine the inter- and intra-observer reproducibility of a novel automated analysis program compared to manual analysis. In a cross-sectional diagnostic study, 20 non-diabetes controls (mean age 41.4±17.3y, HbA1c 5.5±0.4%) and 26 participants with type 1 diabetes (42.8±16.9y, 8.0±1.9%) underwent two separate IVCCM examinations by one observer and a third by an independent observer. Along with nerve density and branch density, corneal nerve fibre length (CNFL) was obtained by manual analysis (CNFLManual), a protocol in which images were manually selected for automated analysis (CNFLSemi-Automated), and one in which selection and analysis were performed electronically (CNFLFully-Automated). Reproducibility of each protocol was determined using intraclass correlation coefficients (ICC) and, as a secondary objective, the method of Bland and Altman was used to explore agreement between protocols. Mean CNFLManual was 16.7±4.0 and 13.9±4.2 mm/mm2 for non-diabetes controls and diabetes participants, while CNFLSemi-Automated was 10.2±3.3 and 8.6±3.0 mm/mm2, and CNFLFully-Automated was 12.5±2.8 and 10.9±2.9 mm/mm2. Inter-observer ICC and 95% confidence intervals (95%CI) were 0.73 (0.56, 0.84), 0.75 (0.59, 0.85), and 0.78 (0.63, 0.87), respectively (p = NS for all comparisons). Intra-observer ICC and 95%CI were 0.72 (0.55, 0.83), 0.74 (0.57, 0.85), and 0.84 (0.73, 0.91), respectively (p<0.05 for CNFLFully-Automated compared to the others). The other IVCCM parameters had substantially lower ICC compared to those for CNFL. CNFLSemi-Automated and CNFLFully-Automated underestimated CNFLManual by a mean (95%CI) of 35.1% (-4.5, 67.5) and 21.0% (-21.6, 46.1), respectively. Despite an apparent measurement (underestimation) bias in comparison to the manual strategy of image

  13. Design of Intelligent Network Performance Analysis Forecast Support System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A system designed to support network performance analysis and forecasting is presented, based on the combination of offline network analysis and online real-time performance forecasting. The offline analysis performs analysis of specific network node performance, correlation analysis of the performance of related network nodes, and evolutionary mathematical modeling of long-term network performance measurements. The online real-time network performance forecast is based on a so-called hybrid prediction modeling approach for short-term network performance prediction and trend analysis. Based on the module design, the proposed system has good intelligence, scalability and self-adaptability, which offer highly effective network performance analysis and forecast tools for network managers; it is an ideal support platform for network performance analysis and forecasting.

  14. Complex networks analysis of language complexity

    CERN Document Server

    Amancio, Diego R; Oliveira, Osvaldo N; Costa, Luciano da F; 10.1209/0295-5075/100/58002

    2013-01-01

    Methods from statistical physics, such as those involving complex networks, have been increasingly used in quantitative analysis of linguistic phenomena. In this paper, we represented pieces of text with different levels of simplification in co-occurrence networks and found that topological regularity correlated negatively with textual complexity. Furthermore, in less complex texts the distance between concepts, represented as nodes, tended to decrease. The complex networks metrics were treated with multivariate pattern recognition techniques, which allowed us to distinguish between original texts and their simplified versions. For each original text, two simplified versions were generated manually with increasing number of simplification operations. As expected, distinction was easier for the strongly simplified versions, where the most relevant metrics were node strength, shortest paths and diversity. Also, the discrimination of complex texts was improved with higher hierarchical network metrics, thus point...

  15. Automated kidney morphology measurements from ultrasound images using texture and edge analysis

    Science.gov (United States)

    Ravishankar, Hariharan; Annangi, Pavan; Washburn, Michael; Lanning, Justin

    2016-04-01

    In a typical ultrasound scan, a sonographer measures kidney morphology to assess renal abnormalities. Kidney morphology can also help to discriminate between chronic and acute kidney failure. The caliper placements and volume measurements are often time-consuming, and an automated solution will help to improve accuracy, repeatability and throughput. In this work, we developed an automated kidney morphology measurement solution from long-axis ultrasound scans. Automated kidney segmentation is challenging due to wide variability in kidney shape and size, weak contrast of the kidney boundaries, and the presence of strong edges like the diaphragm and fat layers. To address these challenges and accurately localize and detect kidney regions, we present a two-step algorithm that makes use of edge and texture information in combination with anatomical cues. First, we use an edge analysis technique to localize the kidney region by matching the edge map with predefined templates. To accurately estimate the kidney morphology, we use textural information in a machine learning framework using Haar features and a gradient boosting classifier. We have tested the algorithm on 45 unseen cases, and the performance against ground truth is measured by computing the Dice overlap and the % error in the major and minor axes of the kidney. The algorithm shows successful performance on 80% of cases.

  16. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    Science.gov (United States)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  17. Differential network analysis in human cancer research.

    Science.gov (United States)

    Gill, Ryan; Datta, Somnath; Datta, Susmita

    2014-01-01

    A complex disease like cancer is hardly caused by one gene or one protein singly. It is usually caused by the perturbation of the network formed by several genes or proteins. In the last decade several research teams have attempted to construct interaction maps of genes and proteins either experimentally or reverse engineer interaction maps using computational techniques. These networks were usually created under a certain condition such as an environmental condition, a particular disease, or a specific tissue type. Lately, however, there has been greater emphasis on finding the differential structure of the existing network topology under a novel condition or disease status to elucidate the perturbation in a biological system. In this review/tutorial article we briefly mention some of the research done in this area; we mainly illustrate the computational/statistical methods developed by our team in recent years for differential network analysis using publicly available gene expression data collected from a well known cancer study. This data includes a group of patients with acute lymphoblastic leukemia and a group with acute myeloid leukemia. In particular, we describe the statistical tests to detect the change in the network topology based on connectivity scores which measure the association or interaction between pairs of genes. The tests under various scores are applied to this data set to perform a differential network analysis on gene expression for human leukemia. We believe that, in the future, differential network analysis will be a standard way to view the changes in gene expression and protein expression data globally and these types of tests could be useful in analyzing the complex differential signatures.

  18. Phylodynamic analysis of a viral infection network

    Directory of Open Access Journals (Sweden)

    Teiichiro eShiino

    2012-07-01

    Full Text Available Viral infections by sexual and droplet transmission routes typically spread through a complex host-to-host contact network. Clarifying the transmission network and the epidemiological parameters affecting the variations and dynamics of a specific pathogen is a major issue in the control of infectious diseases. However, conventional methods such as interviews and/or classical phylogenetic analysis of viral gene sequences have inherent limitations and often fail to detect infectious clusters and transmission connections. Recent improvements in computational environments now permit the analysis of large datasets. In addition, novel analytical methods have been developed to infer the evolutionary dynamics of virus genetic diversity using sample date information and sequence data. This type of framework, termed phylodynamics, helps connect some of the missing links in viral transmission networks, which are often hard to detect by conventional methods of epidemiology. With a sufficient number of sequences available, one can use this new inference method to estimate theoretical epidemiological parameters such as temporal distributions of the primary infection, fluctuation of the pathogen population size, the basic reproductive number, and the mean time span of disease infectiousness. Transmission networks estimated by this framework often have the properties of a scale-free network, which are characteristic of infectious and social communication processes. Network analysis based on phylodynamics has yielded various suggestions concerning the infection dynamics associated with a given community and/or risk behavior. In this review, I will summarize the current methods available for identifying the transmission network using phylogeny, and present an argument on the possibilities of applying the scale-free properties to these existing frameworks.

  19. Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.

    Science.gov (United States)

    Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    2016-09-01

    Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic interaction liquid chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples was achieved in less than 5 h, and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated and HT glycan preparation and permethylation proved to be convenient, fast, and reliable, and can be applied to drug glycan profiling and clinical glycan biomarker studies. PMID:27479043

  20. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)
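
    Vibration-based tension estimation commonly rests on the taut-string relation f_n = (n/2L) * sqrt(T/m), inverted to T = 4 m L^2 (f_n/n)^2 once a natural frequency f_n is picked from the measured acceleration spectrum. A sketch with illustrative cable properties, not the Jindo Bridge values:

      # Taut-string cable tension from an identified natural frequency.
      def cable_tension(f_n, n, length_m, mass_per_m):
          """Tension in newtons from the n-th natural frequency (taut string)."""
          return 4.0 * mass_per_m * length_m**2 * (f_n / n)**2

      L_cable = 120.0   # cable length, m (assumed)
      m_cable = 80.0    # mass per unit length, kg/m (assumed)
      f1 = 1.15         # first natural frequency from peak picking, Hz (assumed)

      T = cable_tension(f1, 1, L_cable, m_cable)
      print(f"estimated tension: {T / 1e3:.0f} kN")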

  1. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.

  2. Automated Wildlife Monitoring Using Self-Configuring Sensor Networks Deployed in Natural Habitats

    OpenAIRE

    Trifa, Vlad; Girod, Lewis; Travis C. Collier; Blumstein, Daniel; Taylor, C E

    2007-01-01

    To understand the complex interactions among animals within an ecosystem, biologists need to be able to track their location and social interactions. There are a variety of factors that make this difficult. We propose using adaptive, embedded networked sensing technologies to develop an efficient means for wildlife monitoring. This paper surveys our research; we demonstrate how a self-organizing system can efficiently conduct real-time acoustic source detection and localization using distribu...

  3. A Survey of Communications and Networking Technologies for Energy Management in Buildings and Home Automation

    OpenAIRE

    Aravind Kailas; Valentina Cecchi; Arindam Mukherjee

    2012-01-01

    With the exploding power consumption in private households and increasing environmental and regulatory restraints, the need to improve the overall efficiency of electrical networks has never been greater. That being said, the most efficient way to minimize the power consumption is by voluntary mitigation of home electric energy consumption, based on energy-awareness and automatic or manual reduction of standby power of idling home appliances. Deploying bi-directional smart meters and home ene...

  4. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  5. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area Az under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal number of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost surface
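
    The annealing search itself is compact. The sketch below runs simulated annealing with a geometric cooling schedule over a small discrete architecture space; the space is smaller than the study's 432 architectures, and the cost function is a smooth stand-in for 1 - Az, since the true cost requires training each CNN.

      # Simulated annealing over a discrete CNN-architecture space.
      import math, random

      random.seed(0)
      space = [(g1, g2, k1, k2) for g1 in (2, 4, 6, 8) for g2 in (2, 4, 6)
               for k1 in (3, 5, 7) for k2 in (3, 5, 7)]

      def cost(arch):   # illustrative stand-in for 1 - Az
          g1, g2, k1, k2 = arch
          return 0.05 + 0.001 * ((g1 - 6)**2 + (g2 - 4)**2
                                 + (k1 - 5)**2 + (k2 - 5)**2)

      current = random.choice(space)
      best, T = current, 1.0
      for step in range(500):
          candidate = random.choice(space)          # propose a neighbour
          delta = cost(candidate) - cost(current)
          if delta < 0 or random.random() < math.exp(-delta / T):
              current = candidate                   # accept move
          if cost(current) < cost(best):
              best = current
          T *= 0.99                                 # geometric cooling schedule

      print("best (g1, g2, k1, k2):", best, " cost:", round(cost(best), 4))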

  6. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  7. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks in order to achieve effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system's behaviour based on them.

  8. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    Full Text Available One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
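
    As a rough illustration of this kind of diversity analysis, the sketch below estimates the ASER of BPSK over a Rician fading channel by Monte Carlo simulation. It is a simplified stand-in under stated assumptions, not the paper's derivation: shadowing of the line-of-sight component is omitted, so this is plain (unshadowed) Rician fading with factor K.

        import numpy as np
        from scipy.special import erfc

        def bpsk_aser_rician(snr_db, K, n=200_000, seed=0):
            """Monte Carlo ASER of BPSK over Rician fading (shadowing omitted)."""
            rng = np.random.default_rng(seed)
            snr = 10.0 ** (snr_db / 10.0)
            # Channel gain: deterministic LOS part plus scattered Rayleigh part,
            # normalized so that E[|h|^2] = 1.
            los = np.sqrt(K / (K + 1.0))
            scatter = (rng.normal(size=n) + 1j * rng.normal(size=n)) \
                      / np.sqrt(2.0 * (K + 1.0))
            gain2 = np.abs(los + scatter) ** 2
            # Conditional BPSK error probability is Q(sqrt(2 * gain2 * snr)).
            return float(np.mean(0.5 * erfc(np.sqrt(gain2 * snr))))

        print(bpsk_aser_rician(snr_db=10.0, K=5.0))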

  9. Diversity Performance Analysis on Multiple HAP Networks.

    Science.gov (United States)

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  10. A method for the automated detection of phishing websites through both site characteristics and image analysis

    Science.gov (United States)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing such as resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
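
    The visual-comparison step can be sketched with a simple average hash. The paper does not specify which image hash it uses, so this particular choice, the file names, and the 5-bit threshold are all illustrative assumptions.

        from PIL import Image

        def average_hash(path, size=8):
            """64-bit perceptual hash: shrink to 8x8 grayscale, threshold at the mean."""
            pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (p > mean)
            return bits

        def hamming(h1, h2):
            """Number of differing bits between two hashes."""
            return bin(h1 ^ h2).count("1")

        # Flag a candidate page whose screenshot hashes within a few bits
        # of a known phishing page (threshold assumed, not from the paper).
        if hamming(average_hash("candidate.png"), average_hash("known_phish.png")) <= 5:
            print("visually similar - possible spoof")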

  11. Development and Applications of a Prototypic SCALE Control Module for Automated Burnup Credit Analysis

    International Nuclear Information System (INIS)

    Consideration of the depletion phenomena and isotopic uncertainties in burnup-credit criticality analysis places an increasing reliance on computational tools and significantly increases the overall complexity of the calculations. An automated analysis and data management capability is essential for practical implementation of large-scale burnup credit analyses that can be performed in a reasonable amount of time. STARBUCS is a new prototypic analysis sequence being developed for the SCALE code system to perform automated criticality calculations of spent fuel systems employing burnup credit. STARBUCS is designed to help analyze the dominant burnup credit phenomena including spatial burnup gradients and isotopic uncertainties. A search capability also allows STARBUCS to iterate to determine the spent fuel parameters (e.g., enrichment and burnup combinations) that result in a desired keff for a storage configuration. Although STARBUCS was developed to address the analysis needs for spent fuel transport and storage systems, it provides sufficient flexibility to allow virtually any configuration of spent fuel to be analyzed, such as storage pools and reprocessing operations. STARBUCS has been used extensively at Oak Ridge National Laboratory (ORNL) to study burnup credit phenomena in support of the NRC Research program
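
    The keff search can be pictured as a one-dimensional root-finding problem. The bisection sketch below conveys only the general idea under stated assumptions, not STARBUCS itself: `keff_of` stands in for a full depletion-plus-criticality calculation, and the enrichment bounds and target value are illustrative.

        def search_enrichment(keff_of, target=0.94, lo=1.5, hi=5.0, tol=1e-4):
            """Bisect on initial enrichment (wt% U-235) until the computed
            keff of the storage configuration matches the target."""
            k_lo = keff_of(lo)
            assert (k_lo - target) * (keff_of(hi) - target) < 0, "target not bracketed"
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                k_mid = keff_of(mid)
                # Keep the half-interval that still brackets the target keff.
                if (k_mid - target) * (k_lo - target) > 0:
                    lo, k_lo = mid, k_mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)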

  12. Integrating automated structured analysis and design with Ada programming support environments

    Science.gov (United States)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also support the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  13. Pathway and network analysis of cancer genomes

    DEFF Research Database (Denmark)

    Creixell, Pau; Reimand, Jueri; Haider, Syed;

    2015-01-01

    Genomic information on tumors from 50 cancer types cataloged by the International Cancer Genome Consortium (ICGC) shows that only a few well-studied driver genes are frequently mutated, in contrast to many infrequently mutated genes that may also contribute to tumor biology. Hence there has been...... large interest in developing pathway and network analysis methods that group genes and illuminate the processes involved. We provide an overview of these analysis techniques and show where they guide mechanistic and translational investigations....

  14. Analysis of SSR Using Artificial Neural Networks

    OpenAIRE

    Nagabhushana, BS; Chandrasekharaiah, HS

    1996-01-01

    Artificial neural networks (ANNs) are being advantageously applied to power system analysis problems. They possess the ability to establish complicated input-output mappings through a learning process, without any explicit programming. In this paper, an ANN based method for subsynchronous resonance (SSR) analysis is presented. The designed ANN outputs a measure of the possibility of the occurrence of SSR and is fully trained to accommodate the variations of power system parameters over the en...

  15. Presentation of automated scientific-technical library of institute of nuclear physics in the international nuclear library network

    International Nuclear Information System (INIS)

    Full text: In 2005, the International Nuclear Library Network (INLN) portal was established by the IAEA library and the Atomic Energy of Canada Limited (AECL) library. In the same year, on the initiative of the INIS Liaison Officer for Uzbekistan, the Scientific-Technical Library of INP AS RU was connected to the INLN, which gives INP AS RU specialists an opportunity to use the electronic resources of the IAEA and INLN member libraries. Members who join the INLN undertake to make their own electronic resources available for common use. The Scientific-Technical Library (STL) of INP, which has functioned since 1957, is one of the largest libraries in the scientific libraries network of AS RU, holding over 221215 units of scientific-technical literature. It holds 78528 periodical publications, of which 39500 are foreign. A peculiarity of the library is that it is the only centre in the region that collects IAEA publications, publications and preprints of the Joint Institute for Nuclear Research (Russia), and preprints of the largest international nuclear centres - CERN (Switzerland), the International Centre for Theoretical Physics (ICTP, Italy) and others. To present the list of nuclear information sources of our library to the INLN, we set the goal of automating the STL of INP and providing electronic mutual exchange with the INLN at the IAEA, alongside providing access to the INIS database and resources for Uzbekistan nuclear organizations under the IAEA Technical project UZB/0/003, Upgrading the National INIS centre of Uzbekistan. Within the frame of this project, the following activities were provided for the STL of INP: participation of three INP experts in the LIBCOM 2007 conference; two-week training courses on the IRBIS64 system for two INP experts at the Russian National Public Library for Science and Technology (GPNTB), Moscow; obtaining six modules of the IRBIS64 system software; and implementation of works at the STL of INP according to the modules of IRBIS64. For realization of the library automation works, including making

  16. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    Science.gov (United States)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge amounts of image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, which represent images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D confocal image. The CAD scheme was applied to each confocal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In four scanned specimen slides, the CAD scheme generated 1676 confocal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cytogeneticists in detecting cervical cancers.
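
    The spot-detection step can be illustrated with SciPy's white top-hat filter, which removes background structures larger than the structuring element and leaves small bright FISH signals. This is a generic sketch, not the paper's implementation; the spot size and the mean-plus-k-sigma threshold are assumed parameters.

        import numpy as np
        from scipy import ndimage

        def detect_fish_spots(channel, spot_size=5, k=4.0):
            """Enhance small bright signals with a white top-hat transform,
            then threshold and label the connected spots."""
            tophat = ndimage.white_tophat(channel, size=spot_size)
            threshold = tophat.mean() + k * tophat.std()
            labels, n_spots = ndimage.label(tophat > threshold)
            return labels, n_spots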

  17. Analysis and visualization of large complex attack graphs for networks security

    Science.gov (United States)

    Chen, Hongda; Chen, Genshe; Blasch, Erik; Kruger, Martin; Sityar, Irma

    2007-04-01

    In this paper, we have proposed a comprehensive and innovative approach for analysis and visualization of large complex multi-step cyber attack graphs. As an automated tool for cyber attack detection, prediction, and visualization, the newly proposed method transforms large quantities of network security data into real-time actionable intelligence, which can be used to (1) provide guidance on network hardening to prevent attacks, (2) perform real-time attack event correlation during active attacks, and (3) formulate post-attack responses. We show that it is possible to visualize the complex graphs, including all possible network attack paths while still keeping complexity manageable. The proposed analysis and visualization tool provides an efficient and effective solution for predicting potential attacks upon observed intrusion evidence, as well as interactive multi-resolution views such that an analyst can first obtain high-level overviews quickly, and then drill down to specific details.

  18. Networking automation of ECI’s G-204 electronic engineering laboratory

    Directory of Open Access Journals (Sweden)

    Hernán Paz Penagos

    2010-04-01

    Full Text Available Increased use (by students and teachers) of the “Escuela Colombiana de Ingeniería Julio Garavito” Electronic Engineering laboratories during the last year has congested access to these laboratories; the School’s Electronic Engineering (Ecitronica) programme’s applied Electronic studies’ centre research group thus proposed, designed and developed a research project taking advantage of G building’s electrical distribution to offer access facilities, laboratory equipment control, energy saving and improved service quality. The G-204 laboratory’s network system will have an access control subsystem with client–main computer architecture. The latter consists of a user, schedule, group and work-bank database; the user is connected from any computer (client) to the main computer through the Internet to reserve his/her turn at laboratory practice by selecting the schedule, group, work-bank, network type required (1Ф or 3Ф) and registering co-workers. Access to the G-204 laboratory on the day and time of practice is made by means of an intelligent card reader. Information of public interest produced and controlled by the main computer is displayed on three LCD screens located on one of G building’s second floor walls, as is an electronic clock. The G-204 laboratory temperature and time are continually updated. Work-banks are enabled or disabled by the main computer; the work-banks are provided with power (beginning of practice) or disconnected (end of practice or due to eventualities) to protect the equipment, save energy, facilitate monitors and supervise the logistics of the state of the equipment at the end of each practice. The research group was organised into Transmission Line and Applications sub-groups. Power Line Communications (PLC) technology was used to explore digital modulation alternatives, coding and detecting errors, coupling, data transmission protocols and new applications, all based on channel estimation (networking

  19. An Automated Video Object Extraction System Based on Spatiotemporal Independent Component Analysis and Multiscale Segmentation

    Directory of Open Access Journals (Sweden)

    Zhang Xiao-Ping

    2006-01-01

    Full Text Available Video content analysis is essential for efficient and intelligent utilization of vast multimedia databases over the Internet. In video sequences, object-based extraction techniques are important for content-based video processing in many applications. In this paper, a novel technique is developed to extract objects from video sequences based on spatiotemporal independent component analysis (stICA) and multiscale analysis. The stICA is used to extract the preliminary source images containing moving objects in video sequences. The source image data obtained after stICA analysis are further processed using wavelet-based multiscale image segmentation and region detection techniques to improve the accuracy of the extracted object. An automated video object extraction system is developed based on these new techniques. Preliminary results demonstrate great potential for the new stICA and multiscale-segmentation-based object extraction system in content-based video processing applications.
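
    The unmixing idea can be sketched with scikit-learn's FastICA applied to flattened video frames. Plain spatial ICA here stands in for the paper's spatiotemporal variant, and the component count is an assumption.

        import numpy as np
        from sklearn.decomposition import FastICA

        def ica_source_images(frames, n_sources=4):
            """Unmix a video block into statistically independent source images.
            `frames` has shape (n_frames, height, width)."""
            n, h, w = frames.shape
            X = frames.reshape(n, h * w).astype(float)  # one row per frame
            ica = FastICA(n_components=n_sources, random_state=0)
            ica.fit(X)
            # Each unmixed component is a candidate source image containing
            # a moving object; reshape back to image dimensions.
            return ica.components_.reshape(n_sources, h, w)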

  20. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as the number of active nodes, average degree, clustering coefficient, etc., and apply our prediction framework on them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
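
    The forecasting step maps directly onto a standard ARIMA workflow. The sketch below, with statsmodels, an assumed (2,1,2) order and toy data, shows the shape of the computation rather than the paper's fitted models.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def forecast_property(series, order=(2, 1, 2), horizon=5):
            """Fit ARIMA to one network-property time series (e.g. the average
            degree per snapshot) and forecast the next `horizon` snapshots."""
            fit = ARIMA(np.asarray(series, dtype=float), order=order).fit()
            return fit.forecast(steps=horizon)

        history = [3.1, 3.4, 3.2, 3.8, 4.0, 4.1, 3.9, 4.3, 4.5, 4.4]  # toy data
        print(forecast_property(history))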

  1. Bacterial growth on surfaces: Automated image analysis for quantification of growth rate-related parameters

    DEFF Research Database (Denmark)

    Møller, S.; Sternberg, Claus; Poulsen, L. K.;

    1995-01-01

    species-specific hybridizations with fluorescence-labelled ribosomal probes to estimate the single-cell concentration of RNA. By automated analysis of digitized images of stained cells, we determined four independent growth rate-related parameters: cellular RNA and DNA contents, cell volume......, and the frequency of dividing cells in a cell population. These parameters were used to compare physiological states of liquid-suspended and surface-growing Pseudomonas putida KT2442 in chemostat cultures. The major finding is that the correlation between substrate availability and cellular growth rate found...

  2. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

    Full Text Available The World Wide Web has become an enormously big repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern WEB 2.0 web pages, where the HTML tree is often dynamically modified by various JavaScript codes, new data are added by asynchronous requests to the web server and elements are positioned with the help of cascading style sheets, is a difficult task. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian

  3. Automated Static Culture System Cell Module Mixing Protocol and Computational Fluid Dynamics Analysis

    Science.gov (United States)

    Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J.

    2004-01-01

    This report is a documentation of a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate for the mixing of a patch of high oxygen content media into the surrounding media which is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and mechanical shear levels generated are used to characterize the mixing process for different parameter values.

  4. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    management techniques and a vast array of computer-aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying......Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields improved the design

  5. NOA: a cytoscape plugin for network ontology analysis

    OpenAIRE

    Zhang, Chao; Wang, Jiguang; Hanspers, Kristina; Xu, Dong; Chen, Luonan; Alexander R. Pico

    2013-01-01

    Summary: The Network Ontology Analysis (NOA) plugin for Cytoscape implements the NOA algorithm for network-based enrichment analysis, which extends Gene Ontology annotations to network links, or edges. The plugin facilitates the annotation and analysis of one or more networks in Cytoscape according to user-defined parameters. In addition to tables, the NOA plugin also presents results in the form of heatmaps and overview networks in Cytoscape, which can be exported for publication figures. Av...

  6. Capacity analysis of vehicular communication networks

    CERN Document Server

    Lu, Ning

    2013-01-01

    This SpringerBrief focuses on the network capacity analysis of VANETs, a key topic as fundamental guidance on design and deployment of VANETs is very limited. Moreover, unique characteristics of VANETs impose distinguished challenges on such an investigation. This SpringerBrief first introduces capacity scaling laws for wireless networks and briefly reviews the prior arts in deriving the capacity of VANETs. It then studies the unicast capacity considering the socialized mobility model of VANETs. With vehicles communicating based on a two-hop relaying scheme, the unicast capacity bound is deriv

  7. Historical Network Analysis of the Web

    DEFF Research Database (Denmark)

    Brügger, Niels

    2013-01-01

    of the online web has for a number of years gained currency within Internet studies. However, the combination of these two phenomena—historical network analysis of material in web archives—can at best be characterized as an emerging new area of study. Most of the methodological challenges within this new area...... at the Danish parliamentary elections in 2011, 2007, and 2001. As the Internet grows older historical studies of networks on the web will probably become more widespread and therefore it may be about time to begin debating the methodological challenges within this emerging field....

  8. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To the best of our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  9. VAPI: low-cost, rapid automated visual inspection system for Petri plate analysis

    Science.gov (United States)

    Chatburn, L. T.; Kirkup, B. C.; Polz, M. F.

    2007-09-01

    Most culture-based microbiology tasks utilize a Petri plate during processing, but rarely do the scientists capture the full information available from the plate. In particular, visual analysis of plates is an under-developed rich source of data that can be rapid and non-invasive. However, collecting this data has been limited by the difficulties of standardizing and quantifying human observations, by the limits imposed by scientist fatigue, and by the cost of automating the process. The availability of specialized counting equipment and intelligent camera systems has not changed this - they are prohibitively expensive for many laboratories, only process a limited number of plate types, are often destructive to the sample, and have limited accuracy. This paper describes an automated visual inspection solution, VAPI, that employs inexpensive consumer computing hardware and digital cameras along with custom cross-platform open-source software written in C++, combining Trolltech's Qt GUI toolkit with Intel's OpenCV computer vision library. The system is more accurate than common commercial systems costing many times as much, while being flexible in use and offering comparable responsiveness. VAPI not only counts colonies but also sorts and enumerates colonies by morphology, tracks colony growth by time series analysis, and provides other analytical resources. Output to XML files or directly to a database provides data that can be easily maintained and manipulated by the end user, offering ready access for system enhancement, interaction with other software systems, and rapid development of advanced analysis applications.
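
    A bare-bones version of the counting stage, using the same OpenCV library the system is built on, might look as follows; the preprocessing choices and minimum-area filter are assumptions, not VAPI's actual pipeline.

        import cv2

        def count_colonies(path, min_area=15):
            """Count colonies on a plate photo: grayscale, blur, Otsu threshold,
            then connected components filtered by a minimum area."""
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            blur = cv2.GaussianBlur(gray, (5, 5), 0)
            _, binary = cv2.threshold(blur, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
            # Label 0 is the background; drop specks smaller than a colony.
            return sum(1 for i in range(1, n)
                       if stats[i, cv2.CC_STAT_AREA] >= min_area)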

  10. Automated static image analysis as a novel tool in describing the physical properties of dietary fiber

    Directory of Open Access Journals (Sweden)

    Marcin Andrzej KUREK

    2015-01-01

    Full Text Available The growing interest in the usage of dietary fiber in food has caused the need to provide precise tools for describing its physical properties. This research examined two dietary fibers from oats and beets, respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The most significant water holding capacity (7.00 g water/g solid) was achieved by the smaller sized oat fiber. Conversely, the water holding capacity was highest (4.20 g water/g solid) in larger sized beet fiber. There was evidence for water absorption increasing with a decrease in particle size with regard to the same fiber source. Very strong correlations were drawn between particle shape parameters, such as fiber length, straightness and width, and hydration properties measured conventionally. The regression analysis provided the opportunity to estimate whether the automated static image analysis method could be an efficient tool in describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model that was verified against conventional WHC measurement results.

  11. Simultaneous and automated monitoring of the multimetal biosorption processes by potentiometric sensor array and artificial neural network.

    Science.gov (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A

    2013-09-30

    In this communication, a new methodology for the simultaneous and automated monitoring of biosorption processes of multimetal mixtures of polluting heavy metals on vegetable wastes based on flow-injection potentiometry (FIP) and electronic tongue detection (ET) is presented. A fixed-bed column filled with grape stalks from wine industry wastes is used as the biosorption setup to remove the metal mixtures from the influent solution. The monitoring system consists of a computer-controlled FIP prototype with the ET based on an array of 9 flow-through ion-selective electrodes and electrodes with generic response to divalent ions placed in series, plus an artificial neural network response model. The cross-response to Cu(2+), Cd(2+), Zn(2+), Pb(2+) and Ca(2+) (as target ions) is used, and correct operation of the system is achieved only when dynamic treatment of the kinetic components of the transient signal is incorporated. For this purpose, the FIA peaks are transformed via a Fourier treatment, and selected coefficients are used to feed an artificial neural network response model. Real-time monitoring of different binary (Cu(2+)/Pb(2+)), (Cu(2+)/Zn(2+)) and ternary mixtures (Cu(2+)/Pb(2+)/Zn(2+)), (Cu(2+)/Zn(2+)/Cd(2+)), simultaneous to the release of Ca(2+) in the effluent solution, is achieved satisfactorily using the reported system, obtaining the corresponding breakthrough curves, and showing the ion-exchange mechanism among the different metals. Analytical performance is verified against conventional spectroscopic techniques, with good concordance of the obtained breakthrough curves and modeled adsorption parameters. PMID:23953435
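
    The Fourier-coefficient feature extraction can be sketched as follows. This is a schematic stand-in under stated assumptions, not the reported system: the number of retained coefficients is arbitrary, and a scikit-learn multilayer perceptron replaces the paper's ANN response model.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def fft_features(transient, n_coeff=16):
            """Compress one flow-injection transient into the magnitudes of
            its first Fourier coefficients."""
            return np.abs(np.fft.rfft(np.asarray(transient, dtype=float)))[:n_coeff]

        def train_response_model(peak_sets, concentrations):
            """peak_sets: one list of per-electrode transients per sample;
            concentrations: matching array of known metal concentrations."""
            X = np.array([np.concatenate([fft_features(t) for t in sensors])
                          for sensors in peak_sets])
            model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000)
            model.fit(X, np.asarray(concentrations))
            return model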

  12. Automated Meteor Fluxes with a Wide-Field Meteor Camera Network

    Science.gov (United States)

    Blaauw, R. C.; Campbell-Brown, M. D.; Cooke, W.; Weryk, R. J.; Gill, J.; Musci, R.

    2013-01-01

    Within NASA, the Meteoroid Environment Office (MEO) is charged to monitor the meteoroid environment in near-Earth space for the protection of satellites and spacecraft. The MEO has recently established a two-station system to calculate automated meteor fluxes in the millimeter-size range. The cameras each consist of a 17 mm focal length Schneider lens on a Watec 902H2 Ultimate CCD video camera, producing a 21.7 x 16.3 degree field of view. This configuration has a red-sensitive limiting meteor magnitude of about +5. The stations are located in the South Eastern USA, 31.8 kilometers apart, and are aimed at a location 90 km above a point 50 km equidistant from each station, which optimizes the common volume. Both single station and double station fluxes are found, each having benefits; more meteors will be detected in a single camera than will be seen in both cameras, producing a better determined flux, but double station detections allow for non-ambiguous shower associations and permit speed/orbit determinations. Video from the cameras is fed into Linux computers running the ASGARD (All Sky and Guided Automatic Real-time Detection) software, created by Rob Weryk of the University of Western Ontario Meteor Physics Group. ASGARD performs the meteor detection/photometry, and invokes the MILIG and MORB codes to determine the trajectory, speed, and orbit of the meteor. A subroutine in ASGARD allows for the approximate shower identification in single station meteors. The ASGARD output is used in routines to calculate the flux in units of #/sq km/hour. The flux algorithm employed here differs from others currently in use in that it does not assume a single height for all meteors observed in the common camera volume. In the MEO system, the volume is broken up into a set of height intervals, with the collecting areas determined by the radiant of active shower or sporadic source. The flux per height interval is summed to obtain the total meteor flux. As ASGARD also
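
    The per-interval flux computation reduces to a weighted sum; a minimal sketch (with made-up bin counts and collecting areas) is:

        import numpy as np

        def total_flux(counts, areas_km2, hours):
            """Meteor flux in #/sq km/hour: each height interval contributes
            its meteor count divided by its own radiant-dependent collecting
            area and the observing time, and the bins are summed."""
            counts = np.asarray(counts, dtype=float)
            areas = np.asarray(areas_km2, dtype=float)
            return float(np.sum(counts / (areas * hours)))

        print(total_flux([4, 9, 2], [310.0, 420.0, 280.0], hours=6.0))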

  13. Social Network Analysis and Critical Realism

    DEFF Research Database (Denmark)

    Buch-Hansen, Hubert

    2014-01-01

    Social network analysis (SNA) is an increasingly popular approach that provides researchers with highly developed tools to map and analyze complexes of social relations. Although a number of network scholars have explicated the assumptions that underpin SNA, the approach has yet to be discussed...... in relation to established philosophies of science. This article argues that there is a tension between applied and methods-oriented SNA studies, on the one hand, and those addressing the social-theoretical nature and implications of networks, on the other. The former, in many cases, exhibits positivist...... tendencies, whereas the latter incorporate a number of assumptions that are directly compatible with core critical realist views on the nature of social reality and knowledge. This article suggests that SNA may be detached from positivist social science and come to constitute a valuable instrument

  14. Intentional risk management through complex networks analysis

    CERN Document Server

    Chapela, Victor; Moral, Santiago; Romance, Miguel

    2015-01-01

    This book combines game theory and complex networks to examine intentional technological risk through modeling. As information security risks are in constant evolution, the methodologies and tools to manage them must evolve with an ever-changing environment. A formal global methodology is explained in this book, which is able to analyze risks in cyber security based on complex network models and ideas extracted from the Nash equilibrium. A risk management methodology for IT critical infrastructures is introduced which provides guidance and analysis on decision making models and real situations. This model manages the risk of succumbing to a digital attack and assesses an attack from the following three variables: income obtained, expense needed to carry out an attack, and the potential consequences of an attack. Graduate students and researchers interested in cyber security, complex network applications and intentional risk will find this book useful as it is filled with a number of models, methodologies a...

  15. A network analysis of Sibiu County, Romania

    CERN Document Server

    Grama, Cristina-Nicol

    2013-01-01

    Network science methods have proved to be able to provide useful insights from both a theoretical and a practical point of view in that they can better inform governance policies in complex dynamic environments. The tourism research community has provided an increasing number of works that analyse destinations from a network science perspective. However, most of the studies refer to relatively small samples of actors and linkages. With this note we provide a full network study, although at a preliminary stage, that reports a complete analysis of a Romanian destination (Sibiu). Our intention is to increase the set of similar studies with the aim of supporting the investigations in structural and dynamical characteristics of tourism destinations.

  16. Network design for heavy rainfall analysis

    Science.gov (United States)

    Rietsch, T.; Naveau, P.; Gilardi, N.; Guillou, A.

    2013-12-01

    The analysis of heavy rainfall distributional properties is a complex object of study in hydrology and climatology, and it is essential for impact studies. In this paper, we investigate the question of how to optimize the spatial design of a network of existing weather stations. Our main criterion for such an inquiry is the capability of the network to capture the statistical properties of heavy rainfall described by the Extreme Value Theory. We combine this theory with a machine learning algorithm based on neural networks and a Query By Committee approach. Our resulting algorithm is tested on simulated data and applied to high-quality extreme daily precipitation measurements recorded in France at 331 weather stations during the time period 1980-2010.
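
    The extreme-value side of the criterion can be illustrated with a generalized extreme value (GEV) fit to station block maxima; the return-level calculation below is a standard sketch, not the paper's full Query By Committee machinery.

        import numpy as np
        from scipy.stats import genextreme

        def return_level(annual_maxima, return_period=50):
            """Fit a GEV distribution to annual daily-precipitation maxima at
            one station and estimate the T-year return level."""
            shape, loc, scale = genextreme.fit(np.asarray(annual_maxima, dtype=float))
            return genextreme.ppf(1.0 - 1.0 / return_period, shape,
                                  loc=loc, scale=scale)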

  17. Mathematical Analysis of Urban Spatial Networks

    CERN Document Server

    Blanchard, Philippe

    2009-01-01

    Cities can be considered to be among the largest and most complex artificial networks created by human beings. Due to the numerous and diverse human-driven activities, urban network topology and dynamics can differ quite substantially from that of natural networks and so call for an alternative method of analysis. The intent of the present monograph is to lay down the theoretical foundations for studying the topology of compact urban patterns, using methods from spectral graph theory and statistical physics. These methods are demonstrated as tools to investigate the structure of a number of real cities with widely differing properties: medieval German cities, the webs of city canals in Amsterdam and Venice, and a modern urban structure such as found in Manhattan. Last but not least, the book concludes by providing a brief overview of possible applications that will eventually lead to a useful body of knowledge for architects, urban planners and civil engineers.

  18. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in the clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of chest wall and breast skin line across adjacent slices. We then further used fuzzy c-means clustering method with automatic selection of cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
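
    The clustering step can be pictured with a minimal fuzzy c-means loop on pixel intensities; this generic sketch assumes a fixed cluster number and omits the paper's automatic selection of that number.

        import numpy as np

        def fuzzy_cmeans(x, c=3, m=2.0, iters=100, seed=0):
            """Minimal fuzzy c-means on a 1-D intensity array `x`: alternate
            membership and centroid updates for a fixed number of iterations."""
            rng = np.random.default_rng(seed)
            u = rng.random((c, x.size))
            u /= u.sum(axis=0)                      # memberships sum to 1
            for _ in range(iters):
                um = u ** m
                centers = um @ x / um.sum(axis=1)   # weighted centroids
                d = np.abs(x[None, :] - centers[:, None]) + 1e-12
                w = d ** (-2.0 / (m - 1.0))
                u = w / w.sum(axis=0)               # standard FCM update
            return centers, u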

  19. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    Science.gov (United States)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  20. Micro-macro analysis of complex networks.

    Science.gov (United States)

    Marchiori, Massimo; Possamai, Lino

    2015-01-01

    Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a "classic" approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this extent, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail ("micro") to a different scale level ("macro"), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability.

  1. Automated detection of semagram-laden images using adaptive neural networks

    Science.gov (United States)

    Cerkez, Paul S.; Cannady, James D.

    2012-04-01

    Digital steganography is gaining wide acceptance in the world of electronic copyright stamping. Digital media that are easy to steal, such as graphics, photos and audio files, are being tagged with both visible and invisible copyright stamps (known as digital watermarking). However, these same techniques can also be used to hide communications between actors in criminal or covert activities. An inherent difficulty in detecting steganography is overcoming the variety of methods for hiding a message and the multitude of choices of available media. Another problem in steganography defense is the issue of detection speed since the encoded data is frequently time-sensitive. When a message is visually transmitted in a non-textual format (i.e., in an image) it is referred to as a semagram. Semagrams are relatively easy to create, but very difficult to detect. While steganography can often be identified by detecting digital modifications to an image's structure, an image-based semagram is more difficult because the message is the image itself. The work presented describes the creation of a novel, computer-based application, which uses hybrid hierarchical neural network architecture to detect the likely presence of a semagram message in an image. The prototype system was used to detect semagrams containing Morse Code messages. Based on the results of these experiments our approach provides a significant advance in the detection of complex semagram patterns. Specific results of the experiments and the potential practical applications of the neural network-based technology are discussed. This presentation provides the final results of our research experiments.

  2. A Survey of Communications and Networking Technologies for Energy Management in Buildings and Home Automation

    Directory of Open Access Journals (Sweden)

    Aravind Kailas

    2012-01-01

    Full Text Available With the exploding power consumption in private households and increasing environmental and regulatory restraints, the need to improve the overall efficiency of electrical networks has never been greater. That being said, the most efficient way to minimize the power consumption is by voluntary mitigation of home electric energy consumption, based on energy-awareness and automatic or manual reduction of standby power of idling home appliances. Deploying bi-directional smart meters and home energy management (HEM) agents that provision real-time usage monitoring and remote control will enable HEM in “smart households.” Furthermore, the traditionally inelastic demand curve has begun to change, and these emerging HEM technologies enable consumers (industrial to residential) to respond to the energy market behavior to reduce their consumption at peak prices, to supply reserves on an as-needed basis, and to reduce demand on the electric grid. Because the development of smart grid-related activities has resulted in an increased interest in demand response (DR) and demand side management (DSM) programs, this paper presents some popular DR and DSM initiatives that include planning, implementation and evaluation techniques for reducing energy consumption and peak electricity demand. The paper then focuses on reviewing and distinguishing the various state-of-the-art HEM control and networking technologies, and outlines directions for promoting the shift towards a society with low energy demand and low greenhouse gas emissions. The paper also surveys the existing software and hardware tools, platforms, and test beds for evaluating the performance of the information and communications technologies that are at the core of future smart grids. It is envisioned that this paper will inspire future research and design efforts in developing standardized and user-friendly smart energy monitoring systems that are suitable for wide scale deployment in homes.

  3. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-03 (NODC Accession 0104424)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  4. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during February 2015 (NODC Accession 0126669)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  5. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2011-01 (NCEI Accession 0070959)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  6. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during November 2015 (NCEI Accession 0139254)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  7. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during April 2016 (NCEI Accession 0150816)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  8. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2014-05 (NODC Accession 0119474)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  9. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2014-06 (NODC Accession 0120329)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  10. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-11 (NODC Accession 0115123)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  11. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-09 (NODC Accession 0113792)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  12. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-12 (NODC Accession 0101426)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  13. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during June 2015 (NCEI Accession 0129884)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  14. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-10 (NODC Accession 0114407)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  15. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during September 2015 (NCEI Accession 0136935)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  16. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2014-01 (NODC Accession 0116427)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  17. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-08 (NODC Accession 0095593)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  18. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during May 2016 (NCEI Accession 0153542)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  19. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-02 (NODC Accession 0104259)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  20. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during July 2016 (NCEI Accession 0156326)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  21. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during May 2015 (NCEI Accession 0129415)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  22. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during December 2014 (NODC Accession 0125264)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  23. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2011-10 (NODC Accession 0079513)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  24. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-07 (NODC Accession 0095565)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  25. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during August 2015 (NCEI Accession 0131704)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  26. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-11 (NODC Accession 0099948)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  27. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-02 (NODC Accession 0086627)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  28. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-05 (NODC Accession 0108385)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  29. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2011-07 (NCEI Accession 0074922)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  30. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-10 (NODC Accession 0099428)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  31. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during October 2014 (NODC Accession 0122591)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  32. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during September 2014 (NODC Accession 0122593)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  33. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2011-12 (NODC Accession 0083918)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  34. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during January 2016 (NCEI Accession 0142963)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  35. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-06 (NODC Accession 0092557)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  36. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2013-12 (NODC Accession 0115760)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  37. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during January 2015 (NODC Accession 0125752)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  38. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during February 2016 (NCEI Accession 0145373)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  39. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2012-05 (NODC Accession 0090313)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...

  40. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys during 2011-04 (NCEI Accession 0072886)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980s. NDBC has...
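
  The accessions listed above are archived monthly snapshots held by NODC/NCEI; NDBC also publishes recent observations from the same C-MAN stations and moored buoys as plain-text files. The Python sketch below is illustrative only and is not part of any record above: it assumes NDBC's publicly documented real-time feed at https://www.ndbc.noaa.gov/data/realtime2/, a hypothetical station ID, and the feed's usual layout (a '#'-prefixed header line, a units line, and 'MM' as the missing-value marker).

    # Minimal sketch, assuming NDBC's real-time text feed; the station ID
    # below is a hypothetical placeholder, not taken from the records above.
    import io
    import urllib.request

    import pandas as pd

    STATION = "XXXX1"  # hypothetical ID; substitute an actual C-MAN/buoy station
    URL = f"https://www.ndbc.noaa.gov/data/realtime2/{STATION}.txt"


    def fetch_ndbc_realtime(url: str) -> pd.DataFrame:
        """Download and parse one NDBC real-time observation file.

        The feed is whitespace-delimited: the first line holds column names
        prefixed with '#', the second line holds units, and missing values
        are coded as 'MM'.
        """
        raw = urllib.request.urlopen(url).read().decode("utf-8")
        frame = pd.read_csv(
            io.StringIO(raw),
            sep=r"\s+",        # columns are separated by runs of whitespace
            skiprows=[1],      # drop the units line
            na_values=["MM"],  # NDBC's missing-value marker
        )
        frame.columns = [c.lstrip("#") for c in frame.columns]  # '#YY' -> 'YY'
        return frame


    if __name__ == "__main__":
        obs = fetch_ndbc_realtime(URL)
        # Timestamp columns plus wind speed and significant wave height.
        print(obs[["YY", "MM", "DD", "hh", "mm", "WSPD", "WVHT"]].head())

  Note that the real-time feed serves only recent data; reproducing the historical monthly records above would mean downloading the corresponding accession packages from NCEI rather than this feed.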