WorldWideScience

Sample records for integrates distributed automated

  1. Automated Planning and Scheduling for Planetary Rover Distributed Operations

    Science.gov (United States)

    Backes, Paul G.; Rabideau, Gregg; Tso, Kam S.; Chien, Steve

    1999-01-01

    Automated planning and scheduling, including automated path planning, has been integrated with an Internet-based distributed operations system for planetary rover operations. The resulting prototype system enables faster generation of valid rover command sequences by a distributed planetary rover operations team. The Web Interface for Telescience (WITS) provides Internet-based distributed collaboration, the Automated Scheduling and Planning Environment (ASPEN) provides automated planning and scheduling, and an automated path planner provides path planning. The system was demonstrated on the Rocky 7 research rover at JPL.

  2. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of the system.

  3. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as an improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  4. EDISON - research programme on electric distribution automation 1993-1997. Final report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1998-08-01

    This report comprises a summary of the results of the five-year research programme EDISON on distribution automation in Finnish utilities. The research programme (1993-1997) was conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding has been from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme was to develop a new scheme for a complete distribution automation system, including network automation, computer systems in the control centre and customer-associated automation functions. In addition, techniques for demand side management were developed and integrated into the automation scheme. The final aim was to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of nineteen projects are given in this report.

  5. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems

    1996-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  6. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems

    1997-12-31

    The report comprises a summary of the results of the first four years of the research programme EDISON on distribution automation in Finnish utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding is from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer associated automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of fifteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1997. (orig.) 43 refs.

  7. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1997-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  8. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies is proposed, and the concept of distributed knowledge acquisition, aimed at the computerized formation of the most complete and consistent models of problem areas, is introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.
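
    The record above describes acquiring knowledge from databases by building binary decision trees. As an illustration only (not the paper's algorithm), the Python sketch below induces a small binary decision tree from a toy table and prints it as human-readable if/then rules; the data and attribute names are hypothetical.

    # Minimal sketch (not the paper's algorithm): extracting rules from a database
    # table by inducing a binary decision tree, one automated way of acquiring
    # knowledge for an expert-system knowledge base. Data below are illustrative.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # toy "database": each row is a case, each column an attribute
    X = [[1, 0], [1, 1], [0, 1], [0, 0], [1, 1], [0, 1]]
    y = ["approve", "approve", "reject", "reject", "approve", "reject"]

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    # export_text turns the binary tree into human-readable if/then knowledge
    print(export_text(tree, feature_names=["attribute_A", "attribute_B"]))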

  9. Systems integration (automation system)

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K; Komori, T; Fukuma, Y; Oikawa, M [Nippon Steel Corp., Tokyo (Japan)]

    1991-09-26

    This paper introduces the automation systems integration (SI) business started by the company in July 1988 and describes the SI concepts. The business activities include, with CIM (unified production carried out on computers) and AMENITY (living environment) as the mainstays, a single-responsibility undertaking ranging from consultation on structuring optimal systems for processing and assembling industries and intelligent buildings to system design, installation and after-sales services. With an SI approach that puts the user's position first, the business starts from planning and consultation under close coordination. On the conceptual basis of structuring optimal systems using the company's extensive know-how and tools, and of adapting to and applying multi-vendor equipment, open networks, and centralized and distributed systems, the business is promoted with accumulated technologies capable of realizing artificial intelligence and neural networks in the background, and is supported by highly valuable business results in the past. 10 figs., 1 tab.

  10. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  11. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between the integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) and a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
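
    The abstract above reports 98.2% accuracy from 10-fold cross validation of a neural-network model built across Python-ArcGIS and MATLAB. The Python sketch below is a minimal stand-in for that validation step only, using scikit-learn instead of the authors' MATLAB stage; the synthetic 22-column criteria matrix and its labels are assumptions made for illustration.

    # Minimal sketch (not the authors' code): 10-fold cross-validation accuracy for
    # a neural-network classifier over a table of site-selection criteria.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.random((500, 22))                       # hypothetical: 500 candidate cells x 22 criteria
    y = (X[:, :3].sum(axis=1) > 1.5).astype(int)    # hypothetical suitability labels

    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
    scores = cross_val_score(model, X, y, cv=10)    # 10-fold cross validation
    print(f"mean accuracy: {scores.mean():.3f}")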

  12. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  13. Partial Automated Alignment and Integration System

    Science.gov (United States)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  14. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first presents the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology that can be used in the development of automation components. We have attempted to adhere to open standards and technology for the development of automation components at the various layers. The paper also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)

  15. Data Distribution Service for Industrial Automation

    OpenAIRE

    Yang, Jinsong

    2012-01-01

    In industrial automation systems, there is usually a large volume of data which needs to be delivered to the right places at the right time. In addition, the large number of nodes in automation systems are usually distributed, which increases complexity because more point-to-point Ethernet connections are needed in the network. Hence, it is necessary to apply data-centric design and reduce the connection complexity. Data Distribution Service for Real-Time Systems (DDS) is a data-centric middl...
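
    DDS, mentioned above, is data-centric publish/subscribe middleware. The Python sketch below illustrates the data-centric pattern itself (typed samples routed by topic instead of point-to-point connections); it is a toy in-process bus, not an actual DDS implementation, and the topic names are made up.

    # Minimal sketch of the data-centric publish/subscribe pattern that DDS provides
    # (not a real DDS implementation): nodes exchange typed samples by topic instead
    # of holding point-to-point connections to each other.
    from collections import defaultdict
    from dataclasses import dataclass
    from typing import Callable, DefaultDict, List

    @dataclass
    class Sample:
        topic: str
        value: float
        timestamp_s: float

    class Bus:
        """Toy data bus: routes each sample to every subscriber of its topic."""
        def __init__(self) -> None:
            self._subs: DefaultDict[str, List[Callable[[Sample], None]]] = defaultdict(list)

        def subscribe(self, topic: str, callback: Callable[[Sample], None]) -> None:
            self._subs[topic].append(callback)

        def publish(self, sample: Sample) -> None:
            for callback in self._subs[sample.topic]:
                callback(sample)

    bus = Bus()
    bus.subscribe("motor/temperature", lambda s: print(f"controller got {s.value:.1f} C"))
    bus.publish(Sample("motor/temperature", 71.3, timestamp_s=0.0))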

  16. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.

  17. MDSplus automated build and distribution system

    Energy Technology Data Exchange (ETDEWEB)

    Fredian, T., E-mail: twf@psfc.mit.edu [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States); Stillerman, J. [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States); Manduchi, G. [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy)

    2014-05-15

    Support of the MDSplus data handling system has been enhanced by the addition of an automated build system which does nightly builds of MDSplus for many computer platforms, producing software packages which can now be downloaded using a web browser or via package repositories suitable for automatic updating. The build system was implemented using an extensible continuous integration server product called Hudson, which schedules software builds on a collection of VMware-based virtual machines. New releases are created based on updates to the MDSplus CVS code repository, and versioning is managed using CVS tags and branches. Currently, stable, beta and alpha releases of MDSplus are maintained for eleven different platforms including Windows, MacOSX, RedHat Enterprise Linux, Fedora, Ubuntu and Solaris. For some of these platforms, MDSplus packaging has been broken into functional modules so users can pick and choose which MDSplus features they want to install. An added feature of the latest Linux-based platforms is the use of package dependencies. When installing MDSplus from the package repositories, any additional required packages used by MDSplus will be installed automatically, greatly simplifying the installation of MDSplus. This paper will describe the MDSplus package automated build and distribution system.

  18. MDSplus automated build and distribution system

    International Nuclear Information System (INIS)

    Fredian, T.; Stillerman, J.; Manduchi, G.

    2014-01-01

    Support of the MDSplus data handling system has been enhanced by the addition of an automated build system which does nightly builds of MDSplus for many computer platforms, producing software packages which can now be downloaded using a web browser or via package repositories suitable for automatic updating. The build system was implemented using an extensible continuous integration server product called Hudson, which schedules software builds on a collection of VMware-based virtual machines. New releases are created based on updates to the MDSplus CVS code repository, and versioning is managed using CVS tags and branches. Currently, stable, beta and alpha releases of MDSplus are maintained for eleven different platforms including Windows, MacOSX, RedHat Enterprise Linux, Fedora, Ubuntu and Solaris. For some of these platforms, MDSplus packaging has been broken into functional modules so users can pick and choose which MDSplus features they want to install. An added feature of the latest Linux-based platforms is the use of package dependencies. When installing MDSplus from the package repositories, any additional required packages used by MDSplus will be installed automatically, greatly simplifying the installation of MDSplus. This paper will describe the MDSplus package automated build and distribution system.
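
    The two records above describe nightly, multi-platform builds driven by a continuous integration server (Hudson). The Python sketch below is a hypothetical stand-alone driver in the same spirit; the platform list and the build.sh command are illustrative assumptions, not part of the real MDSplus setup.

    # Hypothetical sketch of a nightly build driver (the real system uses Hudson
    # jobs on VMware virtual machines; names and commands here are illustrative).
    import subprocess
    from datetime import date

    PLATFORMS = ["el7-x86_64", "fedora-x86_64", "ubuntu-x86_64", "windows-x64", "macosx"]

    def nightly_build(platform: str) -> bool:
        """Run one platform build; return True on success."""
        result = subprocess.run(
            ["./build.sh", "--platform", platform, "--release", "alpha"],  # hypothetical script
            capture_output=True, text=True,
        )
        return result.returncode == 0

    if __name__ == "__main__":
        print(f"nightly builds for {date.today()}")
        for platform in PLATFORMS:
            status = "ok" if nightly_build(platform) else "FAILED"
            print(f"  {platform:>14}: {status}")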

  19. Chattanooga Electric Power Board Case Study Distribution Automation

    Energy Technology Data Exchange (ETDEWEB)

    Glass, Jim [Chattanooga Electric Power Board (EPB), TN (United States); Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Starke, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ollis, Ben [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    In 2009, the U.S. Department of Energy under the American Recovery and Reinvestment Act (ARRA) awarded a grant to the Chattanooga, Tennessee, Electric Power Board (EPB) as part of the Smart Grid Investment Grant Program. The grant had the objective “to accelerate the transformation of the nation’s electric grid by deploying smart grid technologies.” This funding award enabled EPB to expedite the original smart grid implementation schedule from an estimated 10-12 years to 2.5 years. With this funding, EPB invested heavily in distribution automation technologies, including installing over 1,200 automated circuit switches and sensors on 171 circuits. For utilities considering a commitment to distribution automation, there are underlying questions such as “What is the value?” and “What are the costs?” This case study attempts to answer these questions. The primary benefit of distribution automation is increased reliability, that is, reduced power outage duration and frequency. Power outages directly impact customer economics by interfering with business functions. In the past, this economic driver has been difficult to evaluate effectively. However, as this case study demonstrates, tools and analysis techniques are now available. In this case study, the impact on customer costs associated with power outages before and after the implementation of distribution automation is compared. Two example evaluations are performed to demonstrate the benefits: 1) a savings baseline for customers under normal operations and 2) customer savings for a single severe weather event. Cost calculations for customer power outages are performed using the US Department of Energy (DOE) Interruption Cost Estimate (ICE) calculator. This tool uses standard metrics associated with outages and the customers to calculate cost impact. The analysis shows that EPB customers have seen significant reliability improvements from the implementation of distribution automation. Under

  20. Automated Distributed Simulation in Ptolemy II

    DEFF Research Database (Denmark)

    Lázaro Cuadrado, Daniel; Ravn, Anders Peter; Koch, Peter

    2007-01-01

    the ensuing communication and synchronization problems. Very often the designer has to explicitly specify extra information concerning distribution for the framework to make an effort to exploit parallelism. This paper presents Automated Distributed Simulation (ADS), which allows the designer to forget about...

  1. How to Evaluate Integrated Library Automation Systems.

    Science.gov (United States)

    Powell, James R.; Slach, June E.

    1985-01-01

    This paper describes methodology used in compiling a list of candidate integrated library automation systems at a corporate technical library. Priorities for automation, identification of candidate systems, the filtering process, information for suppliers, software and hardware considerations, on-site evaluations, and final system selection are…

  2. Research of the application of the new communication technologies for distribution automation

    Science.gov (United States)

    Zhong, Guoxin; Wang, Hao

    2018-03-01

    Communication networks are a key factor in distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies used in distribution automation and analyses their defects. It then gives a detailed analysis of new communication technologies for distribution automation, both wired and wireless, and offers suggestions for applying these new technologies.

  3. Correlation of the UV-induced mutational spectra and the DNA damage distribution of the human HPRT gene: Automating the analysis

    International Nuclear Information System (INIS)

    Kotturi, G.; Erfle, H.; Koop, B.F.; Boer, J.G. de; Glickman, B.W.

    1994-01-01

    Automated DNA sequencers can be readily adapted for various types of sequence-based nucleic acid analysis; more recently, the distribution of UV photoproducts in the E. coli lacI gene was determined using techniques developed for automated fluorescence-based analysis. We have been working to improve the automated approach to damage-distribution analysis. Our current method is more rigorous: new software integrates the area under the individual peaks rather than measuring the height of the curve. In addition, we now employ an internal standard, and the analysis can be partially automated. Detection limits for both major types of UV photoproducts (cyclobutane dimers and pyrimidine (6-4) pyrimidone photoproducts) are reported. The UV-induced damage distribution in the hprt gene is compared to the mutational spectra in human and rodent cells.
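
    The abstract above notes that the improved analysis integrates the area under each peak and normalizes against an internal standard rather than using peak height. The Python sketch below shows that calculation on a synthetic fluorescence trace; the peak positions and integration windows are invented for illustration and this is not the authors' software.

    # Minimal sketch (not the authors' software): quantify peaks by integrating the
    # area under each peak and normalizing to an internal-standard peak.
    import numpy as np

    def peak_area(signal: np.ndarray, start: int, stop: int) -> float:
        """Area under the trace between two sample indices (trapezoidal rule, unit spacing)."""
        seg = signal[start:stop]
        return float(0.5 * (seg[:-1] + seg[1:]).sum())

    rng = np.random.default_rng(1)
    x = np.arange(1000)
    trace = (np.exp(-((x - 300) ** 2) / 50.0)          # peak at a damage site
             + 2.0 * np.exp(-((x - 700) ** 2) / 50.0)  # internal-standard peak
             + 0.01 * rng.random(x.size))              # baseline noise

    damage = peak_area(trace, 270, 330)
    standard = peak_area(trace, 670, 730)
    print(f"damage signal normalized to internal standard: {damage / standard:.3f}")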

  4. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems, which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  5. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit, SMI++, combining two approaches, finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real time to changes in the system, thus providing for the automation of standard procedures and for automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
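
    SMI++ describes sub-systems as finite state machines combined with rules for automated recovery. The Python sketch below is a minimal illustration of that idea (a parent object deriving its state from child devices and applying a recovery rule); it does not use the SMI++ toolkit or its syntax, and the states and device names are invented.

    # Minimal sketch of the idea behind SMI++ (not the SMI++ API): each sub-system
    # is a small finite state machine; a parent node reacts to the states of its
    # children with simple rules, e.g. attempting automatic recovery from ERROR.
    class Device:
        def __init__(self, name: str) -> None:
            self.name, self.state = name, "READY"

        def set_state(self, state: str) -> None:
            self.state = state

    class SubSystem:
        """Parent node: derives its own state from its children and applies rules."""
        def __init__(self, name: str, children: list) -> None:
            self.name, self.children = name, children

        def evaluate(self) -> str:
            states = {c.state for c in self.children}
            if "ERROR" in states:
                self.recover()
                states = {c.state for c in self.children}
            return "RUNNING" if states == {"RUNNING"} else "NOT_READY"

        def recover(self) -> None:
            for c in self.children:
                if c.state == "ERROR":
                    c.set_state("READY")        # rule: automatic recovery attempt

    hv = Device("HV-channel"); daq = Device("DAQ")
    hv.set_state("RUNNING"); daq.set_state("ERROR")
    top = SubSystem("detector", [hv, daq])
    print(top.evaluate())   # recovery rule resets DAQ; prints NOT_READY until it is restarted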

  6. Research and Application of Construction of Operation Integration for Smart Power Distribution and Consumption Based on “Integration of Marketing with Distribution”

    Directory of Open Access Journals (Sweden)

    Zhenbao Feng

    2014-05-01

    The “information integrated platform of marketing and distribution integration system” researched and developed in this article is an advanced application platform for concurrently designing and developing marketing and power-distribution automation through the integration and analysis of existing data on the data platform of Jiaozuo Power Supply Corporation. It uses data mining and data-bus technology for the uniform analysis of comprehensive marketing and distribution data. It also conducts real-time monitoring of power utilization information for marketing and of early-warning maintenance business for power distribution according to the electric business model. This realizes the integration of marketing and distribution business, achieves the target of integrated operation of marketing and distribution, improves the operation level of the business, reduces maintenance costs of the distribution grid, increases electricity sales of the distribution grid, and provides a reliable practical basis for the operation and maintenance of Jiaozuo power marketing and distribution.

  7. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    AFIT thesis AFIT/GCA/LSQ/89S-5 by Caroline L. Hanson, Major, USAF: an evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) system.

  8. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuing both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to the sharing and integration of their collections of provided BSs and the corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research on supporting CNs. It contributes to the generation of formal, machine-readable specifications for business processes, aimed at providing unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for the discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the use of the desired service behaviour specified by users for automated matchmaking against other existing services. Furthermore, to support service integration, mechanisms are developed for the automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  9. Joint force protection advanced security system (JFPASS) "the future of force protection: integrate and automate"

    Science.gov (United States)

    Lama, Carlos E.; Fagan, Joe E.

    2009-09-01

    The United States Department of Defense (DoD) defines 'force protection' as "preventive measures taken to mitigate hostile actions against DoD personnel (to include family members), resources, facilities, and critical information." Advanced technologies enable significant improvements in automating and distributing situation awareness, optimizing operator time, and improving sustainability, which enhance protection and lower costs. The JFPASS Joint Capability Technology Demonstration (JCTD) demonstrates a force protection environment that combines physical security and Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE) defense through the application of integrated command and control and data fusion. The JFPASS JCTD provides a layered approach to force protection by integrating traditional sensors used in physical security, such as video cameras, battlefield surveillance radars, and unmanned and unattended ground sensors. The optimization of human participation and automation of processes is achieved by the employment of unmanned ground vehicles, along with remotely operated lethal and less-than-lethal weapon systems. These capabilities are integrated via a tailorable, user-defined common operational picture display through a data fusion engine operating in the background. The combined systems automate the screening of alarms, manage the information displays, and provide assessment and response measures. The data fusion engine links disparate sensors and systems, and applies tailored logic to focus the assessment of events. It enables timely responses by providing the user with automated and semi-automated decision support tools. The JFPASS JCTD uses standard communication/data exchange protocols, which allow the system to incorporate future sensor technologies or communication networks, while maintaining the ability to communicate with legacy or existing systems.
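
    As a rough illustration of rule-based alarm screening in a fusion engine, the Python sketch below escalates a zone only when two different sensor types report within a short time window. The JFPASS fusion logic itself is not public; the sensor names, zones and correlation rule here are assumptions.

    # Hypothetical sketch of rule-based alarm screening in a data fusion engine.
    from dataclasses import dataclass

    @dataclass
    class Alarm:
        sensor: str      # e.g. "radar", "camera", "ground_sensor"
        zone: str
        time_s: float

    def assess(alarms: list[Alarm], window_s: float = 30.0) -> list[str]:
        """Escalate a zone only when two different sensor types agree within a window."""
        escalations = []
        for i, a in enumerate(alarms):
            for b in alarms[i + 1:]:
                if (a.zone == b.zone and a.sensor != b.sensor
                        and abs(a.time_s - b.time_s) <= window_s):
                    escalations.append(f"zone {a.zone}: correlated {a.sensor}+{b.sensor}")
        return escalations

    print(assess([Alarm("radar", "north-fence", 10.0),
                  Alarm("camera", "north-fence", 22.0),
                  Alarm("ground_sensor", "east-gate", 400.0)]))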

  10. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... through efforts like USCIS' Business Transformation initiative. The IOE will be implemented by USCIS and... information collection. (2) Title of the Form/Collection: Business Transformation-- Automated Integrated...

  11. Automation, consolidation, and integration in autoimmune diagnostics.

    Science.gov (United States)

    Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola

    2015-08-01

    Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.

  12. Integrated Computing, Communication, and Distributed Control of Deregulated Electric Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bajura, Richard; Feliachi, Ali

    2008-09-24

    Restructuring of the electricity market has affected all aspects of the power industry from generation to transmission, distribution, and consumption. Transmission circuits, in particular, are stressed, often exceeding their stability limits, because of the difficulty of building new transmission lines due to environmental concerns and financial risk. Deregulation has resulted in the need for tighter control strategies to maintain reliability even in the event of considerable structural changes, such as the loss of a large generating unit or a transmission line, and changes in loading conditions due to the continuously varying power consumption. Our research efforts under the DOE EPSCoR Grant focused on Integrated Computing, Communication and Distributed Control of Deregulated Electric Power Systems. This research is applicable to operating and controlling modern electric energy systems. The controls developed by APERC provide for a more efficient, economical, reliable, and secure operation of these systems. Under this program, we developed distributed control algorithms suitable for large-scale, geographically dispersed power systems and also economic tools to evaluate their effectiveness and impact on power markets. Progress was made in the development of distributed intelligent control agents for the reliable and automated operation of integrated electric power systems; the methodologies employed combine information technology, control and communication, agent technology, and power systems engineering. In the event of scheduled load changes or unforeseen disturbances, the power system is expected to minimize the effects and costs of disturbances and to keep critical infrastructure operational.

  13. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Following the rapid growth of ubiquitous computing, many jobs that were previously manual have been automated. This automation has increased the amount of time available for leisure, and diverse services are being developed for this leisure time. In addition, with the development of small, portable devices such as smartphones, diverse Internet services can be used regardless of time and place. Studies on virtualization are currently in progress, aiming to determine ways to efficiently store and process the big data generated by the multitude of devices and services in use. One topic of such studies is desktop storage virtualization, which integrates distributed desktop resources and provides them to users. For desktop storage virtualization, high availability is necessary and important for providing reliability to users. Studies on hierarchical structures and resource integration are in progress, aiming at efficient data distribution and storage for distributed desktops based on resource-integration environments. However, studies on efficient responses to server faults occurring in desktop-based resource-integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS) for high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments, and it activates alternative servers when a fault occurs within the system.
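
    One element described above is the activation of an alternative server when a fault occurs. The Python sketch below shows a generic heartbeat-timeout failover of the kind such a mechanism could build on; it is not taken from the SODS paper, and the server names and timeout are arbitrary.

    # Generic heartbeat-timeout failover sketch (illustrative only, not SODS code).
    import time

    HEARTBEAT_TIMEOUT_S = 5.0
    last_heartbeat = {"primary": time.monotonic(), "standby": time.monotonic()}
    active = "primary"

    def on_heartbeat(server: str) -> None:
        last_heartbeat[server] = time.monotonic()

    def check_failover() -> str:
        """Switch the active role if the current active server stopped responding."""
        global active
        if time.monotonic() - last_heartbeat[active] > HEARTBEAT_TIMEOUT_S:
            active = "standby" if active == "primary" else "primary"
        return active

    on_heartbeat("standby")
    print("active server:", check_failover())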

  14. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  15. Becoming Earth Independent: Human-Automation-Robotics Integration Challenges for Future Space Exploration

    Science.gov (United States)

    Marquez, Jessica J.

    2016-01-01

    Future exploration missions will require NASA to integrate more automation and robotics in order to accomplish mission objectives. This presentation will describe the future challenges facing human operators (astronauts, ground controllers) as we increase the amount of automation and robotics in spaceflight operations. It will describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. This presentation will outline future human-automation-robotic integration challenges.

  16. Choosing the Right Integrator for Your Building Automation Project.

    Science.gov (United States)

    Podgorski, Will

    2002-01-01

    Examines the prevailing definitions and responsibilities of product, network, and system integrators for building automation systems; offers a novel approach to system integration; and sets realistic expectations for the owner in terms of benefits, outcomes, and overall values. (EV)

  17. Proceedings of the distribution automation seminar. CD-ROM ed.

    International Nuclear Information System (INIS)

    2003-01-01

    Electric utilities are being driven to improve the utilization of their distribution system assets while reducing life cycle costs. This seminar provided an opportunity for electric utilities to share their experience and knowledge about the constantly evolving technologies that apply to distribution automation. Customers and their representatives, acting through regulatory commissions, place increased priority on reliability and push the conventional use of distribution automation into rural areas. Various options are under consideration by managers to incorporate a variety of distributed generation resources. Several papers highlighted technical aspects as they relate to applications that meet the changing needs of utilities. The latest products and technologies in the field were on display. The seminar sessions included: business cases; utility experience and applications; utility experience and projects; and technology and equipment. Eight presentations were indexed separately for inclusion in this database.

  18. Automated computation of one-loop integrals in massless theories

    International Nuclear Information System (INIS)

    Hameren, A. van; Vollinga, J.; Weinzierl, S.

    2005-01-01

    We consider one-loop tensor and scalar integrals, which occur in a massless quantum field theory, and we report on the implementation into a numerical program of an algorithm for the automated computation of these one-loop integrals. The number of external legs of the loop integrals is not restricted. All calculations are done within dimensional regularization. (orig.)
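
    For orientation, the simplest integral of the class treated above is the massless scalar two-point (bubble) integral. In one common normalization (conventions differ between authors), with D = 4 - 2\epsilon it evaluates, in LaTeX notation, to

    \begin{align}
      I_2(p^2) &= \int \frac{d^D k}{i\pi^{D/2}}\,\frac{1}{k^2\,(k+p)^2}
                = \frac{r_\Gamma}{\epsilon\,(1-2\epsilon)}\,\bigl(-p^2\bigr)^{-\epsilon},
      &
      r_\Gamma &= \frac{\Gamma(1+\epsilon)\,\Gamma^{2}(1-\epsilon)}{\Gamma(1-2\epsilon)} .
    \end{align}

    The single pole in \epsilon here, and the double poles of massless triangle and box integrals, are the terms that automated bookkeeping in dimensional regularization has to track.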

  19. Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG

    Science.gov (United States)

    Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu

    2016-12-01

    Applying the path-searching method based on distribution network topology in setting software has proved effective, and a path-searching method that accounts for DG power sources is also applicable to the automatic generation and division of planned islands after a fault. This paper applies a path-searching algorithm to the automatic division of planned islands after faults: the search starts from the fault-isolation switch and ends at each power source, and, according to the line load traversed by the search path and the important load integrated along the optimized path, it forms an optimized division scheme of planned islands in which each DG acts as a power source balanced against the local important load. Finally, the COBASE software and the distribution network automation software in use are applied to illustrate the effectiveness of the automatic restoration scheme.
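
    A minimal Python sketch of the search idea follows (illustrative only; it is not the COBASE implementation): breadth-first search from the fault-isolation switch to each DG, summing the load along the found path and accepting the island only if the DG rating covers it. The topology, loads and DG ratings are made up.

    # Toy post-fault topology: S1 is the fault-isolation switch, DG1/DG2 are sources.
    from collections import deque

    EDGES = {("S1", "N1"), ("N1", "N2"), ("N2", "DG1"), ("N1", "N3"), ("N3", "DG2")}
    LOAD_KW = {"N1": 120, "N2": 80, "N3": 200}      # load connected at each node
    DG_RATING_KW = {"DG1": 250, "DG2": 150}         # available DG capacity

    def neighbours(node):
        return [b for a, b in EDGES if a == node] + [a for a, b in EDGES if b == node]

    def path_to(dg, start="S1"):
        """Breadth-first search for a path from the isolation switch to one DG."""
        parent, queue = {start: None}, deque([start])
        while queue:
            node = queue.popleft()
            if node == dg:
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            for nxt in neighbours(node):
                if nxt not in parent:
                    parent[nxt] = node
                    queue.append(nxt)
        return None

    def island_plan(dg):
        """Accept the planned island only if the DG rating covers the load on the path."""
        path = path_to(dg)
        load = sum(LOAD_KW.get(n, 0) for n in path)
        return path, load, load <= DG_RATING_KW[dg]

    for dg in sorted(DG_RATING_KW):
        path, load, ok = island_plan(dg)
        print(dg, path, f"{load} kW", "island OK" if ok else "load exceeds DG rating")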

  20. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  1. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. The implications of e-automated processes can extend

  2. Smart integration of distribution automation applications

    NARCIS (Netherlands)

    Groot, de R.J.W.; Morren, J.; Slootweg, J.G.

    2012-01-01

    Future electricity demand will significantly increase, while flexibility in supply will decrease, due to an increase in the use of renewable energy sources. The most effective way to prepare distribution grids for this increase in loading and decrease in supply-flexibility is to apply balancing,

  3. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase the characteristics of robustness, efficiency, flexibility, and advanced manoeuvrability.

  4. Distribution Integration | Grid Modernization | NREL

    Science.gov (United States)

    The goal of NREL's distribution integration research is to tackle the challenges facing the widespread integration of distributed energy resources.

  5. Relay Protection and Automation Systems Based on Programmable Logic Integrated Circuits

    International Nuclear Information System (INIS)

    Lashin, A. V.; Kozyrev, A. V.

    2015-01-01

    One of the most promising directions for developing the hardware of relay protection and automation devices is considered. The advantages of choosing programmable logic integrated circuits to obtain adaptive technological algorithms in power system protection and control systems are pointed out. The technical difficulties that today stand in the way of using relay protection and automation systems are indicated, and a new technology for solving these problems is presented. Particular attention is devoted to the possibility of reconfiguring the logic of these devices using programmable logic integrated circuits.

  6. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for future distribution automation (DA) systems in Finland. The general factors which affect the automation needs are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next, a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.
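
    As a toy illustration of what such a feasibility model weighs (not the VTT model itself), the Python sketch below compares the discounted value of assumed annual DA benefits with an assumed automation investment; all figures, benefit categories and the discount rate are illustrative.

    # Toy distribution-automation feasibility check: benefit/cost ratio of DA.
    def present_value(annual_benefit: float, rate: float, years: int) -> float:
        """Discounted value of a constant annual benefit."""
        return sum(annual_benefit / (1.0 + rate) ** t for t in range(1, years + 1))

    investment = 1_200_000.0                 # automation hardware + software, EUR (assumed)
    annual_benefits = {
        "reduced outage cost": 90_000.0,
        "deferred capacity reinforcement": 60_000.0,
        "O&M savings": 40_000.0,
    }
    pv = sum(present_value(b, rate=0.06, years=15) for b in annual_benefits.values())
    print(f"benefit/cost ratio: {pv / investment:.2f}")   # > 1 suggests DA is justified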

  7. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  8. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for future distribution automation (DA) systems in Finland. The general factors which affect the automation needs are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next, a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  9. Automated Energy Distribution and Reliability System Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.; Perry, S.

    2007-10-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  10. Control and management of distribution system with integrated DERs via IEC 61850 based communication

    Directory of Open Access Journals (Sweden)

    Ikbal Ali

    2017-06-01

    Distributed Energy Resources (DERs) are being increasingly integrated in distribution systems, resulting in complex power flow scenarios. In such cases, effective control, management and protection of distribution systems becomes highly challenging. Standardized and interoperable communication in distribution systems has the potential to deal with such challenges and to achieve higher energy efficiency and reliability. Edition 2 of the IEC 61850 standards for utility automation standardizes the exchange of data among different substations, DERs, control centers, PMUs and PDCs. This paper demonstrates the modelling of information and services needed for control, management and protection of distribution systems with integrated DERs. IP tunnels and/or mapping over the IP layer are used for transferring IEC 61850 messages, such as sampled values (SVs) and GOOSE (Generic Object Oriented Substation Event) messages, over the distribution system Wide Area Network (WAN). Finally, the performance of the proposed communication configurations for different applications is analyzed by calculating End-to-End (ETE) delay, throughput and jitter.
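
    The performance metrics named above (end-to-end delay, throughput, jitter) can be computed directly from message timestamps. The Python sketch below shows one straightforward way to do so; the sample timestamps and payload sizes are invented and the calculation is not tied to the paper's simulation setup.

    # Computing ETE delay, jitter and throughput from send/receive timestamps.
    from statistics import mean, pstdev

    # (send_time_s, receive_time_s, payload_bytes) for each message
    samples = [(0.000, 0.004, 300), (0.010, 0.013, 300), (0.020, 0.026, 300),
               (0.030, 0.033, 300), (0.040, 0.045, 300)]

    delays = [rx - tx for tx, rx, _ in samples]          # ETE delay per message
    jitter = pstdev(delays)                              # variation of the delay
    span = samples[-1][1] - samples[0][0]                # observation window, s
    throughput_bps = 8 * sum(size for _, _, size in samples) / span

    print(f"mean ETE delay: {mean(delays)*1e3:.2f} ms, jitter: {jitter*1e3:.2f} ms, "
          f"throughput: {throughput_bps/1e3:.1f} kbit/s")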

  11. IEC 61850: integrating substation automation into the power plant control system; IEC 61850: Integration der Schaltanlagenautomatisierung in die Kraftwerksleittechnik

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany)

    2008-07-01

    The new communication standard IEC 61850 was developed in the substation automation domain and released in 2004 as a worldwide standard. Meanwhile, IEC 61850 is already established in many substation automation markets. The paper discusses the implementation of IEC 61850, integrating process control and substation automation into one consistent system in a power plant. (orig.)

  12. An integrated system for buildings’ energy-efficient automation: Application in the tertiary sector

    International Nuclear Information System (INIS)

    Marinakis, Vangelis; Doukas, Haris; Karakosta, Charikleia; Psarras, John

    2013-01-01

    Highlights: ► We developed interactive software for building automation systems. ► Monitoring of energy consumption in real time. ► Optimization of energy consumption by implementing appropriate control scenarios. ► Pilot appraisal of remote control of active systems in a tertiary-sector building. ► Significant decrease in the energy and operating cost of the A/C system. -- Abstract: Although integrated building automation systems have become increasingly popular, an integrated system is required which includes remote control technology to enable real-time monitoring of energy consumption by end-users, as well as optimization functions. To respond to this common interest, the main aim of the paper is to present an integrated system for buildings’ energy-efficient automation. The proposed system is based on a prototype software tool for the simulation and optimization of energy consumption in the building sector, enhancing the interactivity of building automation systems. The system can incorporate energy-efficient automation functions for heating, cooling and/or lighting based on recent guidance and decisions in national law, the energy efficiency requirements of EN 15232, and the ISO 50001 energy management standard, among others. The presented system was applied to a supermarket building in Greece and focused on the remote control of active systems.

  13. Load Segmentation for Convergence of Distribution Automation and Advanced Metering Infrastructure Systems

    Science.gov (United States)

    Pamulaparthy, Balakrishna; KS, Swarup; Kommu, Rajagopal

    2014-12-01

    Distribution automation (DA) applications are today limited to the feeder level and have zero visibility outside of the substation feeder, not reaching down to the low-voltage distribution network level. This has become a major obstacle to realizing many automated functions and enhancing existing DA capabilities. Advanced metering infrastructure (AMI) systems are being widely deployed by utilities across the world, creating system-wide communications access to every monitoring and service point; they collect data from smart meters and sensors at short time intervals in response to utility needs. The convergence of DA and AMI systems provides unique opportunities and capabilities for distribution grid modernization, with the DA system acting as a controller and the AMI system acting as feedback to the DA system, for which DA applications have to understand and use the AMI data selectively and effectively. In this paper, we propose a load segmentation method that helps the DA system accurately understand and use the AMI data for various automation applications, with a suitable case study on power restoration.
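
    A common way to segment loads from AMI interval data is to cluster daily load profiles; the Python sketch below does this with k-means on synthetic residential and commercial profiles. It illustrates the general idea only and is not the segmentation method proposed in the paper.

    # Clustering daily AMI load profiles so DA applications can treat similar
    # customer groups together (synthetic data, illustrative only).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    hours = np.arange(24)
    residential = 1.0 + 0.8 * np.exp(-((hours - 19) ** 2) / 8.0)    # evening peak
    commercial = 1.0 + 0.8 * np.exp(-((hours - 13) ** 2) / 18.0)    # midday peak
    profiles = np.vstack([residential + 0.05 * rng.standard_normal(24) for _ in range(50)]
                         + [commercial + 0.05 * rng.standard_normal(24) for _ in range(50)])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
    print("segment sizes:", np.bincount(labels))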

  14. BOA: Framework for Automated Builds

    CERN Document Server

    Ratnikova, N

    2003-01-01

Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components, which have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of existing versions.

  15. BOA: Framework for automated builds

    International Nuclear Information System (INIS)

    Ratnikova, N.

    2003-01-01

Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components, which have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of existing versions.

  16. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF...

  17. Multi-purpose logical device with integrated circuit for the automation of mine water disposal

    Energy Technology Data Exchange (ETDEWEB)

    Pop, E.; Pasculescu, M.

    1980-06-01

After an analysis of waste water disposal as an object of automation, the authors present a BASIC-language programme developed to simulate the automated control system on a digital computer. A multi-purpose logical device with integrated circuits for the automation of mine water disposal is then presented. (In Romanian)

  18. Energy Production System Management - Renewable energy power supply integration with Building Automation System

    International Nuclear Information System (INIS)

    Figueiredo, Joao; Martins, Joao

    2010-01-01

Intelligent buildings, historically and technologically, refer to the integration of four distinctive systems: Building Automation Systems (BAS), Telecommunication Systems, Office Automation Systems and Computer Building Management Systems. The increasingly sophisticated BAS has become the 'heart and soul' of modern intelligent buildings. Integrating energy supply and demand elements - often known as Demand-Side Management (DSM) - has become an important energy efficiency policy concept. Nowadays, European countries have diversified their power supplies, reducing dependence on OPEC and developing a broader mix of energy sources that maximizes the use of domestic renewable energy sources. In this way it makes sense to include a fifth system in the intelligent building group: Energy Production System Management (EPSM). This paper presents a Building Automation System where Demand-Side Management is fully integrated with the building's Energy Production System, which incorporates a complete set of renewable energy production and storage systems.

  19. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for the control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  20. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for the control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
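
Both records describe a Modbus-over-RS485 automation backbone. As a minimal, hedged sketch of what such polling traffic looks like at the protocol level (not the LVPD software, whose LabVIEW code is not shown here), the following pure-Python snippet builds a Modbus RTU "read holding registers" request with the standard CRC-16; the slave ID and register addresses are hypothetical.

```python
# Sketch of Modbus RTU framing on an RS485 multi-drop bus. In practice a
# library such as pymodbus or a LabVIEW driver would normally be used.
import struct

def crc16_modbus(frame: bytes) -> int:
    """CRC-16/Modbus over the frame bytes (poly 0xA001, init 0xFFFF)."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave_id: int, start_addr: int, count: int) -> bytes:
    """Build a function-code-0x03 request frame (read holding registers)."""
    pdu = struct.pack(">BBHH", slave_id, 0x03, start_addr, count)
    crc = crc16_modbus(pdu)
    return pdu + struct.pack("<H", crc)  # CRC is transmitted low byte first

# Example: poll 4 registers of a hypothetical filament power supply at ID 5.
frame = read_holding_registers(slave_id=5, start_addr=0x0000, count=4)
print(frame.hex(" "))
# The frame would then be written to the RS485 serial port, e.g. with pyserial.
```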

  1. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
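
One of the tools described above builds a local specialized database of raw data plus extracted features. The sketch below illustrates that idea with the Python standard library only; the table layout, feature names and cell ID are hypothetical and are not the ABI schema or the authors' implementation.

```python
# Minimal sketch of a local feature database in the spirit of the tools
# described above; schema and values are invented for illustration.
import sqlite3

conn = sqlite3.connect("cell_features.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS cell_features (
        cell_id            INTEGER PRIMARY KEY,
        sweep_count        INTEGER,
        rheobase_pa        REAL,
        mean_spike_rate_hz REAL
    )
""")

def store_features(cell_id, sweep_count, rheobase_pa, mean_rate):
    # Upsert the extracted features for one patch-clamp recording.
    conn.execute(
        "INSERT OR REPLACE INTO cell_features VALUES (?, ?, ?, ?)",
        (cell_id, sweep_count, rheobase_pa, mean_rate),
    )
    conn.commit()

store_features(cell_id=1001, sweep_count=48, rheobase_pa=90.0, mean_rate=12.4)
for row in conn.execute("SELECT * FROM cell_features"):
    print(row)
```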

  2. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework, that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that pose different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational effort in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
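
The abstract mentions real-time realization of probabilistic models in simulated annealing. The following generic annealing loop illustrates the technique; the objective function (a made-up sensor calibration error over a mixing ratio) is a hypothetical stand-in for the authors' optimization problem.

```python
# Generic simulated-annealing sketch; objective and parameters are invented.
import math
import random

def sensor_error(mix_ratio: float) -> float:
    """Hypothetical objective: deviation of a simulated sensor reading
    from its target value as a function of the mixing ratio."""
    return (mix_ratio - 0.63) ** 2 + 0.01 * math.sin(40 * mix_ratio)

def simulated_annealing(steps=2000, temp0=1.0, cooling=0.995):
    random.seed(1)
    x = random.random()          # initial mixing ratio in [0, 1]
    best_x, best_e = x, sensor_error(x)
    temp = temp0
    for _ in range(steps):
        candidate = min(1.0, max(0.0, x + random.gauss(0, 0.05)))
        delta = sensor_error(candidate) - sensor_error(x)
        # Accept improvements always, worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
            if sensor_error(x) < best_e:
                best_x, best_e = x, sensor_error(x)
        temp *= cooling
    return best_x, best_e

print(simulated_annealing())
```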

  3. Automated Energy Distribution and Reliability System (AEDR): Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-07-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  4. On the engineering design for systematic integration of agent-orientation in industrial automation.

    Science.gov (United States)

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematical integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Moving NASA Beyond Low Earth Orbit: Future Human-Automation-Robotic Integration Challenges

    Science.gov (United States)

    Marquez, Jessica

    2016-01-01

    This presentation will provide an overview of current human spaceflight operations. It will also describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. Additionally, there are many implications regarding advanced automation and robotics, and this presentation will outline future human-automation-robotic integration challenges.

  6. The optimal number, type and location of devices in automation of electrical distribution networks

    Directory of Open Access Journals (Sweden)

    Popović Željko N.

    2015-01-01

Full Text Available This paper presents a mixed-integer linear programming based model for determining the optimal number, type and location of remotely controlled and supervised devices in distribution networks in the presence of distributed generators. The proposed model takes into consideration a number of different devices simultaneously (remotely controlled circuit breakers/reclosers, sectionalizing switches, remotely supervised and local fault passage indicators) along with the following: expected outage cost to consumers and producers due to momentary and long-term interruptions; automated device expenses (capital investment, installation, and annual operation and maintenance costs); and the number and expenses of crews involved in the isolation and restoration process. Furthermore, other possible benefits of each automated device are also taken into account (e.g., benefits due to decreasing the cost of switching operations in normal conditions). The obtained numerical results emphasize the importance of considering different types of automation devices simultaneously. They also show that the proposed approach has the potential to improve the process of determining the best automation strategy in real-life distribution networks.
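
As a hedged illustration of the mixed-integer linear programming formulation the paper describes, the sketch below (using the PuLP library) chooses where to place remotely controlled switches so that installation cost plus expected outage cost is minimized; the feeder sections, costs and budget are invented placeholders, far simpler than the paper's full model.

```python
# Toy switch-placement MILP: pay the device cost where we automate,
# the expected outage cost where we do not. Numbers are hypothetical.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

sections = ["S1", "S2", "S3", "S4", "S5"]
install_cost = {"S1": 12.0, "S2": 12.0, "S3": 15.0, "S4": 10.0, "S5": 14.0}  # k$/yr
outage_cost = {"S1": 30.0, "S2": 8.0, "S3": 22.0, "S4": 5.0, "S5": 18.0}     # k$/yr if not automated
max_devices = 3

prob = LpProblem("switch_placement", LpMinimize)
x = LpVariable.dicts("install", sections, cat="Binary")

# Objective: installation cost on automated sections, outage cost elsewhere.
prob += lpSum(install_cost[s] * x[s] + outage_cost[s] * (1 - x[s]) for s in sections)
# Budget: limited number of remotely controlled devices.
prob += lpSum(x[s] for s in sections) <= max_devices

prob.solve()
print({s: int(x[s].value()) for s in sections})
```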

  7. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  8. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  9. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

Full Text Available Access to information is necessary, but not sufficient in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretations and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.

  10. Design and Development of an Integrated Workstation Automation Hub

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Andrew; Ghatikar, Girish; Sartor, Dale; Lanzisera, Steven

    2015-03-30

Miscellaneous Electronic Loads (MELs) account for one third of all electricity consumption in U.S. commercial buildings, and are drivers for significant energy use in India. Many of the MEL-specific plug-load devices are concentrated at workstations in offices. The use of intelligence and integrated controls and communications at the workstation, in the form of an Office Automation Hub, offers the opportunity to improve both energy efficiency and occupant comfort, along with services for Smart Grid operations. Software and hardware solutions are available from a wide array of vendors for the different components, but an integrated system with interoperable communications is yet to be developed and deployed. In this study, we propose system- and component-level specifications for the Office Automation Hub, their functions, and a prioritized list for the design of a proof-of-concept system. Leveraging the strength of both the U.S. and India technology sectors, this specification serves as a guide for researchers and industry in both countries to support the development, testing, and evaluation of a prototype product. Further evaluation of such integrated technologies for performance and cost is necessary to identify the potential to reduce energy consumption in MELs and to improve occupant comfort.

  11. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for the protection and control systems of all electrical substations. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a Station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a LAN station that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.
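
IEC 61850 models substation equipment as named logical nodes carrying typed data objects (for example XCBR for a circuit breaker with a Pos position object). The sketch below is a highly simplified, hypothetical illustration of that modelling idea; it is not an IEC 61850 protocol stack and not the Selta implementation.

```python
# Teaching sketch of the logical-node idea only; not a real IEC 61850 stack.
from dataclasses import dataclass, field

@dataclass
class DataObject:
    name: str          # e.g. "Pos" (switch position)
    value: object
    quality: str = "good"

@dataclass
class LogicalNode:
    name: str          # e.g. "XCBR1" (circuit breaker)
    data: dict = field(default_factory=dict)

    def update(self, do_name: str, value) -> None:
        self.data[do_name] = DataObject(do_name, value)

# Bay unit with one breaker logical node, as a station LAN client might see it.
xcbr1 = LogicalNode("XCBR1")
xcbr1.update("Pos", "open")      # breaker position status
xcbr1.update("OpCnt", 1284)      # operation counter
print(xcbr1)
```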

  12. Enhanced distribution automation system based on new communication technology at ENEL

    International Nuclear Information System (INIS)

    Comellini, E.; Gargiuli, R.; Gelli, G.; Tonon, R.; Mirandola, P.

    1991-01-01

On the basis of extensive research aimed at assessing the feasibility of a cost-effective two-way telecommunications system, and of the experience gained during the eighties in the field of remote control of the primary distribution network, where new digital techniques were introduced, and in the field of metering apparatus, where about 7,000 HV and MV customers were equipped with Ferraris meters associated with electronic devices for the application of multirate tariffs, ENEL (Italian National Electricity Board) has designed a new distribution automation system aimed at remote control of the MV distribution network and at MV and LV customer meter service automation. This report describes the key choices that determined the architecture of the new system and the most important features of its main components, in view of an improvement in energy usage efficiency, better service to the customers, as well as increased simplicity and transparency in customer relationships.

  13. Enhanced distribution automation system based on new communications technology at ENEL

    Energy Technology Data Exchange (ETDEWEB)

    Comellini, E; Gargiuli, R; Gelli, G; Tonon, R; Mirandola, P

    1992-12-31

On the basis of extensive research aimed at assessing the feasibility of a cost-effective two-way telecommunications system, and of the experience gained during the eighties in the field of remote control of the primary distribution network, where new digital techniques were introduced, and in the field of metering apparatus, where about 7,000 HV and MV customers were equipped with Ferraris meters associated with electronic devices for the application of multirate tariffs, ENEL (Italian National Electricity Board) has designed a new distribution automation system aimed at remote control of the MV distribution network and at MV and LV customer meter service automation. This report describes the key choices that determined the architecture of the new system and the most important features of its main components, in view of an improvement in energy usage efficiency, better service to the customers, as well as increased simplicity and transparency in customer relationships.

  14. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

Automation processes enter more and more areas of life and production. People with disabilities, in particular, can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  15. Research of the self-healing technologies in the optical communication network of distribution automation

    Science.gov (United States)

    Wang, Hao; Zhong, Guoxin

    2018-03-01

Optical communication networks are the mainstream technique for the communication networks of distribution automation, and self-healing technologies can significantly improve the reliability of these optical communication networks. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access layer, the backbone layer and the core layer of optical communication networks for distribution automation. On the basis of a comparative analysis, the paper gives application suggestions for these self-healing technologies.

  16. Distributed microprocessor automation network for synthesizing radiotracers used in positron emission tomography

    International Nuclear Information System (INIS)

    Russell, J.A.G.; Alexoff, D.L.; Wolf, A.P.

    1984-01-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. 20 refs. (DT)

  17. On the Automated Synthesis of Enterprise Integration Patterns to Adapt Choreography-based Distributed Systems

    Directory of Open Access Journals (Sweden)

    Marco Autili

    2015-12-01

Full Text Available The Future Internet is becoming a reality, providing large-scale computing environments where a virtually infinite number of available services can be composed to fit users' needs. Modern service-oriented applications will more and more often be built by reusing and assembling distributed services. A key enabler for this vision is the ability to automatically compose and dynamically coordinate software services. Service choreographies are an emergent Service Engineering (SE) approach to compose and coordinate services in a distributed way. When mismatching third-party services are to be composed, obtaining the distributed coordination and adaptation logic required to suitably realize a choreography is a non-trivial and error-prone task, so automatic support is needed. In this direction, this paper leverages previous work on the automatic synthesis of choreography-based systems and describes our preliminary steps towards exploiting Enterprise Integration Patterns to deal with a form of choreography adaptation.

  18. Automated Integration of Dedicated Hardwired IP Cores in Heterogeneous MPSoCs Designed with ESPAM

    Directory of Open Access Journals (Sweden)

    Ed Deprettere

    2008-06-01

Full Text Available This paper presents a methodology and techniques for the automated integration of dedicated hardwired (HW) IP cores into heterogeneous multiprocessor systems. We propose an IP core integration approach based on the generation of an HW module that consists of a wrapper around a predefined IP core. This approach has been implemented in a tool called ESPAM for automated multiprocessor system design, programming, and implementation. In order to preserve the high performance of the integrated IP cores, the structure of the IP core wrapper is devised in a way that adequately represents and efficiently implements the main characteristics of the formal model of computation, namely Kahn process networks, which we use as the underlying programming model in ESPAM. We present details about the structure of the HW module, the supported types of IP cores, and the minimum interfaces these IP cores have to provide in order to allow automated integration in heterogeneous multiprocessor systems generated by ESPAM. The ESPAM design flow, the multiprocessor platforms we consider, and the underlying programming (KPN) model are introduced as well. Furthermore, we demonstrate the efficiency of our approach by applying our methodology and the ESPAM tool to automatically generate, implement, and program heterogeneous multiprocessor systems that integrate dedicated IP cores and execute real-life applications.

  19. Digital coal mine integrated automation system based on Controlnet

    Energy Technology Data Exchange (ETDEWEB)

    Jin-yun Chen; Shen Zhang; Wei-ran Zuo [China University of Mining and Technology, Xuzhou (China). School of Chemical Engineering and Technology

    2007-06-15

    A three-layer model for digital communication in a mine is proposed. Two basic platforms are discussed: a uniform transmission network and a uniform data warehouse. An actual, ControlNet based, transmission network platform suitable for the Jining No.3 coal mine in China is presented. This network is an information superhighway intended to integrate all existing and new automation subsystems. Its standard interface can be used with future subsystems. The network, data structure and management decision-making all employ this uniform hardware and software. This effectively avoids the problems of system and information islands seen in traditional mine-automation systems. The construction of the network provides a stable foundation for digital communication in the Jining No.3 coal mine. 9 refs., 5 figs.

  20. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  1. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    Science.gov (United States)

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  2. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  3. An automated and integrated framework for dust storm detection based on ogc web processing services

    Science.gov (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, dust storm detecting and tracking component, and service chain orchestration engine. The EO data processing component is implemented based on OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models, which are SBDART model (for computing aerosol optical depth (AOT) of dust particles), WRF model (for simulating meteorological parameters) and HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on Business Process Execution Language for Web Service (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data
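
For readers unfamiliar with OGC WPS, the sketch below shows what a single key-value-pair Execute request to such a service could look like from Python; the endpoint URL, process identifier and input names are hypothetical placeholders, and the real framework chains such calls through a BPEL engine.

```python
# Hedged sketch of a WPS 1.0.0 key-value-pair Execute request; all service
# details below are hypothetical, not the paper's deployment.
import requests

WPS_ENDPOINT = "https://example.org/wps"   # hypothetical WPS server

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "Identifier": "DustStormDetection",                       # hypothetical process
    "DataInputs": "region=EastAsia;start=2012-04-26;end=2012-04-28",
}

response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
# The WPS response is an XML ExecuteResponse document; downstream tooling
# (e.g. a BPEL orchestration engine, as in the paper) would parse it and
# chain further calls such as the transport simulation.
print(response.text[:500])
```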

  4. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

Full Text Available Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion: Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  5. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
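
Among the statistical methods listed in both records is unsupervised data reduction with PCA. The snippet below illustrates that single step on synthetic binned 1D NMR-like data with scikit-learn; it is not Automics code and the data are invented.

```python
# Illustration of PCA on synthetic spectral bins; stand-in data only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
n_samples, n_bins = 40, 200

# Two synthetic groups (e.g. control vs. type 2 diabetes) differing in a
# handful of spectral bins.
spectra = rng.normal(0, 0.1, (n_samples, n_bins))
spectra[20:, 50:55] += 1.0          # group-specific "metabolite" signal

scores = PCA(n_components=2).fit_transform(spectra)
print("PC1 separates the groups:",
      scores[:20, 0].mean().round(2), "vs", scores[20:, 0].mean().round(2))
```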

  6. Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.

    Science.gov (United States)

    Allan, Ferne C.; Shields, Joyce M.

    1986-01-01

    A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment that were available in-house in preparation for an expected integrated library system. Automated processes include authors' file of items authored by employees, journal routing (including routing slips), statistics, journal…

  7. Functional integration of automated system databases by means of artificial intelligence

    Science.gov (United States)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of using such databases in systems with a fuzzy implementation of functions were analyzed, and requirements for the normalization of such databases were defined. The question of data equivalence under uncertainty and of collisions in the presence of functional database integration is considered, and a model to reveal their possible occurrence is devised. The paper also presents an evaluation method for the standardization of integrated database normalization.

  8. CMS Distributed Computing Integration in the LHC sustained operations era

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Bockelman, B; Fisk, I

    2011-01-01

After many years of preparation, the CMS computing system has reached a situation where stability in operations limits the possibility of introducing innovative features. Nevertheless, it is this same need for stability and smooth operations that requires the introduction of features that were considered non-strategic in previous phases. Examples are: adequate authorization to control and prioritize access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks in the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular, we describe the introduction of new middleware features during the last 18 months as well as the requirements placed on Grid and Cloud software developers for the future.

  9. Office automation: The administrative window into the integrated DBMS

    Science.gov (United States)

    Brock, G. H.

    1985-01-01

In parallel to the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three to five year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system, manned by a team of facilitators seeking opportunities to serve end users, could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the Manager's Management Information System.

  10. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  11. 76 FR 63941 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-10-14

    ... Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New Information... Internet by federal agencies through efforts like USCIS' Business Transformation initiative. The USCIS ELIS... the USCIS Business Transformation initiative and wizard technology. The supporting statement can be...

  12. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

This paper proposes an integrated Markovian and back propagation neural network approach to compute the reliability of a system. Since the states of failure occurrence are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back propagation neural network approach to compute reliability. • Markovian based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
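
As a small numerical illustration of the Markovian half of the approach, the sketch below solves the steady-state equations of a hypothetical three-state availability model for one AGV (up, degraded, down) with NumPy; the states and rates are invented and the paper's neural-network component is not shown.

```python
# Solve pi * Q = 0 with sum(pi) = 1 for a small continuous-time Markov model.
import numpy as np

lam1, lam2 = 0.02, 0.05     # failure rates: up->degraded, degraded->down (per hour)
mu1, mu2 = 0.20, 0.10       # repair rates: degraded->up, down->up (per hour)

# Generator matrix Q for states [up, degraded, down].
Q = np.array([
    [-lam1,          lam1,   0.0 ],
    [  mu1, -(mu1 + lam2),  lam2 ],
    [  mu2,           0.0,  -mu2 ],
])

# Replace one balance equation with the normalization condition and solve.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)

print("steady-state probabilities (up, degraded, down):", pi.round(4))
print("availability (up or degraded):", (pi[0] + pi[1]).round(4))
```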

  13. Autonomous Integrated Receive System (AIRS) requirements definition. Volume 4: Functional specification for the prototype Automated Integrated Receive System (AIRS)

    Science.gov (United States)

    Chie, C. M.

    1984-01-01

    The functional requirements for the performance, design, and testing for the prototype Automated Integrated Receive System (AIRS) to be demonstrated for the TDRSS S-Band Single Access Return Link are presented.

  14. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
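
The following toy sketch illustrates the blackboard pattern described above: stub recognition engines post competing hypotheses for the courtesy amount to a shared blackboard, and the best-scoring one is selected. It is an illustration of the architecture only, not the described system or real recognizers.

```python
# Toy blackboard: knowledge sources contribute hypotheses opportunistically.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    field: str       # e.g. "courtesy_amount"
    value: str
    confidence: float
    source: str

class Blackboard:
    def __init__(self):
        self.hypotheses = []

    def post(self, hypothesis: Hypothesis) -> None:
        self.hypotheses.append(hypothesis)

    def best(self, field: str) -> Hypothesis:
        return max((h for h in self.hypotheses if h.field == field),
                   key=lambda h: h.confidence)

def courtesy_engine_a(image, board):   # stub recognizer
    board.post(Hypothesis("courtesy_amount", "125.00", 0.82, "engine_a"))

def courtesy_engine_b(image, board):   # stub recognizer
    board.post(Hypothesis("courtesy_amount", "126.00", 0.61, "engine_b"))

board = Blackboard()
for engine in (courtesy_engine_a, courtesy_engine_b):
    engine(image=None, board=board)    # each knowledge source contributes
print(board.best("courtesy_amount"))
```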

  15. Enterprise Integration of Management and Automation in a Refinery

    Science.gov (United States)

    Wang, Chengen

Traditionally, problems in a petroleum refinery were separately modeled and solved with respect to individual disciplines. The segregated implementation of various disciplinary technologies resulted in considerable barriers impeding the pursuit of globally optimal performance. It is recognized that enterprise-wide integration of the managerial and automation systems is of fundamental significance for refineries to respond promptly to global market requirements. In this paper, the technical implementations are categorized by discipline into managerial and automatic systems. Then, typical managerial and automatic implementations in a refinery are depicted to give insight into the heterogeneous data sources manipulated by these systems. Finally, an integration approach based on data reconciliation techniques is proposed to link up the heterogeneous data sources.

  16. An Accelerated Testing Approach for Automated Vehicles with Background Traffic Described by Joint Distributions

    OpenAIRE

    Huang, Zhiyuan; Lam, Henry; Zhao, Ding

    2017-01-01

    This paper proposes a new framework based on joint statistical models for evaluating risks of automated vehicles in a naturalistic driving environment. The previous studies on the Accelerated Evaluation for automated vehicles are extended from multi-independent-variate models to joint statistics. The proposed toolkit includes exploration of the rare event (e.g. crash) sets and construction of accelerated distributions for Gaussian Mixture models using Importance Sampling techniques. Furthermo...
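
A compact illustration of the importance sampling idea behind Accelerated Evaluation is given below: a rare-event probability under a Gaussian-mixture "naturalistic" model is estimated by sampling from a proposal shifted toward the event region. The mixture parameters and threshold are made-up numbers, not the paper's models.

```python
# Importance sampling for a rare event under a two-component Gaussian mixture.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
N = 200_000
THRESHOLD = 6.0            # "crash-relevant" region of the variable

# Naturalistic density p(x): mixture 0.7*N(0,1) + 0.3*N(1,1.5^2).
w1, w2 = 0.7, 0.3
def p(x):
    return w1 * norm.pdf(x, 0.0, 1.0) + w2 * norm.pdf(x, 1.0, 1.5)

# Accelerated proposal q(x): shifted toward the rare region.
q_mean, q_std = 6.0, 1.5
x = rng.normal(q_mean, q_std, N)
weights = p(x) / norm.pdf(x, q_mean, q_std)
estimate = np.mean((x > THRESHOLD) * weights)

# Crude Monte Carlo from the mixture itself, for contrast.
comp = rng.random(N) < w1
crude_x = np.where(comp, rng.normal(0.0, 1.0, N), rng.normal(1.0, 1.5, N))
crude = np.mean(crude_x > THRESHOLD)

print(f"importance sampling: {estimate:.3e}, crude MC: {crude:.3e}")
```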

  17. Integration issues of distributed generation in distribution grids

    NARCIS (Netherlands)

    Coster, E.J.; Myrzik, J.M.A.; Kruimer, B.; Kling, W.L.

    2011-01-01

    In today’s distribution grids the number of distributed generation (DG) units is increasing rapidly. Combined heat and power (CHP) plants and wind turbines are most often installed. Integration of these DG units into the distribution grid leads to planning as well as operational challenges. Based on

  18. Automated Peak Picking and Peak Integration in Macromolecular NMR Spectra Using AUTOPSY

    Science.gov (United States)

    Koradi, Reto; Billeter, Martin; Engeli, Max; Güntert, Peter; Wüthrich, Kurt

    1998-12-01

A new approach for automated peak picking of multidimensional protein NMR spectra with strong overlap is introduced, which makes use of the program AUTOPSY (automated peak picking for NMR spectroscopy). The main elements of this program are a novel function for local noise level calculation, the use of symmetry considerations, and the use of lineshapes extracted from well-separated peaks for resolving groups of strongly overlapping peaks. The algorithm generates peak lists with precise chemical shift and integral intensities, and a reliability measure for the recognition of each peak. The results of automated peak picking of NOESY spectra with AUTOPSY were tested in combination with the combined automated NOESY cross peak assignment and structure calculation routine NOAH implemented in the program DYANA. The quality of the resulting structures was found to be comparable with those from corresponding data obtained with manual peak picking.
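
Two ingredients highlighted above are a local noise-level estimate and peak picking against it. The generic 1D-signal sketch below (NumPy/SciPy) illustrates those two steps only; it is not the AUTOPSY algorithm, which additionally exploits symmetry and extracted lineshapes.

```python
# Local noise estimation (windowed MAD) plus thresholded peak picking.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(3)
n = 2000
spectrum = rng.normal(0, 1.0, n)                 # baseline noise
for pos, amp in [(400, 25.0), (405, 18.0), (1200, 40.0)]:
    spectrum += amp * np.exp(-0.5 * ((np.arange(n) - pos) / 3.0) ** 2)

# Local noise level: robust (MAD-based) sigma estimated in windows.
win = 200
noise = np.array([
    1.4826 * np.median(np.abs(spectrum[i:i + win] - np.median(spectrum[i:i + win])))
    for i in range(0, n, win)
]).repeat(win)[:n]

# Keep only maxima that rise well above their local noise floor.
peaks, _ = find_peaks(spectrum, height=6.0 * noise)
print("picked peaks at points:", peaks)
```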

  19. An Integrated Systems Approach: A Description of an Automated Circulation Management System.

    Science.gov (United States)

    Seifert, Jan E.; And Others

    These bidding specifications describe requirements for a turn-key automated circulation system for the University of Oklahoma Libraries. An integrated systems approach is planned, and requirements are presented for various subsystems: acquisitions, fund accounting, reserve room, and bibliographic and serials control. Also outlined are hardware…

  20. Application of high performance asynchronous socket communication in power distribution automation

    Science.gov (United States)

    Wang, Ziyu

    2017-05-01

With the development of information technology and Internet technology, and the growing demand for electricity, the stable and reliable operation of the power system has been the goal of power grid workers. With the advent of the era of big data, power data will gradually become an important means of guaranteeing the safe and reliable operation of the power grid. In the electric power industry, efficiently and robustly receiving the data transmitted by data acquisition devices, so that the distribution automation system can make sound decisions quickly, is therefore a key goal. In this paper, some existing problems in power system communication are analysed and, with the help of network technology, a set of solutions based on asynchronous socket technology is proposed for network communication that meets the requirements of high concurrency and high throughput. The paper also looks forward to the development of power distribution automation in the era of big data and artificial intelligence.
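
A minimal sketch of the high-concurrency asynchronous socket pattern the paper advocates is shown below: a single asyncio event loop services many data-acquisition connections concurrently instead of one blocking thread per device. The port and line-oriented message format are hypothetical.

```python
# One event loop handling many feeder terminal connections concurrently.
import asyncio

async def handle_terminal(reader: asyncio.StreamReader,
                          writer: asyncio.StreamWriter) -> None:
    peer = writer.get_extra_info("peername")
    try:
        while data := await reader.readline():       # one measurement per line
            # In a real system the frame would be parsed, validated and handed
            # to the distribution-automation decision logic here.
            print(f"{peer}: {data.decode().strip()}")
            writer.write(b"ACK\n")
            await writer.drain()
    finally:
        writer.close()
        await writer.wait_closed()

async def main() -> None:
    server = await asyncio.start_server(handle_terminal, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```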

  1. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can

  2. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  3. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  4. Planning and Resource Management in an Intelligent Automated Power Management System

    Science.gov (United States)

    Morris, Robert A.

    1991-01-01

    Power system management is a process of guiding a power system towards the objective of continuous supply of electrical power to a set of loads. Spacecraft power system management requires planning and scheduling, since electrical power is a scarce resource in space. The automation of power system management for future spacecraft has been recognized as an important R&D goal. Several automation technologies have emerged including the use of expert systems for automating human problem solving capabilities such as rule based expert system for fault diagnosis and load scheduling. It is questionable whether current generation expert system technology is applicable for power system management in space. The objective of the ADEPTS (ADvanced Electrical Power management Techniques for Space systems) is to study new techniques for power management automation. These techniques involve integrating current expert system technology with that of parallel and distributed computing, as well as a distributed, object-oriented approach to software design. The focus of the current study is the integration of new procedures for automatically planning and scheduling loads with procedures for performing fault diagnosis and control. The objective is the concurrent execution of both sets of tasks on separate transputer processors, thus adding parallelism to the overall management process.

  5. Distributed GIS for automated natural hazard zonation mapping Internet-SMS warning towards sustainable society

    Directory of Open Access Journals (Sweden)

    Devanjan Bhattacharya

    2014-12-01

    Open systems are needed for real-time analysis of, and warnings about, geo-hazards, and can be built on an open-source Geographical Information System (GIS) platform such as GeoNode, which is being contributed to by developers around the world. Developing on an open-source platform is a vital component of better disaster information management as far as spatial data infrastructures are concerned, and becomes essential when huge databases, particularly satellite images and maps of locations, must be created and consulted regularly for city planning at different scales. There is a strong need for the creation, analysis, and management of spatially referenced data. GeoNode, being an open-source platform, facilitates the creation, sharing, and collaborative use of geospatial data. The objective of this work is the development of an automated natural hazard zonation system with Internet-short message service (SMS) warning, utilizing geomatics for sustainable societies. The research puts forward the concept of an internet-resident geospatial geohazard warning system that can communicate alerts via SMS. There has been a need for an automated, integrated system that categorizes hazards and issues warnings that reach users directly; at present, no web-enabled warning system exists that can disseminate warnings after hazard evaluation in one go and in real time. The objective of this research has therefore been to formalize the notion of an integrated, independent, generalized, and automated geo-hazard warning system making use of geo-spatial data under a popular usage platform. In this paper, a model of such an automated geo-spatial hazard warning system is elaborated. The functionality is modular in architecture, with GIS-graphical user interface (GUI), input, understanding, rainfall prediction, expert, output, and warning modules. A
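
    The modular architecture above (input, rainfall prediction, expert, output and warning modules) is described only at the conceptual level; the sketch below is a purely hypothetical Python illustration of how such modules might be chained, with invented thresholds and a stubbed SMS gateway.

      def classify_hazard(rainfall_mm_24h, slope_deg):
          # Hypothetical expert-module rule: thresholds are illustrative only.
          if rainfall_mm_24h > 150 and slope_deg > 30:
              return "HIGH"
          if rainfall_mm_24h > 80:
              return "MODERATE"
          return "LOW"

      def send_sms(phone_number, message):
          # Stub for the warning module; a real deployment would call an SMS
          # gateway API here.
          print(f"SMS to {phone_number}: {message}")

      def run_warning_cycle(stations, subscribers):
          # Input module -> expert module -> warning module, per station.
          for station in stations:
              level = classify_hazard(station["rain_24h"], station["slope"])
              if level != "LOW":
                  for phone in subscribers.get(station["id"], []):
                      send_sms(phone, f"{level} landslide hazard near {station['name']}")

      run_warning_cycle(
          stations=[{"id": "S1", "name": "Ridge Road", "rain_24h": 162, "slope": 35}],
          subscribers={"S1": ["+10000000000"]},
      )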

  6. Toward automated interpretation of integrated information: Managing "big data" for NDE

    Science.gov (United States)

    Gregory, Elizabeth; Lesthaeghe, Tyler; Holland, Stephen

    2015-03-01

    Large scale automation of NDE processes is rapidly maturing, thanks to recent improvements in robotics and the rapid growth of computer power over the last twenty years. It is fairly straightforward to automate NDE data collection itself, but the process of NDE remains largely manual. We will discuss three threads of technological needs that must be addressed before we are able to perform automated NDE. Spatial context, the first thread, means that each NDE measurement taken is accompanied by metadata that locates the measurement with respect to the 3D physical geometry of the specimen. In this way, the geometry of the specimen acts as a database key. Data context, the second thread, means that we record why the data was taken and how it was measured in addition to the NDE data itself. We will present our software tool that helps users interact with data in context, Databrowse. Condition estimation, the third thread, is maintaining the best possible knowledge of the condition (serviceability, degradation, etc.) of an object or part. In the NDE context, we can prospectively use Bayes' Theorem to integrate the data from each new NDE measurement with prior knowledge. These tools, combined with robotic measurements and automated defect analysis, will provide the information needed to make high-level life predictions and focus NDE measurements where they are needed most.
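
    The abstract only names Bayes' theorem as the fusion rule for condition estimation; a minimal sketch of that idea, with invented prior and likelihood numbers, might look like the following.

      def bayes_update(prior, likelihood, observation):
          # prior:      dict state -> P(state)
          # likelihood: dict state -> {observation value -> P(observation | state)}
          unnormalised = {s: prior[s] * likelihood[s][observation] for s in prior}
          total = sum(unnormalised.values())
          return {s: p / total for s, p in unnormalised.items()}

      # Illustrative numbers only: probability that an ultrasonic indication is
      # seen given the part is "degraded" versus "healthy".
      prior = {"degraded": 0.05, "healthy": 0.95}
      likelihood = {"degraded": {"indication": 0.90, "clear": 0.10},
                    "healthy": {"indication": 0.08, "clear": 0.92}}

      posterior = bayes_update(prior, likelihood, "indication")
      print(posterior)  # belief in "degraded" rises from 5% to roughly 37%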

  7. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    Science.gov (United States)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  8. Implementation strategies for load center automation on the space station module/power management and distribution testbed

    Science.gov (United States)

    Watson, Karen

    1990-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) testbed was developed to study the tertiary power management on modules in large spacecraft. The main goal was to study automation techniques, not necessarily to develop flight-ready systems. Because of the confidence gained in many of the automation strategies investigated, it is appropriate to study implementation strategies in more detail in order to find better trade-offs for nearer-to-flight-ready systems. These trade-offs particularly concern the weight, volume, power consumption, and performance of the automation system. These systems, in their present implementation, are described.

  9. Automation and Integration in Semiconductor Manufacturing

    OpenAIRE

    Liao, Da-Yin

    2010-01-01

    Semiconductor automation originates from the prevention and avoidance of frauds in daily fab operations. As semiconductor technology and business continuously advance and grow, manufacturing systems must aggressively evolve to meet the changing technical and business requirements of this industry. Semiconductor manufacturing has long suffered from islands of automation. The problems associated with these systems are limited

  10. Planning and control of automated material handling systems: The merge module

    NARCIS (Netherlands)

    Haneyah, S.W.A.; Hurink, Johann L.; Schutten, Johannes M.J.; Zijm, Willem H.M.; Schuur, Peter; Hu, Bo; Morasch, Karl; Pickl, Stefan; Siegle, Markus

    2011-01-01

    We address the field of internal logistics, embodied in Automated Material Handling Systems (AMHSs), which are complex installations employed in sectors such as Baggage Handling, Physical Distribution, and Parcel & Postal. We work on designing an integral planning and real-time control architecture,

  11. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors by enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as Out-of-the-Loop (OOTL), and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation that assures the best human operator performance, a quantitative method of optimizing the automation is proposed in this paper. To propose the optimization method for determining appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration is conducted so as to derive the shortest working time through the concept of situation awareness recovery (SAR), which holds that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the
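
    The truncated abstract outlines the optimisation idea without its equations; the sketch below uses a purely hypothetical working-time model (linear benefit from automation, quadratic situation-awareness-recovery penalty) just to show the shape of such an optimisation. None of the coefficients come from the paper.

      def working_time(automation_rate, base_time=100.0, sar_penalty=60.0):
          # Hypothetical model: automation shortens nominal task time linearly,
          # while time lost to situation awareness recovery (the ostracism
          # effect) grows with the square of the automation rate.
          manual_part = base_time * (1.0 - automation_rate)
          sar_part = sar_penalty * automation_rate ** 2
          return manual_part + sar_part

      candidates = [i / 100 for i in range(0, 101)]
      best = min(candidates, key=working_time)
      print(f"optimised automation rate ~ {best:.2f}, "
            f"working time ~ {working_time(best):.1f}")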

  12. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
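
    SAAB's actual inference algorithm is not given in the abstract; as a hedged illustration of deciding a storage area's status from the history of monitoring test outcomes, the toy class below blacklists a site when its recent failure ratio crosses a threshold. The window length and thresholds are invented for the example.

      from collections import deque

      class StorageAreaStatus:
          # Toy status tracker: not the ATLAS SAAB algorithm, just the idea.

          def __init__(self, window=20, blacklist_ratio=0.7, whitelist_ratio=0.2):
              self.history = deque(maxlen=window)   # True = monitoring test passed
              self.blacklist_ratio = blacklist_ratio
              self.whitelist_ratio = whitelist_ratio
              self.blacklisted = False

          def record(self, test_passed):
              self.history.append(test_passed)
              failure_ratio = self.history.count(False) / len(self.history)
              if not self.blacklisted and failure_ratio >= self.blacklist_ratio:
                  self.blacklisted = True           # trigger automatic outage handling
              elif self.blacklisted and failure_ratio <= self.whitelist_ratio:
                  self.blacklisted = False          # site recovered, re-enable it
              return self.blacklisted

      site = StorageAreaStatus()
      for outcome in [True, True, False, False, False, False, False, False, False, False]:
          site.record(outcome)
      print("blacklisted:", site.blacklisted)       # True (8 of 10 recent tests failed)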

  13. Distribution definition of path integrals

    International Nuclear Information System (INIS)

    Kerler, W.

    1979-01-01

    By starting from quantum mechanics it turns out that a rather general definition of quantum functional integrals can be given which is based on distribution theory. It applies also to curved space and provides clear rules for non-linear transformations. The refinements necessary in usual definitions of path integrals are pointed out. Since the quantum nature requires special care with time sequences, it is not the classical phase space which occurs in the phase-space form of the path integral. Feynman's configuration-space form only applies to a highly specialized situation, and therefore is not a very advantageous starting point for general investigations. It is shown that the commonly used substitutions of variables do not properly account for quantum effects. The relation to the traditional ordering problem is clarified. The distribution formulation has allowed to treat constrained systems directly at the quantum level, to complete the path integral formulation of the equivalence theorem, and to define functional integrals also for space translation after the transition to fields. (orig.)
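
    For orientation, the naive textbook phase-space form referred to here is usually written as below; it is quoted only as the starting point, since the abstract's point is precisely that quantum time-ordering modifies what appears in place of the classical phase space.

      \[
        \langle q_f, t_f \mid q_i, t_i \rangle
        = \int \mathcal{D}q \, \mathcal{D}p \;
          \exp\!\left\{ \frac{i}{\hbar} \int_{t_i}^{t_f} dt \, \bigl[ p\,\dot q - H(p,q) \bigr] \right\}
      \]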

  14. Power system voltage stability and agent based distribution automation in smart grid

    Science.gov (United States)

    Nguyen, Cuong Phuc

    2011-12-01

    Our interconnected electric power system is presently facing many challenges that it was not originally designed and engineered to handle. The increased inter-area power transfers, aging infrastructure, and old technologies, have caused many problems including voltage instability, widespread blackouts, slow control response, among others. These problems have created an urgent need to transform the present electric power system to a highly stable, reliable, efficient, and self-healing electric power system of the future, which has been termed "smart grid". This dissertation begins with an investigation of voltage stability in bulk transmission networks. A new continuation power flow tool for studying the impacts of generator merit order based dispatch on inter-area transfer capability and static voltage stability is presented. The load demands are represented by lumped load models on the transmission system. While this representation is acceptable in traditional power system analysis, it may not be valid in the future smart grid where the distribution system will be integrated with intelligent and quick control capabilities to mitigate voltage problems before they propagate into the entire system. Therefore, before analyzing the operation of the whole smart grid, it is important to understand the distribution system first. The second part of this dissertation presents a new platform for studying and testing emerging technologies in advanced Distribution Automation (DA) within smart grids. Due to the key benefits over the traditional centralized approach, namely flexible deployment, scalability, and avoidance of single-point-of-failure, a new distributed approach is employed to design and develop all elements of the platform. A multi-agent system (MAS), which has the three key characteristics of autonomy, local view, and decentralization, is selected to implement the advanced DA functions. The intelligent agents utilize a communication network for cooperation and

  15. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
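
    CATALYST itself is written in Java and Perl and its internals are not shown here; the Python sketch below only illustrates the throttling idea, capping how many I/O-heavy jobs run concurrently. Function names and the job body are placeholders.

      import concurrent.futures
      import time

      def run_job(job_id):
          # Placeholder for an I/O-bound processing job (staging inputs,
          # running a processing executable, writing outputs).
          time.sleep(0.1)
          return f"job {job_id} done"

      def throttled_submit(job_ids, max_concurrent=4):
          # The pool size acts as the throttle: no more than max_concurrent
          # jobs compete for disk I/O at any one time.
          with concurrent.futures.ThreadPoolExecutor(max_workers=max_concurrent) as pool:
              for result in pool.map(run_job, job_ids):
                  print(result)

      throttled_submit(range(10))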

  16. MIDAS: Automated Approach to Design Microwave Integrated Inductors and Transformers on Silicon

    Directory of Open Access Journals (Sweden)

    L. Aluigi

    2013-09-01

    The design of modern radiofrequency integrated circuits on silicon operating at microwave and millimeter-wave frequencies requires the integration of several spiral inductors and transformers that are not commonly available in the process design kits of the technologies. In this work we present an auxiliary CAD tool for Microwave Inductor (and transformer) Design Automation on Silicon (MIDAS) that exploits commercial simulators and allows the implementation of an automatic design flow, including three-dimensional layout editing and electromagnetic simulations. In detail, MIDAS allows the designer to derive a preliminary sizing of the inductor (transformer) on the basis of the design entries (specifications). It draws the inductor (transformer) layers for the specific process design kit, including vias and underpasses, with or without a patterned ground shield, and launches the electromagnetic simulations, achieving effective design automation with respect to the traditional design flow for RFICs. With the present software suite the complete design time is reduced significantly (typically 1 hour on a PC based on an Intel® Pentium® Dual 1.80 GHz CPU with 2 GB of RAM). Afterwards both the device equivalent circuit and the layout are ready to be imported into the Cadence environment.

  17. Thermal Distribution System | Energy Systems Integration Facility | NREL

    Science.gov (United States)

    The Energy Systems Integration Facility's thermal distribution system routes loads through a thermal distribution bus; its 60-ton chiller cools water with continuous thermal control at loads as low as 10% of its full load level.

  18. Integration of Real-Time Data Into Building Automation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mark J. Stunder; Perry Sebastian; Brenda A. Chube; Michael D. Koontz

    2003-04-16

    The project goal was to investigate the possibility of using predictive real-time information from the Internet as an input to building management system algorithms. The objectives were to identify the types of information most valuable to commercial and residential building owners, managers, and system designers; to comprehensively investigate and document currently available electronic real-time information suitable for use in building management systems; to verify the reliability of the information and recommend accreditation methods for data and providers; to assess methodologies to automatically retrieve and utilize the information; to characterize the equipment required to implement automated integration; to demonstrate the feasibility and benefits of using the information in building management systems; and to identify evolutionary control strategies.
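
    As a minimal sketch of the pattern the report investigates (predictive real-time information from the Internet feeding a building-management rule), the following stubs out the data retrieval and applies an invented pre-cooling rule; the threshold values and function names are assumptions, not findings of the report.

      def fetch_forecast_high_temp():
          # Placeholder: a real implementation would call a weather service here
          # (e.g. via urllib.request) and parse the response.
          return 34.0  # tomorrow's forecast high, degrees C (sample value)

      def precool_decision(forecast_high_c, comfort_setpoint_c=24.0):
          # Illustrative rule: pre-cool the building overnight when a hot day
          # is forecast, to shave the afternoon cooling peak.
          if forecast_high_c >= 32.0:
              return comfort_setpoint_c - 2.0   # pre-cooling setpoint
          return comfort_setpoint_c

      print("overnight setpoint:", precool_decision(fetch_forecast_high_temp()))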

  19. A distributed substation automation model based on the multi-agents technology; Um modelo distribuido de automacao de subestacoes baseado em tecnologia multiagentes

    Energy Technology Data Exchange (ETDEWEB)

    Geus, Klaus de; Milsztajn, Flavio; Kolb, Carlos Jose Johann; Dometerco, Jose Henrique; Souza, Alexandre Mendonca de; Braga, Ciro de Carvalho; Parolin, Emerson Luis; Frisch, Arlenio Carneiro; Fortunato Junior, Luiz Kiss; Erzinger Junior, Augusto; Jonack, Marco Antonio; Guiera, Anderson Juliano Azambuja [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)]. E-mail: klaus@copel.com; flaviomil@copel.com; kolb@copel.com; dometerc@copel.com; alexandre.mendonca@copel.com; ciro@copel.com; parolin@copel.com; arlenio@copel.com; luiz.kiss@copel.com; aerzinger@copel.com; jonack@copel.com; guiera@copel.com

    2006-10-15

    The main purpose of this paper is to analyse distributed computing technology which can be used in substation automation systems. Based on performance comparative results obtained in laboratory, a specific model for distributed substation automation is proposed considering the current model employed at COPEL - Companhia Paranaense de Energia. The proposed model is based on the multi-agents technology, which has lately received special attention in the development of distributed systems with local intelligence. (author)

  20. Advanced fighter technology integration (AFTI)/F-16 Automated Maneuvering Attack System final flight test results

    Science.gov (United States)

    Dowden, Donald J.; Bessette, Denis E.

    1987-01-01

    The AFTI F-16 Automated Maneuvering Attack System has undergone developmental and demonstration flight testing over a total of 347.3 flying hours in 237 sorties. The emphasis of this phase of the flight test program was on the development of automated guidance and control systems for air-to-air and air-to-ground weapons delivery, using a digital flight control system, dual avionics multiplex buses, an advanced FLIR sensor with laser ranger, integrated flight/fire-control software, advanced cockpit display and controls, and modified core Multinational Stage Improvement Program avionics.

  1. Overview of NREL Distribution Grid Integration Cost Projects

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mather, Barry A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Denholm, Paul L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-12

    This presentation was given at the 2017 NREL Workshop 'Benchmarking Distribution Grid Integration Costs Under High Distributed PV Penetrations.' It provides a brief overview of recent and ongoing NREL work on distribution system grid integration costs, as well as challenges and needs from the community.

  2. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  3. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David [Univ. of California, Berkeley, CA (United States); Culler, David [Univ. of California, Berkeley, CA (United States); Wright, Paul [Univ. of California, Berkeley, CA (United States); Lu, Yan [Siemens Corporate Research Inc., Princeton, NJ (United States); Piette, Mary [Univ. of California, Berkeley, CA (United States)

    2013-03-31

    The goal of the 2.5 year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load

  4. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  5. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how much has changed, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to audiences. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden problems in media content. It discusses the system framework and the workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
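
    The paper's parallel QC experiments are not detailed in the abstract; the sketch below shows the general single-machine form of the idea, fanning a stubbed per-file QC check out over worker processes. The check itself is a placeholder, not CCTV's QC criterion.

      from concurrent.futures import ProcessPoolExecutor

      def qc_check(path):
          # Placeholder QC: a real check would decode the file and look for
          # black frames, silence, loudness violations, dropouts, etc.
          issues = []
          if path.endswith(".bad"):
              issues.append("decode error")
          return path, issues

      def run_qc(paths, workers=4):
          with ProcessPoolExecutor(max_workers=workers) as pool:
              for path, issues in pool.map(qc_check, paths):
                  status = "OK" if not issues else "; ".join(issues)
                  print(f"{path}: {status}")

      if __name__ == "__main__":
          run_qc(["clip_001.mxf", "clip_002.mxf", "promo_003.bad"])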

  6. Regulatory Improvements for Effective Integration of Distributed Generation into Electricity Distribution Networks

    International Nuclear Information System (INIS)

    Scheepers, M.J.J.; Jansen, J.C.; De Joode, J.; Bauknecht, D.; Gomez, T.; Pudjianto, D.; Strbac, G.; Ropenus, S.

    2007-11-01

    The growth of distributed electricity supply from renewable energy sources (RES-E) and combined heat and power (CHP) - so-called distributed generation (DG) - can cause technical problems for electricity distribution networks. These integration problems can be overcome by reinforcing the network. Many European Member States apply network regulation that does not account for the impact of DG growth on network costs. Passing network integration costs on to the DG operator who is responsible for these extra costs may result in discrimination between different DG plants and between DG and large power generation. Therefore, in many regulatory systems distribution system operators (DSOs) are not compensated for the DG integration costs. The DG-GRID project analysed technical and economic barriers to the integration of distributed generation into electricity distribution networks. The project looked into the impact of high DG deployment on electricity distribution system costs and into the impact on the financial position of the DSO. Several ways of improving network regulation in order to compensate DSOs for increasing DG penetration were identified and tested. The DG-GRID project also looked into stimulating network innovations through economic regulation. The project was co-financed by the European Commission and carried out by nine European universities and research institutes. This report summarises the project results and is based on a number of DG-GRID reports that describe the conducted analyses and their results

  7. Integrating CLIPS applications into heterogeneous distributed systems

    Science.gov (United States)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
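
    SOCIAL's real agent API is not reproduced in this excerpt; the following schematic Python analogue only conveys the "wrapper" idea, in which an embedded application exchanges data and commands through message queues rather than direct calls. The toy_diagnoser stands in for a CLIPS-based expert system and is entirely invented.

      import queue

      class Agent:
          # Schematic wrapper: an application embedded behind message queues.

          def __init__(self, name, application):
              self.name = name
              self.application = application      # e.g. an expert-system facade
              self.inbox = queue.Queue()
              self.outbox = queue.Queue()

          def post(self, message):
              self.inbox.put(message)

          def step(self):
              # Deliver one pending message to the wrapped application and
              # publish whatever it answers.
              message = self.inbox.get()
              reply = self.application(message)
              if reply is not None:
                  self.outbox.put({"from": self.name, "reply": reply})
              return reply

      def toy_diagnoser(message):
          # Stand-in for a rule-based diagnoser: maps a symptom to an action.
          rules = {"valve_stuck": "replace actuator", "sensor_drift": "recalibrate"}
          return rules.get(message.get("symptom"), "no diagnosis")

      agent = Agent("diagnosis", toy_diagnoser)
      agent.post({"symptom": "valve_stuck"})
      print(agent.step())   # -> "replace actuator"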

  8. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  9. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    Science.gov (United States)

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  10. Automated element identification for EDS spectra evaluation using quantification and integrated spectra simulation approaches

    International Nuclear Information System (INIS)

    Eggert, F

    2010-01-01

    This work describes the first truly automated solution for qualitative evaluation of EDS spectra in X-ray microanalysis. It uses a combination of integrated standardless quantitative evaluation, computation of analytical errors into a final uncertainty, and parts of recently developed simulation approaches. Multiple spectrum reconstruction assessments and peak searches of the residual spectrum are powerful enough to answer the qualitative analytical question automatically for totally unknown specimens. The integrated quantitative assessment is useful for improving the confidence of the qualitative analysis. Therefore, the qualitative element analysis becomes part of an integrated quantitative spectrum evaluation, where the quantitative results are used to iteratively refine element decisions, spectrum deconvolution, and simulation steps.
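
    The abstract describes the loop (quantify, reconstruct the spectrum, search the residual for unexplained peaks, refine the element list) without giving code; the toy sketch below mimics that loop on a synthetic one-dimensional "spectrum". All channels, elements and thresholds are invented for illustration.

      # Toy spectrum: channel -> counts, with three synthetic peaks.
      spectrum = {100: 500.0, 250: 300.0, 400: 120.0}
      line_table = {100: "Fe", 250: "Cr", 400: "Ni"}   # channel -> element (invented)

      def reconstruct(identified):
          # Crude "model": assume peaks of identified elements are fully explained.
          return {ch: (spectrum[ch] if line_table.get(ch) in identified else 0.0)
                  for ch in spectrum}

      def identify(threshold=50.0):
          identified = set()
          while True:
              model = reconstruct(identified)
              residual = {ch: spectrum[ch] - model[ch] for ch in spectrum}
              ch_max = max(residual, key=residual.get)
              if residual[ch_max] < threshold:
                  break                           # nothing significant left unexplained
              identified.add(line_table[ch_max])  # accept the element, then iterate
          return identified

      print(identify())   # -> {'Fe', 'Cr', 'Ni'}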

  11. Integration of 100% Micro-Distributed Energy Resources in the Low Voltage Distribution Network

    DEFF Research Database (Denmark)

    You, Shi; Segerberg, Helena

    2014-01-01

    The existing electricity infrastructure may to a great extent limit a high penetration of micro-sized Distributed Energy Resources (DERs), due to physical bottlenecks, e.g. the thermal capacities of cables and transformers and the voltage limitations. In this study, the integration impacts of heat pumps (HPs) and plug-in electric vehicles (PEVs) at 100% penetration level on a representative urban residential low voltage (LV) distribution network of Denmark are investigated by performing a steady-state load flow analysis through an integrated simulation setup. Three DER integration ... oriented integration strategies, having 100% integration of DER in the provided LV network is feasible.

  12. Coordinating complex decision support activities across distributed applications

    Science.gov (United States)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  13. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating the test cases designed for testing a web interface provides a means of improving the software development process by shortening the testing phase of the software development life cycle. In this project an existing AutoTester framework and the iMacros test automation tool were used. CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  14. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

    In order to maintain a competitive edge in a very active banking market, a company requested the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and the generation of new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for ...

  15. Distribution Integrity Management Plan (DIMP)

    Energy Technology Data Exchange (ETDEWEB)

    Gonzales, Jerome F. [Los Alamos National Laboratory

    2012-05-07

    This document is the distribution integrity management plan (Plan) for the Los Alamos National Laboratory (LANL) Natural Gas Distribution System. This Plan meets the requirements of 49 CFR Part 192, Subpart P, Distribution Integrity Management Programs (DIMP) for the LANL Natural Gas Distribution System. This Plan was developed by reviewing records and interviewing LANL personnel. The records consist of design, construction, operation and maintenance records for the LANL Natural Gas Distribution System. The records system for the LANL Natural Gas Distribution System is limited, so the majority of information is based on the judgment of LANL employees: the maintenance crew, the Corrosion Specialist and the Utilities and Infrastructure (UI) Civil Team Leader. The records used in this report are: Pipeline and Hazardous Materials Safety Administration (PHMSA) 7100.1-1, Report of Main and Service Line Inspection, Natural Gas Leak Survey, Gas Leak Response Report, Gas Leak and Repair Report, and Pipe-to-Soil Recordings. The specific elements of knowledge of the infrastructure used to evaluate each threat and prioritize risks are listed in Sections 6 and 7, Threat Evaluation and Risk Prioritization respectively. This Plan addresses additional information needed and a method for gaining that data over time through normal activities. The processes used for the initial assessment of Threat Evaluation and Risk Prioritization are the methods found in the Simple, Handy Risk-based Integrity Management Plan (SHRIMP™) software package developed by the American Public Gas Association (APGA) Security and Integrity Foundation (SIF). SHRIMP™ uses an index model developed by the consultants and advisors of the SIF. Threat assessment is performed using questions developed by the Gas Piping Technology Committee (GPTC) as modified and added to by the SHRIMP™ advisors. This Plan is required to be reviewed every 5 years to be continually refined and improved. Records

  16. Aviation safety/automation program overview

    Science.gov (United States)

    Morello, Samuel A.

    1990-01-01

    The goal is to provide a technology base leading to improved safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers. Information on the problems, specific objectives, human-automation interaction, intelligent error-tolerant systems, and air traffic control/cockpit integration is given in viewgraph form.

  17. Westinghouse integrated cementation facility. Smart process automation minimizing secondary waste

    International Nuclear Information System (INIS)

    Fehrmann, H.; Jacobs, T.; Aign, J.

    2015-01-01

    The Westinghouse Cementation Facility described in this paper is an example of a typical standardized turnkey project in the area of waste management. The facility is able to handle NPP waste such as evaporator concentrates, spent resins and filter cartridges. The facility scope covers all equipment required for a fully integrated system, including all required auxiliary equipment for the hydraulic, pneumatic and electric control systems. The control system is based on current PLC technology and the process is highly automated. The equipment is designed to be remotely operated under radiation exposure conditions. Four cementation facilities have been built for new CPR-1000 nuclear power stations in China

  18. Context-awareness in task automation services by distributed event processing

    OpenAIRE

    Coronado Barrios, Miguel; Bruns, Ralf; Dunkel, Jürgen; Stipković, Sebastian

    2014-01-01

    Everybody has to coordinate several tasks everyday, usually in a manual manner. Recently, the concept of Task Automation Services has been introduced to automate and personalize the task coordination problem. Several user centered platforms and applications have arisen in the last years, that let their users configure their very own automations based on third party services. In this paper, we propose a new system architecture for Task Automation Services in a heterogeneous mobile, smart devic...

  19. DOMOTICS Aplicability and home automation systems

    Directory of Open Access Journals (Sweden)

    César Luiz de Azevedo Dias

    2010-05-01

    This article discusses the benefits and applicability of domestic automation, also known as Domotics. According to the Domotics Integration Project (DIP), Domotics or smart house technology is the integration of services and technologies applied to homes, flats, apartments, houses and small buildings with the purpose of automating them and obtaining and increasing safety and security, comfort, communication and technical management. This paper also presents a summary of the elements which may be part of a “smart home”, the advantages given by their integration, and illustrations of various systems and technologies applied to domestic automation that have achieved both national and international commercial relevance.

  20. Prospective validation of a near real-time EHR-integrated automated SOFA score calculator.

    Science.gov (United States)

    Aakre, Christopher; Franco, Pablo Moreno; Ferreyra, Micaela; Kitson, Jaben; Li, Man; Herasevich, Vitaly

    2017-07-01

    We created an algorithm for automated Sequential Organ Failure Assessment (SOFA) score calculation within the Electronic Health Record (EHR) to facilitate detection of sepsis based on the Third International Consensus Definitions for Sepsis and Septic Shock (SEPSIS-3) clinical definition. We evaluated the accuracy of near real-time and daily automated SOFA score calculation compared with manual score calculation. Automated SOFA scoring computer programs were developed using available EHR data sources and integrated into a critical care focused patient care dashboard at Mayo Clinic in Rochester, Minnesota. We prospectively compared the accuracy of automated versus manual calculation for a sample of patients admitted to the medical intensive care unit at Mayo Clinic Hospitals in Rochester, Minnesota and Jacksonville, Florida. Agreement was calculated with Cohen's kappa statistic. Reason for discrepancy was tabulated during manual review. Random spot check comparisons were performed 134 times on 27 unique patients, and daily SOFA score comparisons were performed for 215 patients over a total of 1206 patient days. Agreement between automatically scored and manually scored SOFA components for both random spot checks (696 pairs, κ=0.89) and daily calculation (5972 pairs, κ=0.89) was high. The most common discrepancies were in the respiratory component (inaccurate fraction of inspired oxygen retrieval; 200/1206) and creatinine (normal creatinine in patients with no urine output on dialysis; 128/1094). 147 patients were at risk of developing sepsis after intensive care unit admission, 10 later developed sepsis confirmed by chart review. All were identified before onset of sepsis with the ΔSOFA≥2 point criterion and 46 patients were false-positives. Near real-time automated SOFA scoring was found to have strong agreement with manual score calculation and may be useful for the detection of sepsis utilizing the new SEPSIS-3 definition.
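
    As a hedged illustration of the kind of rules such a calculator encodes, the sketch below scores two of the six SOFA organ systems and applies the ΔSOFA ≥ 2 criterion; the cut-offs follow the commonly published SOFA tables but are simplified (urine output, respiratory support and the other four components are omitted) and should be checked against the original score before any use.

      def renal_sofa(creatinine_mg_dl):
          # Simplified renal component (ignores the urine-output criteria).
          if creatinine_mg_dl >= 5.0:  return 4
          if creatinine_mg_dl >= 3.5:  return 3
          if creatinine_mg_dl >= 2.0:  return 2
          if creatinine_mg_dl >= 1.2:  return 1
          return 0

      def coagulation_sofa(platelets_k_per_ul):
          if platelets_k_per_ul < 20:   return 4
          if platelets_k_per_ul < 50:   return 3
          if platelets_k_per_ul < 100:  return 2
          if platelets_k_per_ul < 150:  return 1
          return 0

      def sepsis_flag(baseline_total, current_total):
          # SEPSIS-3 organ-dysfunction criterion used in the paper: an acute
          # rise of at least 2 SOFA points in a patient with suspected infection.
          return (current_total - baseline_total) >= 2

      baseline = renal_sofa(0.9) + coagulation_sofa(220)     # 0 points
      current  = renal_sofa(2.3) + coagulation_sofa(90)      # 2 + 2 = 4 points
      print(sepsis_flag(baseline, current))                  # True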

  1. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  2. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    Science.gov (United States)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next generation all-optical information processing. At this time a sufficiently powerful EDA-style software tool chain to design circuits of this complexity does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or with a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, which is infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics, which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell), similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
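
    The Luceda/IPKISS API itself is not shown in this truncated abstract; purely to illustrate the "parametric cell as Python object, composed hierarchically via inheritance" idea, here is a schematic sketch that is not the vendor's API and uses invented parameters.

      class PCell:
          # Schematic parametric cell: geometry derived from parameters.

          def __init__(self, name, **params):
              self.name = name
              self.params = params

          def layout(self):
              raise NotImplementedError

      class GratingCoupler(PCell):
          def __init__(self, name, period_um=0.63, n_periods=25):
              super().__init__(name, period_um=period_um, n_periods=n_periods)

          def layout(self):
              # Return a placeholder geometry description instead of real GDS data.
              return [("grating_tooth", i * self.params["period_um"])
                      for i in range(self.params["n_periods"])]

      class Transceiver(PCell):
          # Hierarchical (composite) cell built from child PCells.

          def __init__(self, name):
              super().__init__(name)
              self.children = [GratingCoupler("gc_in"), GratingCoupler("gc_out")]

          def layout(self):
              return {child.name: child.layout() for child in self.children}

      print(len(Transceiver("txrx").layout()))   # two child cells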

  3. National Space Science Data Center data archive and distribution service (NDADS) automated retrieval mail system user's guide

    Science.gov (United States)

    Perry, Charleen M.; Vansteenberg, Michael E.

    1992-01-01

    The National Space Science Data Center (NSSDC) has developed an automated data retrieval request service utilizing our Data Archive and Distribution Service (NDADS) computer system. NDADS currently has selected project data written to optical disk platters, with the disks residing in a robotic 'jukebox' near-line environment. This allows rapid and automated access to the data with no staff intervention required. Automated help information and user services are also available. The request system permits an average-size data request to be completed within minutes of the request being sent to NSSDC. A mail message, in the format described in this document, retrieves the data and can send it to a remote site. Also listed in this document are the data currently available.

  4. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  6. Integrals over products of distributions and coordinate independence of zero-temperature path integrals

    International Nuclear Information System (INIS)

    Kleinert, H.; Chervyakov, A.

    2003-01-01

    In perturbative calculations of quantum-statistical zero-temperature path integrals in curvilinear coordinates one encounters Feynman diagrams involving multiple temporal integrals over products of distributions, which are mathematically undefined. In addition, there are terms proportional to powers of Dirac δ-functions at the origin coming from the measure of path integration. We give simple rules for integrating products of distributions in such a way that the results ensure coordinate independence of the path integrals. The rules are derived by using equations of motion and partial integration, while keeping track of certain minimal features originating in the unique definition of all singular integrals in 1-ε dimensions. Our rules yield the same results as the much more cumbersome calculations in 1-ε dimensions where the limit ε→0 is taken at the end. They also agree with the rules found in an independent treatment on a finite time interval

  7. Consistent integrated automation. Optimized power plant control by means of IEC 61850; Durchgaengig automatisieren. Optimierte Kraftwerksleittechnik durch die Norm IEC 61850

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany). Geschaeftsbereich Power Generation]

    2007-07-01

    Today's power plants are highly automated. All subsystems of large thermal power plants can be controlled from a central control room. The electrical systems are an important part. In future the new standard IEC 61850 will improve the integration of electrical systems into automation of power plants supporting the reduction of operation and maintenance cost. (orig.)

  8. A development framework for artificial intelligence based distributed operations support systems

    Science.gov (United States)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications for unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively, via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  9. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution system is increasing all around the world, because of the flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation on the influence of full converter based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, connection type of WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...
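
    As a back-of-the-envelope illustration of how a converter-based WT changes the fault level at a bus (all numbers below are assumptions, not taken from the paper), the WT contribution during a fault is roughly its converter current limit, which can be compared with the external grid contribution:

```python
# Illustrative sketch (assumed textbook approximations, not the paper's model):
# compare three-phase fault current at a distribution bus with and without a
# full-converter wind turbine, whose fault contribution is limited to ~1.2 pu
# of its rated current by the converter controls.
import math

V_LL = 20e3            # line-to-line voltage of the feeder [V]   (assumption)
S_sc_grid = 250e6      # external grid short-circuit capacity [VA] (assumption)
S_wt = 3e6             # wind turbine rating [VA]                  (assumption)
k_wt = 1.2             # converter current limit in pu of rated    (assumption)

I_grid = S_sc_grid / (math.sqrt(3) * V_LL)        # grid contribution [A]
I_wt_rated = S_wt / (math.sqrt(3) * V_LL)         # WT rated current [A]
I_wt_fault = k_wt * I_wt_rated                    # converter-limited contribution

print(f"grid only : {I_grid:8.0f} A")
print(f"with WT   : {I_grid + I_wt_fault:8.0f} A "
      f"(+{100 * I_wt_fault / I_grid:.1f} %)")
```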

  10. Integrating standard operating procedures with spacecraft automation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation has the potential to assist crew members and spacecraft operators in managing spacecraft systems during extended space missions. Automation can...

  11. Integrating Standard Operating Procedures with Spacecraft Automation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation can be used to greatly reduce the demands on crew member and flight controllers time and attention. Automation can monitor critical resources,...

  12. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
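
    A minimal sketch of the underlying mixed integer programming idea is shown below, assuming a toy item bank and the open-source PuLP solver; the paper's actual formulation additionally orders the items and formats the test form.

```python
# Minimal sketch of automated test assembly as a mixed integer program
# (hypothetical item bank; the paper's formulation additionally covers
# item ordering and test-form formatting).  Requires the PuLP package.
import pulp

# item bank: (information at the target ability level, word count) -- assumptions
items = [(0.42, 60), (0.35, 45), (0.51, 80), (0.28, 30), (0.47, 70), (0.39, 55)]
n_select = 3          # test length
max_words = 200       # formatting-style constraint on total length

prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(len(items))]

prob += pulp.lpSum(info * x[i] for i, (info, _) in enumerate(items))      # maximize information
prob += pulp.lpSum(x) == n_select                                         # fixed test length
prob += pulp.lpSum(words * x[i] for i, (_, words) in enumerate(items)) <= max_words

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("selected items:", [i for i in range(len(items)) if x[i].value() == 1])
```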

  13. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  14. The Space Station Module Power Management and Distribution automation test bed

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  15. Automated radiotherapy treatment plan integrity verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Moore, Kevin L. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, St. Louis, Missouri 63110 (United States)

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
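
    A generic sketch of this style of rule-based plan checking with an HTML summary is given below; the field names, tolerance rules and report layout are illustrative assumptions, not the PINNACLE/PERL implementation described above.

```python
# Generic sketch of automated plan-integrity checks with an HTML summary
# (illustrative rules and field names; not the PINNACLE/PERL tool described).
plan = {  # run-time plan data, e.g. exported from the TPS -- assumed structure
    "prescription_dose_gy": 60.0,
    "fractions": 30,
    "dose_grid_mm": 4.0,
    "beam_isocenters": [(0.1, 1.2, -3.0), (0.1, 1.2, -3.0)],
}

def run_checks(p):
    """Each rule returns (name, passed, message)."""
    yield ("dose per fraction", 1.5 <= p["prescription_dose_gy"] / p["fractions"] <= 3.0,
           "expected 1.5-3.0 Gy/fraction")
    yield ("dose grid", p["dose_grid_mm"] <= 3.0, "grid should be <= 3 mm")
    yield ("single isocenter", len(set(p["beam_isocenters"])) == 1,
           "all beams should share one isocenter")

rows = "".join(
    f"<tr><td>{name}</td><td>{'PASS' if ok else 'FAIL'}</td><td>{msg}</td></tr>"
    for name, ok, msg in run_checks(plan))
with open("plan_qc_report.html", "w") as f:
    f.write(f"<html><body><table border='1'>{rows}</table></body></html>")
```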

  16. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.

  17. Using an integrated automated system to optimize retention and increase frequency of blood donations.

    Science.gov (United States)

    Whitney, J Garrett; Hall, Robert F

    2010-07-01

    This study examines the impact of an integrated, automated phone system to reinforce retention and increase frequency of donations among blood donors. Cultivated by incorporating data results over the past 7 years, the system uses computerized phone messaging to contact blood donors with individualized, multilevel notifications. Donors are contacted at planned intervals to acknowledge and recognize their donations, informed where their blood was sent, asked to participate in a survey, and reminded when they are eligible to donate again. The report statistically evaluates the impact of the various components of the system on donor retention and blood donations and quantifies the fiscal advantages to blood centers. By using information and support systems provided by the automated services and then incorporating the phlebotomists and recruiters to reinforce donor retention, both retention and donations will increase. © 2010 American Association of Blood Banks.

  18. Automation facilities for agricultural machinery control

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2017-01-01

    Full Text Available The possibility of using automation equipment for agricultural machinery control is investigated. The authors propose solutions for creating a centralized, unified automated information system for the management of mobile aggregates. In accordance with modern requirements, this system should be open and integrated into the general control schema of the agricultural enterprise. Standard hardware, software and communication features should be realized in monitoring and control tasks; the schema should therefore be built from unified modules and comply with Russian standards. A complex, multivariate, unified automated control system for different objects of agricultural purpose, based on block and modular design, should satisfy the following principles: high reliability, simplicity of service, low operating expenses, a short payback period connected with increased productivity, reduced losses during harvesting, postharvest processing and storage, and improved energy indices. Technological processes in agricultural production are generally controlled with feedback; an example without feedback is program control of the temperature in a storage facility in the cooling mode. Feedback in the control of technological processes in agricultural production allows optimal solution of the problem of rational distribution of functions in man-distributed systems and the forming of intelligent ergonomic interfaces consistent with the professional perceptions of decision-makers. The negative feedback created by the control unit automatically maintains a quality index of the technological process at the set level. The quantitative analysis of a production situation rests on the deeply formalized basis of computer facilities, which promotes making the optimal decision. Introduction of an automated information control system increases labor productivity by 40 percent and reduces energy costs by 25 percent. Improvement of quality of the executed technological
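
    The negative-feedback idea mentioned above, holding a quality index of the technological process at a set level, can be illustrated with a minimal proportional-integral loop; the plant model and gains below are toy assumptions, not taken from the article.

```python
# Minimal sketch of negative feedback holding a quality index at a setpoint
# (toy first-order process and proportional-integral gains are assumptions).
setpoint = 0.90          # desired quality index
quality = 0.60           # current value of the process output
Kp, Ki, dt = 0.8, 0.3, 1.0
integral = 0.0

for step in range(10):
    error = setpoint - quality            # negative feedback: measured output
    integral += error * dt                # is compared against the setpoint
    u = Kp * error + Ki * integral        # control action
    quality += 0.5 * (u - 0.2 * quality)  # toy plant response
    print(f"step {step}: quality = {quality:.3f}")
```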

  19. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    Lobato Hoyos, Ivan; Rojas Tapia, Justo; Instituto Peruano de Energia Nuclear, Lima

    2008-01-01

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program allows the calculation of important quantities such as the pair correlation function, the analysis of common neighbors, the counting of nanoparticles and their size distribution, and the conversion of output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. The use of the program is illustrated with application examples involving the calculation of various properties of silver nanoparticles. (author)
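
    As an illustration of the kind of quantity such a tool computes, the sketch below estimates a pair correlation function g(r) from particle coordinates; it uses random positions as stand-in data and is not the tool described in the record.

```python
# Minimal sketch of a pair correlation function g(r) from particle positions
# (random coordinates in a cubic box stand in for molecular-dynamics output).
import numpy as np

rng = np.random.default_rng(0)
L, n = 20.0, 500                       # box edge and particle count (assumptions)
pos = rng.uniform(0, L, size=(n, 3))

r_max, nbins = 8.0, 80
edges = np.linspace(0, r_max, nbins + 1)
hist = np.zeros(nbins)

for i in range(n - 1):
    d = pos[i + 1:] - pos[i]
    d -= L * np.round(d / L)           # minimum-image convention
    r = np.linalg.norm(d, axis=1)
    hist += np.histogram(r[r < r_max], bins=edges)[0]

rho = n / L**3
shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
g = 2.0 * hist / (n * rho * shell_vol)  # factor 2: each pair counted once
print(g[:5])                            # ~1 everywhere for an ideal gas
```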

  20. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval demonstrated identical automation-related operator errors, suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  1. Working toward integrated models of alpine plant distribution.

    Science.gov (United States)

    Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe

    2013-10-01

    Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.
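
    For reference, the correlative SDM approach that the authors argue is insufficient on its own can be sketched in a few lines: a statistical model of occurrence against environmental predictors. The synthetic data and predictor choices below are assumptions, not the authors' model.

```python
# Minimal correlative SDM sketch: logistic regression of species occurrence on
# environmental predictors (synthetic data; the paper argues such models need
# to be complemented by process-based components in alpine terrain).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
temp = rng.normal(2.0, 3.0, n)          # mean summer temperature [deg C]
snow = rng.normal(150, 40, n)           # snow-cover duration [days]
# true (synthetic) response: the species prefers cool sites with long snow cover
p = 1 / (1 + np.exp(-(-0.8 * temp + 0.03 * (snow - 150) + 1.0)))
occurrence = rng.binomial(1, p)

X = np.column_stack([temp, snow])
model = LogisticRegression().fit(X, occurrence)
print("coefficients:", model.coef_)                       # sign/strength of drivers
print("P(presence) at -1 degC, 180 days of snow:",
      model.predict_proba([[-1.0, 180.0]])[0, 1])
```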

  2. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2013-05-01

    Full Text Available In order to maintain a competitive edge in a very active banking market, a company requires the implementation of a web-based solution to standardize, optimize and manage the flow of sales / pre-sales and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on the requirements for security and confidentiality of stored data and on the identified techniques and procedures to implement these requirements.

  3. Employment Opportunities for the Handicapped in Programmable Automation.

    Science.gov (United States)

    Swift, Richard; Leneway, Robert

    A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…

  4. An Integrated Simulation Module for Cyber-Physical Automation Systems

    Directory of Open Access Journals (Sweden)

    Francesco Ferracuti

    2016-05-01

    Full Text Available The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called “GILOO” (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new “Advanced Sky GUI” have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. The GILOO module has been

  5. An Integrated Simulation Module for Cyber-Physical Automation Systems.

    Science.gov (United States)

    Ferracuti, Francesco; Freddi, Alessandro; Monteriù, Andrea; Prist, Mariorosario

    2016-05-05

    The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called "GILOO" (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new "Advanced Sky GUI" have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. The GILOO module has been applied to a smart home

  6. Ask the experts: automation: part I.

    Science.gov (United States)

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  7. NIF ICCS Test Controller for Automated and Manual Testing

    International Nuclear Information System (INIS)

    Zielinski, J S

    2007-01-01

    The National Ignition Facility (NIF) Integrated Computer Control System (ICCS) is a large (1.5 MSLOC), hierarchical, distributed system that controls all aspects of the NIF laser [1]. The ICCS team delivers software updates to the NIF facility throughout the year to support shot operations and commissioning activities. In 2006, there were 48 releases of ICCS: 29 full releases, 19 patches. To ensure the quality of each delivery, thousands of manual and automated tests are performed using the ICCS Test Controller test infrastructure. The TestController system provides test inventory management, test planning, automated test execution and manual test logging, release testing summaries and test results search, all through a web browser interface. Automated tests include command line based frameworks server tests and Graphical User Interface (GUI) based Java tests. Manual tests are presented as a checklist-style web form to be completed by the tester. The results of all tests, automated and manual, are kept in a common repository that provides data to dynamic status reports. As part of the 3-stage ICCS release testing strategy, the TestController system helps plan, evaluate and track the readiness of each release to the NIF facility

  8. Mathematical methods linear algebra normed spaces distributions integration

    CERN Document Server

    Korevaar, Jacob

    1968-01-01

    Mathematical Methods, Volume I: Linear Algebra, Normed Spaces, Distributions, Integration focuses on advanced mathematical tools used in applications and the basic concepts of algebra, normed spaces, integration, and distributions.The publication first offers information on algebraic theory of vector spaces and introduction to functional analysis. Discussions focus on linear transformations and functionals, rectangular matrices, systems of linear equations, eigenvalue problems, use of eigenvectors and generalized eigenvectors in the representation of linear operators, metric and normed vector

  9. A Study on Integrated Control Network for Multiple Automation Services-1st year report

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, D.H.; Park, B.S.; Kim, M.S.; Lim, Y.H.; Ahn, S.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report describes the development of the Integrated and Intelligent Gateway (IIG), which is under development. The network operating technique described in this report can identify the causes of communication faults and can avoid communication network faults in advance. Utility companies spend large financial investments and much time on supplying stabilized power. Since this is deeply related to the reliability of automation systems, it is natural to employ a fault-tolerant communication network for automation systems. Use of the network system developed in this report is not limited to DAS; it can be expanded to many kinds of data services for customers. Thus this report suggests the direction of communication network development. This 1st year report is composed of the following contents: 1) the introduction and problems of DAS; 2) the configuration and functions of the IIG; 3) the protocols. (author). 27 refs., 73 figs., 6 tabs.

  10. Automation of innovative facade systems with integrated technical building equipment under consideration of comfort aspects; Automatisierung innovativer Fassadensysteme mit integrierter technischer Gebaeudeausruestung unter Beruecksichtigung von Behaglichkeitsaspekten

    Energy Technology Data Exchange (ETDEWEB)

    Hasert, Anita; Becker, Martin [Hochschule Biberach (Germany). Inst. fuer Gebaeude- und Energiesysteme (IGE)

    2012-07-01

    Facades are not merely a shell around protected living spaces and a boundary between the indoor climate and a building's environment. They are also converting from formerly passive elements into active building systems that perform various room-conditioning functions (heating, cooling, ventilation, lighting, and so on). The associated increased demands on system integration into facades require new solutions for the planning, implementation and operation of these innovative systems. To handle this growing complexity intelligently, superior automation strategies for facade automation have to be developed. These automation strategies have to coordinate the individual functions with each other and to ensure functionality across building domains. A further criterion for the design of integrated facade systems is the consideration of user comfort as well as user control and user acceptance. Within the research project AUTiFAS (= Automation of innovative facade systems), different automation strategies for facade and room automation are examined on the basis of metrological investigations and simulation analyses. For this purpose, an innovative facade element with a decentralized ventilation unit and an integrated sunshade was first integrated into a test room. The functionality and the structural tightness of the complete test stand were verified and matched to the requirements of the tests. With the objective of developing a standardized description of the control and regulation functions of cross-domain automation strategies, an automation library was developed based on standard structures and forms of representation, using a test facade as an example. The standards DIN EN 15232 and IEC 61131 as well as the guidelines VDI 3813 and VDI 3814 provide the fundamentals. The developed automation strategies form the basis for the development of

  11. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers support the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching driver's expectations.

  12. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following describes key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  13. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  14. Improvement of power quality using distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Munoz, A.; Lopez-Rodriguez, M.A.; Flores-Arias, J.M.; Bellido-Outerino, F.J. [Universidad de Cordoba, Departamento A.C., Electronica y T.E., Escuela Politecnica Superior, Campus de Rabanales, E-14071 Cordoba (Spain); de-la-Rosa, J.J.G. [Universidad de Cadiz, Area de Electronica, Dpto. ISA, TE y Electronica, Escuela Politecnica Superior Avda, Ramon Puyol, S/N, E-11202-Algeciras-Cadiz (Spain); Ruiz-de-Adana, M. [Universidad de Cordoba, Departamento de Quimica Fisica y Termodinamica Aplicada, Campus de Rabanales, E-14071 Cordoba (Spain)

    2010-12-15

    This paper addresses how Distributed Generation (DG), particularly when configured in Combined Heat and Power (CHP) mode, can become a powerful reliability solution in highly automated factories, especially when integrated with complementary Power Quality (PQ) measures. The paper presents results from the PQ audit conducted at a highly automated plant over the last year. It was found that the main problems for the installed equipment were voltage sags. Among all categories of electrical disturbances, the voltage sag (dip) and momentary interruption are the nemeses of the automated industrial process. The paper analyzes the capabilities of modern electronic power supplies and the convenience of embedded solutions. Finally, the role of DG/CHP in the reliability of digital factories is addressed. (author)

  15. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    Science.gov (United States)

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547(1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors for correction of line shape distortions. These occur due to differences between the water reference and the metabolite distributions.
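
    The core arithmetic of water reference deconvolution can be sketched as follows: the metabolite signal is divided by the acquired water reference and multiplied by an idealized reference decay, so that instrument-induced lineshape distortions common to both signals cancel. The synthetic signals and decay constants below are assumptions; the published pipeline adds parametric spectral analysis on top.

```python
# Sketch of the core arithmetic of water-reference deconvolution (synthetic
# signals; the published pipeline adds parametric spectral analysis on top).
import numpy as np

t = np.arange(2048) * 1e-3                      # time axis [s]
distortion = np.exp(2j * np.pi * 3.0 * t**2)    # eddy-current-like phase error

ideal_water = np.exp(-t / 0.05)                           # idealized reference decay
water_fid = ideal_water * distortion                      # acquired water reference
metab_fid = 0.1 * np.exp(2j * np.pi * 120 * t - t / 0.08) * distortion

eps = 1e-12
corrected = metab_fid * ideal_water / (water_fid + eps)   # distortion cancels
spectrum = np.fft.fftshift(np.fft.fft(corrected))
print("peak height before/after:",
      np.abs(np.fft.fft(metab_fid)).max(), np.abs(spectrum).max())
```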

  16. Rules for integrals over products of distributions from coordinate independence of path integrals

    International Nuclear Information System (INIS)

    Kleinert, H.; Chervyakov, A.

    2001-01-01

    In perturbative calculations of quantum-mechanical path integrals in curvilinear coordinates, one encounters Feynman diagrams involving multiple temporal integrals over products of distributions which are mathematically undefined. In addition, there are terms proportional to powers of Dirac δ-functions at the origin coming from the measure of path integration. We derive simple rules for dealing with such singular terms from the natural requirement of coordinate independence of the path integrals. (orig.)

  17. Integrated Transmission and Distribution Control

    Energy Technology Data Exchange (ETDEWEB)

    Kalsi, Karanjit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fuller, Jason C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lian, Jianming [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Wei [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Marinovici, Laurentiu D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fisher, Andrew R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chassin, Forrest S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hauer, Matthew L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-01-01

    Distributed generation, demand response, distributed storage, smart appliances, electric vehicles and renewable energy resources are expected to play a key part in the transformation of the American power system. Control, coordination and compensation of these smart grid assets are inherently interlinked. Advanced control strategies to warrant large-scale penetration of distributed smart grid assets do not currently exist. While many of the smart grid technologies proposed involve assets being deployed at the distribution level, most of the significant benefits accrue at the transmission level. The development of advanced smart grid simulation tools, such as GridLAB-D, has led to a dramatic improvement in the models of smart grid assets available for design and evaluation of smart grid technology. However, one of the main challenges to quantifying the benefits of smart grid assets at the transmission level is the lack of tools and framework for integrating transmission and distribution technologies into a single simulation environment. Furthermore, given the size and complexity of the distribution system, it is crucial to be able to represent the behavior of distributed smart grid assets using reduced-order controllable models and to analyze their impacts on the bulk power system in terms of stability and reliability.

  18. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with
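
    A toy version of such an automated format/data QA/QC pass is sketched below; the column names, timestamp convention and plausibility limits are illustrative assumptions and not the actual AmeriFlux FP-In specification.

```python
# Sketch of an automated format/data QA/QC pass over a flux-data CSV
# (column names and plausibility limits are illustrative, not the actual
# AmeriFlux FP-In specification).
import csv, datetime, io

REQUIRED = ["TIMESTAMP_START", "FC", "TA"]
LIMITS = {"FC": (-50.0, 50.0), "TA": (-60.0, 60.0)}   # plausible ranges (assumed)

sample = io.StringIO(
    "TIMESTAMP_START,FC,TA\n201701010000,-2.1,3.4\n201701010030,999.0,3.6\n")

issues = []
reader = csv.DictReader(sample)
missing = [c for c in REQUIRED if c not in reader.fieldnames]
if missing:
    issues.append(f"missing columns: {missing}")

for lineno, row in enumerate(reader, start=2):
    try:
        datetime.datetime.strptime(row["TIMESTAMP_START"], "%Y%m%d%H%M")
    except ValueError:
        issues.append(f"line {lineno}: bad timestamp {row['TIMESTAMP_START']!r}")
    for col, (lo, hi) in LIMITS.items():
        if not lo <= float(row[col]) <= hi:
            issues.append(f"line {lineno}: {col}={row[col]} outside [{lo}, {hi}]")

print("\n".join(issues) or "no issues found")
```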

  19. Automated detection of fluorescent cells in in-resin fluorescence sections for integrated light and electron microscopy.

    Science.gov (United States)

    Delpiano, J; Pizarro, L; Peddie, C J; Jones, M L; Griffin, L D; Collinson, L M

    2018-04-26

    Integrated array tomography combines fluorescence and electron imaging of ultrathin sections in one microscope, and enables accurate high-resolution correlation of fluorescent proteins to cell organelles and membranes. Large numbers of serial sections can be imaged sequentially to produce aligned volumes from both imaging modalities, thus producing enormous amounts of data that must be handled and processed using novel techniques. Here, we present a scheme for automated detection of fluorescent cells within thin resin sections, which could then be used to drive automated electron image acquisition from target regions via 'smart tracking'. The aim of this work is to aid in optimization of the data acquisition process through automation, freeing the operator to work on other tasks and speeding up the process, while reducing data rates by only acquiring images from regions of interest. This new method is shown to be robust against noise and able to deal with regions of low fluorescence. © 2018 The Authors. Journal of Microscopy published by JohnWiley & Sons Ltd on behalf of Royal Microscopical Society.
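
    A minimal sketch of the detection step, thresholding and connected-component labelling of a synthetic fluorescence image with scikit-image, is given below; the published method is more elaborate and explicitly robust to noise and regions of low fluorescence.

```python
# Sketch of fluorescent-cell detection by thresholding and labelling a
# synthetic image (the published approach is more robust to noise and to
# regions of low fluorescence).  Requires numpy and scikit-image.
import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(2)
img = rng.normal(0.1, 0.02, (256, 256))           # background + noise
yy, xx = np.mgrid[0:256, 0:256]
for cy, cx in [(60, 70), (150, 200), (200, 90)]:  # three synthetic "cells"
    img += 0.5 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8 ** 2))

mask = img > filters.threshold_otsu(img)          # global Otsu threshold
labels = measure.label(mask)                      # connected components
regions = [r for r in measure.regionprops(labels) if r.area > 20]

print(f"detected {len(regions)} fluorescent regions")
for r in regions:
    print("centroid:", tuple(round(c, 1) for c in r.centroid))
```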

  20. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  1. Understanding reliance on automation: effects of error type, error distribution, age and experience

    Science.gov (United States)

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over relying on it during non-alarms states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  2. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and it’s continuous integration process in hydroinformatics systems. This distributed framework mainly consists of server cluster for models, distributed database, GIS (Geographic Information System) servers, master node and clients. Based on it, a GIS - based decision support system for joint regulating of water quantity and water quality of group lakes in Wuhan China is established.

  3. Integrating photovoltaics into utility distribution systems

    International Nuclear Information System (INIS)

    Zaininger, H.W.; Barnes, P.R.

    1995-01-01

    Electric utility distribution system impacts associated with the integration of distributed photovoltaic (PV) energy sources vary from site to site and utility to utility. The objective of this paper is to examine several utility- and site-specific conditions which may affect economic viability of distributed PV applications to utility systems. Assessment methodology compatible with technical and economic assessment techniques employed by utility engineers and planners is employed to determine PV benefits for seven different utility systems. The seven case studies are performed using utility system characteristics and assumptions obtained from appropriate utility personnel. The resulting site-specific distributed PV benefits increase nonsite-specific generation system benefits available to central station PV plants as much as 46%, for one utility located in the Southwest

  4. Trust in automation: integrating empirical evidence on factors that influence trust.

    Science.gov (United States)

    Hoff, Kevin Anthony; Bashir, Masooda

    2015-05-01

    We systematically review recent empirical research on factors that influence trust in automation to present a three-layered trust model that synthesizes existing knowledge. Much of the existing research on factors that guide human-automation interaction is centered around trust, a variable that often determines the willingness of human operators to rely on automation. Studies have utilized a variety of different automated systems in diverse experimental paradigms to identify factors that impact operators' trust. We performed a systematic review of empirical research on trust in automation from January 2002 to June 2013. Papers were deemed eligible only if they reported the results of a human-subjects experiment in which humans interacted with an automated system in order to achieve a goal. Additionally, a relationship between trust (or a trust-related behavior) and another variable had to be measured. Altogether, 101 papers, containing 127 eligible studies, were included in the review. Our analysis revealed three layers of variability in human-automation trust (dispositional trust, situational trust, and learned trust), which we organize into a model. We propose design recommendations for creating trustworthy automation and identify environmental conditions that can affect the strength of the relationship between trust and reliance. Future research directions are also discussed for each layer of trust. Our three-layered trust model provides a new lens for conceptualizing the variability of trust in automation. Its structure can be applied to help guide future research and develop training interventions and design procedures that encourage appropriate trust. © 2014, Human Factors and Ergonomics Society.

  5. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  6. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  7. [Analysis of foreign experience of usage of automation systems of medication distribution in prevention and treatment facilities].

    Science.gov (United States)

    Miroshnichenko, Iu V; Umarov, S Z

    2012-12-01

    One of the ways to increase the effectiveness and safety of patients' medication supply is the use of automated distribution systems, which substantially increase the efficiency and safety of medication supply, achieve significant savings of material and financial resources spent on medication assistance, and make possible the systematic improvement of its accessibility and quality.

  8. Market Integration Dynamics and Asymptotic Price Convergence in Distribution

    NARCIS (Netherlands)

    A. García-Hiernaux (Alfredo); D.E. Guerrero (David); M.J. McAleer (Michael)

    2013-01-01

    In this paper we analyse the market integration process of the relative price distribution, develop a model to analyze market integration, and present a formal test of increasing market integration. We distinguish between the economic concepts of price convergence in mean and in

  9. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  10. Studying the Impact of Distributed Solar PV on Power Systems using Integrated Transmission and Distribution Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Himanshu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krad, Ibrahim [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-24

    This paper presents the results of a distributed solar PV impact assessment study that was performed using a synthetic integrated transmission (T) and distribution (D) model. The primary objective of the study was to present a new approach for distributed solar PV impact assessment, where along with detailed models of transmission and distribution networks, consumer loads were modeled using the physics of end-use equipment, and distributed solar PV was geographically dispersed and connected to the secondary distribution networks. The highlights of the study results were (i) increase in the Area Control Error (ACE) at high penetration levels of distributed solar PV; and (ii) differences in distribution voltages profiles and voltage regulator operations between integrated T&D and distribution only simulations.

  11. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  12. CIPSS [computer-integrated process and safeguards system]: The integration of computer-integrated manufacturing and robotics with safeguards, security, and process operations

    International Nuclear Information System (INIS)

    Leonard, R.S.; Evans, J.C.

    1987-01-01

    This poster session describes the computer-integrated process and safeguards system (CIPSS). The CIPSS combines systems developed for factory automation and automated mechanical functions (robots) with varying degrees of intelligence (expert systems) to create an integrated system that would satisfy current and emerging security and safeguards requirements. Specifically, CIPSS is an extension of the automated physical security functions concepts. The CIPSS also incorporates the concepts of computer-integrated manufacturing (CIM) with integrated safeguards concepts, and draws upon the Defense Advance Research Project Agency's (DARPA's) strategic computing program

  13. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. On account of the cost of defense, it is impossible to ensure the security of every device in the DAS. Given this background, a novel quantitative vulnerability assessment model of cyber security for DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed from two levels: terminal device level and control center server level. Then, the attack process is modeled based on game theory and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application process of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS. The results demonstrate the reasonability and effectiveness of the proposed methodology.
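
    As an illustration of the adjacency-matrix idea only, here is a minimal Python sketch that enumerates which vulnerabilities become reachable once one is exploited; the node names and matrix entries are hypothetical and are not taken from the paper's two-level model or its game-theoretic attack process:

        # Minimal sketch: reachability over a hypothetical vulnerability adjacency matrix.
        # A[i][j] = 1 means exploiting vulnerability i makes vulnerability j exploitable next.
        import numpy as np

        vulns = ["terminal_device", "comm_gateway", "control_center_server"]   # hypothetical nodes
        A = np.array([[0, 1, 0],
                      [0, 0, 1],
                      [0, 0, 0]])

        def transitive_closure(adj):
            """Warshall's algorithm: which vulnerabilities are reachable from which."""
            reach = adj.astype(bool).copy()
            n = reach.shape[0]
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        reach[i, j] = reach[i, j] or (reach[i, k] and reach[k, j])
            return reach

        R = transitive_closure(A)
        for i, src in enumerate(vulns):
            targets = [vulns[j] for j in range(len(vulns)) if R[i, j]]
            print(f"compromising {src} can eventually expose: {targets}")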

  14. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices.

    Science.gov (United States)

    Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S

    2013-02-15

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.

  15. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  16. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  17. Performance of integrated systems of automated roller shade systems and daylight responsive dimming systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung-Chul; Choi, An-Seop; Jeong, Jae-Weon [Department of Architectural Engineering, Sejong University, Kunja-Dong, Kwangjin-Gu, Seoul (Korea, Republic of); Lee, Eleanor S. [Building Technologies Department, Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2011-03-15

    Daylight responsive dimming systems have been used in only a few buildings to date because their reliability still requires improvement. The key underlying factor contributing to poor performance is the variability of the ratio of the photosensor signal to daylight workplane illuminance with sun position, sky condition, and fenestration condition. Therefore, this paper describes systems that integrate automated roller shades with daylight responsive dimming through an improved closed-loop proportional control algorithm, and compares the performance of the integrated systems with that of the single systems. The concept of the improved closed-loop proportional control algorithm for the integrated systems is to predict the varying correlation of photosensor signal to daylight workplane illuminance according to roller shade height and sky conditions, thereby improving system accuracy. In this study, the performance of the integrated systems with two improved closed-loop proportional control algorithms was compared with that of the current (modified) closed-loop proportional control algorithm. The results show that the average maintenance percentage of the target illuminance, the average discrepancies from the target illuminance, and the average time under 90% of the target illuminance for the integrated systems improved significantly in comparison with the current closed-loop proportional control algorithm for daylight responsive dimming systems used as a single system. (author)
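
    A minimal Python sketch of the closed-loop proportional idea, in which the photosensor-to-workplane ratio is predicted from shade height and sky condition and the dimming level is set to cover the remaining illuminance deficit; the gain model, constants and function names are assumptions, not the authors' algorithm:

        # Illustrative closed-loop proportional dimming sketch (not the authors' exact algorithm).

        def predicted_ratio(shade_height: float, sky_clear: bool) -> float:
            """Hypothetical model of the photosensor-signal / workplane-illuminance ratio.
            shade_height: 0.0 (fully open) .. 1.0 (fully closed)."""
            base = 0.35 if sky_clear else 0.25        # assumed calibration constants
            return base * (1.0 - 0.4 * shade_height)  # ratio drops as the shade closes

        def dimming_level(target_lux, sensor_signal, shade_height, sky_clear,
                          electric_gain=500.0):
            """Fraction of full electric light output needed to hold the target illuminance."""
            daylight_lux = sensor_signal / predicted_ratio(shade_height, sky_clear)
            deficit = max(0.0, target_lux - daylight_lux)
            return min(1.0, deficit / electric_gain)   # electric_gain: lux delivered at full output

        # Example: partly closed shade under a clear sky
        print(dimming_level(target_lux=500, sensor_signal=120, shade_height=0.5, sky_clear=True))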

  18. Integrated quality status and inventory tracking system for FFTF driver fuel pins

    International Nuclear Information System (INIS)

    Gottschalk, G.P.

    1979-11-01

    An integrated system for quality status and inventory tracking of Fast Flux Test Facility (FFTF) driver fuel pins has been developed. Automated fuel pin identification systems, a distributed computer network, and a data base are used to implement the tracking system

  19. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  20. System Integration of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Nyeng, Preben

    It is therefore investigated in this project how ancillary services can be provided by alternatives to central power stations, and to what extent these can be integrated in the system by means of market-based methods. Particular emphasis is put on automatic solutions, which is particularly relevant for small units, including the ICT solutions that can facilitate the integration. Specifically, the international standard "IEC 61850-7-420 Communications systems for Distributed Energy Resources" is considered as a possible brick in the solution. This standard has undergone continuous development, and this project has actively contributed to its further development and improvements. Different types of integration methods are investigated in the project. Some are based on local measurement and control, e.g. by measuring the grid frequency, whereas others are based on direct remote control or market...

  1. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches: finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity, applications.
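
    A toy Python sketch of the hierarchical state-machine idea behind such toolkits (this is not SMI++ code and does not use its State Manager Language; the object names, states and recovery rule are invented for illustration): each node derives its state from its children, and a simple rule triggers an automatic recovery action when an error appears.

        # Toy hierarchical finite-state-machine sketch (illustrative; not SMI++ / SML syntax).

        class Node:
            def __init__(self, name, children=None):
                self.name = name
                self.children = children or []
                self.state = "READY"

            def command(self, cmd):
                """Propagate a command down the hierarchy, then re-derive this node's state."""
                for child in self.children:
                    child.command(cmd)
                if not self.children:                         # leaf: the command changes the state directly
                    self.state = "RUNNING" if cmd == "START" else "READY"
                self.update()

            def update(self):
                """Derive a parent state from its children and apply a simple recovery rule."""
                if self.children:
                    states = {c.state for c in self.children}
                    if "ERROR" in states:
                        self.state = "ERROR"
                    elif len(states) == 1:
                        self.state = states.pop()
                    else:
                        self.state = "MIXED"
                if self.state == "ERROR" and self.children:   # rule: on error, automatically reset children
                    print(f"{self.name}: error in a child, issuing automatic RESET")
                    for child in self.children:
                        child.state = "READY"
                    self.update()

        detector = Node("DETECTOR", [Node("TRACKER"), Node("CALORIMETER")])
        detector.command("START")
        print(detector.name, "is", detector.state)            # -> DETECTOR is RUNNING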

  2. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  3. Development of an automation system for a distribution operation center; Desenvolvimento de um sistema de automacao para um Centro de Operacao da Distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Surur, Paulo Sergio Miguel

    1996-07-01

    The serious problems caused by a deficient electric energy supply, mainly with regard to quality in the distribution system, are widely known. Automation of the feeders and of the Distribution Operation Center contributes to improving quality, mainly by reducing the line restoration time during outages and thus the amount of energy not supplied. This paper presents an automation system for the COD - Distribution Operation Center - and the tests performed to evaluate its performance at a substation and at primary network manoeuvre switches. Considerations on the hardware, software and man-machine interface developed for the operator are presented in order to justify the choices adopted for this project. Software and hardware modules available on the Brazilian market were applied in this work. Tests of the system were carried out at a substation and in a laboratory. The results were satisfactory. (author)

  4. The NIF DISCO Framework: facilitating automated integration of neuroscience content on the web.

    Science.gov (United States)

    Marenco, Luis; Wang, Rixin; Shepherd, Gordon M; Miller, Perry L

    2010-06-01

    This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination. DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are "harvested" on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource's content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) "LinkOut" to a resource's data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource's lexicon and ontology, 5) sharing a resource's database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research.

  5. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  6. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    Directory of Open Access Journals (Sweden)

    Shan Yang

    2016-01-01

    Full Text Available Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. This paper analyzes the similarity of the equivalent models of inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation. An integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation is then proposed. The proposed method represents the inverter-based distributed generation as an Iθ bus, which makes it suitable for calculating the power flow of a distribution network with current-limited inverter-based distributed generation. The low voltage ride-through capability of inverter-based distributed generation can be considered as well. Finally, tests of power flow and short circuit current calculation are performed on a 33-bus distribution network. The results of the proposed method are compared with those of the traditional method and a simulation method, which verifies the effectiveness of the integrated method.
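
    A minimal Python sketch of the underlying idea, treating the inverter DG as a constant-current (Iθ) injection inside an ordinary backward/forward sweep on a tiny radial feeder; the 3-bus data and per-unit values are invented for illustration and are not the paper's 33-bus test case:

        # Backward/forward sweep on a toy radial feeder with one inverter DG
        # modeled as a constant-current (I, theta) injection. Illustrative only.
        import numpy as np

        # branch list: (from_bus, to_bus, series impedance in p.u.); bus 0 is the source/slack
        branches = [(0, 1, 0.02 + 0.04j), (1, 2, 0.03 + 0.05j)]
        loads = {1: 0.8 + 0.3j, 2: 0.6 + 0.2j}                  # constant-power loads (p.u.)
        dg_bus = 2
        dg_current = 0.5 * np.exp(1j * np.deg2rad(-10))         # fixed-magnitude, fixed-angle injection

        V = np.ones(3, dtype=complex)                           # flat start
        for _ in range(30):                                     # fixed iteration count for brevity
            # net current drawn at each bus: loads draw current, the DG injects a fixed current
            I_bus = np.zeros(3, dtype=complex)
            for bus, S in loads.items():
                I_bus[bus] += np.conj(S / V[bus])
            I_bus[dg_bus] -= dg_current
            # backward sweep: branch current = receiving-bus current plus downstream branch currents
            I_branch = {}
            for f, t, z in reversed(branches):
                downstream = sum(I_branch[(a, b)] for a, b, _z in branches if a == t)
                I_branch[(f, t)] = I_bus[t] + downstream
            # forward sweep: update voltages from the source toward the feeder end
            for f, t, z in branches:
                V[t] = V[f] - z * I_branch[(f, t)]

        print(np.round(np.abs(V), 4))                           # bus voltage magnitudes with the DG active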

  7. Methodology for prioritizing projects considering the generation, transmission and distribution integrated planning and the financial restraints; Metodologia para priorizacao de projetos, considerando o planejamento integrado de geracao, transmissao e distribuicao e as restricoes financeiras

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Denis Claudio Cruz de; Andrade, Eduardo Leopoldino de; Pimentel, Elson Luiz de Almeida; Pinto, Everton Barroso [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)

    1995-12-31

    This technical report presents a methodology for the economic evaluation and prioritization of works in the electric system, and the experience of CEMIG, an electric power utility of the State of Minas Gerais - Southeast Brazil, in defining its transmission expansion plan. The concept of integrated expansion projects, involving generation, transmission, distribution, automation and telecommunication works, is presented and discussed. 3 refs., 3 figs., 1 tab.

  8. Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid.

    Science.gov (United States)

    Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul

    2017-02-01

    Internet of Things (IoT) provides a generic infrastructure for different applications to integrate information communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, as one of typical applications supported by IoT, denoted as a re-engineering and a modernization of the traditional power grid, aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing the power generation and transmission cost, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of the power cumulative cost and the service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, which has been traditionally used for carrying out traffic engineering in computer networks, to derive the bounds of both power supply and user demand to achieve a high service reliability to users. Through an extensive performance evaluation, our data shows that integrating distributed energy resources conjointly with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide a sustainable service reliability to users in the power grid.

  9. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria

    2016-01-01

    AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and the recent developments of AGIS functionalities, related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as the flexible use of opportunistic Cloud and HPC resources, the integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, the unified declaration of storage protocols required for PanDA Pilot site movers, and others.

  10. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    Science.gov (United States)

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large numbers of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely performed if ever. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and

  11. Blockchain for Smart Grid Resilience: Exchanging Distributed Energy at Speed, Scale and Security

    Energy Technology Data Exchange (ETDEWEB)

    Mylrea, Michael E.; Gourisetti, Sri Nikhil Gup

    2017-09-18

    Blockchain may help solve several complex problems related to integrity and trustworthiness of rapid, distributed, complex energy transactions and data exchanges. In a move towards resilience, blockchain commoditizes trust and enables automated smart contracts to support auditable multiparty transactions based on predefined rules between distributed energy providers and customers. Blockchain based smart contracts also help remove the need to interact with third-parties, facilitating the adoption and monetization of distributed energy transactions and exchanges, both energy flows as well as financial transactions. This may help reduce transactive energy costs and increase the security and sustainability of distributed energy resource (DER) integration, helping to remove barriers to a more decentralized and resilient power grid.

  12. Foundations & principles of distributed manufacturing elements of manufacturing networks, cyber-physical production systems and smart automation

    CERN Document Server

    Kühnle, Hermann

    2015-01-01

    The book presents a coherent description of distributed manufacturing, providing a solid base for further research on the subject as well as smart implementations in companies. It provides a guide for those researching and working in a range of fields, such as smart manufacturing, cloud computing, RFID tracking, distributed automation, cyber physical production and global design anywhere, manufacture anywhere solutions. Foundations & Principles of Distributed Manufacturing anticipates future advances in the fields of embedded systems, the Internet of Things and cyber physical systems, outlining how adopting these innovations could rapidly bring about improvements in key performance indicators, which could in turn generate competition pressure by rendering successful business models obsolete. In laying the groundwork for powerful theoretical models, high standards for the homogeneity and soundness of the suggested setups are applied. The book especially elaborates on the upcoming competition in online manu...

  13. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology developments, adaptations and evaluations which are documented in a number of different publications, and the results of the recently completed field testing which are introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. Resulting samples can either be preserved for later laboratory analyses, or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next generation sequencing technology (NGS) at level 4. An overall integrated dataset of the results based on the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time the cost of the observation is optimized with respect to analysis effort and time.

  14. An automated dose tracking system for adaptive radiation therapy.

    Science.gov (United States)

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Data from patient images were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient

  15. Automated planning of breast radiotherapy using cone beam CT imaging

    International Nuclear Information System (INIS)

    Amit, Guy; Purdie, Thomas G.

    2015-01-01

    Purpose: Develop and clinically validate a methodology for using cone beam computed tomography (CBCT) imaging in an automated treatment planning framework for breast IMRT. Methods: A technique for intensity correction of CBCT images was developed and evaluated. The technique is based on histogram matching of CBCT image sets, using information from “similar” planning CT image sets from a database of paired CBCT and CT image sets (n = 38). Automated treatment plans were generated for a testing subset (n = 15) on the planning CT and the corrected CBCT. The plans generated on the corrected CBCT were compared to the CT-based plans in terms of beam parameters, dosimetric indices, and dose distributions. Results: The corrected CBCT images showed considerable similarity to their corresponding planning CTs (average mutual information 1.0±0.1, average sum of absolute differences 185 ± 38). The automated CBCT-based plans were clinically acceptable, as well as equivalent to the CT-based plans with average gantry angle difference of 0.99°±1.1°, target volume overlap index (Dice) of 0.89±0.04 although with slightly higher maximum target doses (4482±90 vs 4560±84, P < 0.05). Gamma index analysis (3%, 3 mm) showed that the CBCT-based plans had the same dose distribution as plans calculated with the same beams on the registered planning CTs (average gamma index 0.12±0.04, gamma <1 in 99.4%±0.3%). Conclusions: The proposed method demonstrates the potential for a clinically feasible and efficient online adaptive breast IMRT planning method based on CBCT imaging, integrating automation
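
    A generic Python sketch of histogram matching between two intensity volumes; the paper builds its reference histogram from "similar" planning CTs in the paired database, which is not reproduced here, and the random volumes below are stand-ins for real image data:

        # Generic histogram matching sketch: map CBCT intensities onto the intensity
        # distribution of a reference (planning CT) volume via CDF matching. Illustrative only.
        import numpy as np

        def match_histogram(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
            """Return a copy of `source` whose intensity histogram approximates `reference`."""
            src_vals, src_idx, src_counts = np.unique(source.ravel(),
                                                      return_inverse=True, return_counts=True)
            ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)

            src_cdf = np.cumsum(src_counts).astype(np.float64) / source.size
            ref_cdf = np.cumsum(ref_counts).astype(np.float64) / reference.size

            # for each source quantile, find the reference intensity at the same quantile
            mapped_vals = np.interp(src_cdf, ref_cdf, ref_vals)
            return mapped_vals[src_idx].reshape(source.shape)

        # Example with random stand-ins for a CBCT volume and a planning-CT volume
        rng = np.random.default_rng(0)
        cbct = rng.normal(0, 200, size=(32, 32, 32))      # hypothetical CBCT numbers
        ct = rng.normal(40, 350, size=(32, 32, 32))       # hypothetical planning-CT HU
        corrected = match_histogram(cbct, ct)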

  16. Future power plant control integrates process and substation automation into one system; Zukunftsorientierte Kraftwerksleittechnik vereint Prozess- und Stationsautomatisierung

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany). Div. Energietechnik-Systeme

    2007-07-01

    The new IEC 61850 standard has been established for substation control systems. In future, IEC 61850 may also be widely used for electrical systems in power plants. IEC 61850 simplifies the integration of process and substation control systems in power plants by creating one automated system across manufacturers and thus makes a significant contribution to cost efficiency in operation and maintenance. (orig.)

  17. Development of methods for DSM and distribution automation planning

    International Nuclear Information System (INIS)

    Kaerkkaeinen, S.; Kekkonen, V.; Rissanen, P.

    1998-01-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities

  18. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S; Kekkonen, V [VTT Energy, Espoo (Finland); Rissanen, P [Tietosavo Oy (Finland)

    1998-08-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities

  19. A Federated Enterprise Architecture and MBSE Modeling Framework for Integrating Design Automation into a Global PLM Approach

    OpenAIRE

    Vosgien , Thomas; Rigger , Eugen; Schwarz , Martin; Shea , Kristina

    2017-01-01

    Part 1: PLM Maturity, Implementation and Adoption; International audience; PLM and Design Automation (DA) are two interdependent and necessary approaches to increase the performance and efficiency of product development processes. Often, DA systems’ usability suffers due to a lack of integration in industrial business environments stemming from the independent consideration of PLM and DA. This article proposes a methodological and modeling framework for developing and deploying DA solutions w...

  20. A Projection of Automated Book Production Control

    Directory of Open Access Journals (Sweden)

    Mario Barisic

    2006-12-01

    Full Text Available The paper elaborates on the recommendation to systematically introduce XML technologies as a standard and integral factor in publishing and graphic-arts business activities and as a further improvement of the existing PostScript-based graphic production platform. Procedures are proposed for applying norms to production processes through hierarchically organized and interrelated databases built on XML technology, as well as a book production norm-setting system. The proposal for automating work processes in the domain of print production control is elaborated under the CIP4-JDF automation system. The operational results are used as guidelines for defining the elements of automated business operations in the book production domain, with integrated elements of new technologies, compatible with global trends.

  1. The Semantic Automated Discovery and Integration (SADI Web service Design-Pattern, API and Reference Implementation

    Directory of Open Access Journals (Sweden)

    Wilkinson Mark D

    2011-10-01

    Full Text Available Abstract Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services

  2. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Science.gov (United States)

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  3. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    Science.gov (United States)

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in

  4. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. Moreover, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in the distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, mainly sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in the different typical weather conditions was acquired via maximum likelihood parameter estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of the harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
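
    A schematic Python sketch of the Monte-Carlo step described above; the Beta output model, harmonic-injection coefficients and network sensitivity below are invented placeholders, whereas the paper fits weather-specific distributions and performs harmonic calculations on the IEEE 33-bus system:

        # Schematic Monte-Carlo harmonic study: sample DPV output, inject harmonic currents
        # proportional to it, and build the distribution of voltage THD. Illustrative only.
        import numpy as np

        rng = np.random.default_rng(42)
        N = 10_000                                   # Monte-Carlo trials

        # hypothetical per-unit DPV output model for one weather class (e.g. "cloudy")
        pv_output = rng.beta(a=2.0, b=3.5, size=N)

        # hypothetical harmonic current injection coefficients (fraction of PV current)
        harmonic_orders = np.array([5, 7, 11, 13])
        injection = np.array([0.04, 0.03, 0.015, 0.01])

        # hypothetical network harmonic impedance seen at the bus, per order (p.u.)
        z_harm = 0.05 * harmonic_orders

        V1 = 1.0                                     # fundamental voltage (p.u.)
        Vh = pv_output[:, None] * injection[None, :] * z_harm[None, :]   # harmonic voltages
        thd = np.sqrt((Vh ** 2).sum(axis=1)) / V1 * 100                  # THD in percent

        print(f"mean THD {thd.mean():.3f}%  95th percentile {np.percentile(thd, 95):.3f}%")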

  5. Wireless Home Automation System using IoT

    Directory of Open Access Journals (Sweden)

    Alexandra MIHALACHE

    2017-01-01

    Full Text Available Nowadays, the chance of having an automated home is no longer a fancy luxury, but a reality accessible to a wide range of consumers, because smart home systems have replaced those that only automated the home in the past. More and more solutions based on IoT are being developed to transform homes into smart ones, but the problem is that the benefits of home automation are still not clear to everyone as they are not promoted enough, so we cannot talk about a broad mass of consumers already using integrated or DIY solutions to improve their lives. In this paper, I will present a home automation system using Arduino Uno integrated with relevant modules which are used to allow remote control of lights or fans, changes being made on the basis of different sensor data. The system is designed to be low cost and expandable, bringing accessibility, convenience and energy efficiency.

  6. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  7. A Cost Effective Security Technology Integrated with RFID Based Automated Toll Collection System

    Directory of Open Access Journals (Sweden)

    Rafiya Hossain

    2017-09-01

    Full Text Available Crime statistics and research on criminology show that under similar circumstances, crimes are more likely to occur in developing countries than in developed countries due to their lack of security measures. Transport crimes on highways and bridges are one of the most common crimes in the developing nations. Automation of various systems like the toll collection system is being introduced in the developing countries to avoid corruption in the collection of toll, decrease cost and increase operational efficiency. The goal of this research is to find an integrated solution that enhances security along with the advantage of automated toll collection. Inspired by the availability of many security systems, this research presents a system that can block a specific vehicle or a particular type of vehicles at the toll booths based on directives from the law enforcement agencies. The heart of the system is based on RFID (Radio Frequency Identification) technology. In this system, by sending a text message the law enforcement agency or the authority that controls the toll booths can prevent the barrier from being lifted even after deduction of the toll charge if the passing vehicle has a security issue. The designed system should help the effort of reducing transport crimes on highways and bridges of developing countries.
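
    A minimal Python sketch of the barrier decision flow as described: the vehicle is identified by its RFID tag, the toll is deducted, and the barrier stays closed if the tag is on a blocklist; the tag IDs, balances and blocklist update path are hypothetical (the real system receives directives by text message from the authority):

        # Toy sketch of the toll-booth decision flow: identify the vehicle by RFID tag,
        # deduct the toll, but keep the barrier closed if the tag is on a security blocklist.
        TOLL = 50

        accounts = {"TAG-1001": 500, "TAG-1002": 20}          # hypothetical prepaid balances
        blocklist = {"TAG-1002"}                              # hypothetical directive from the authority

        def process_vehicle(tag_id: str) -> str:
            if tag_id not in accounts:
                return "REJECT: unknown tag, manual handling"
            if accounts[tag_id] < TOLL:
                return "REJECT: insufficient balance"
            accounts[tag_id] -= TOLL                          # toll is charged in any case
            if tag_id in blocklist:
                return "HOLD: toll charged, barrier stays closed, authority notified"
            return "PASS: barrier lifted"

        for tag in ("TAG-1001", "TAG-1002"):
            print(tag, "->", process_vehicle(tag))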

  8. Distributed Wireless Data Acquisition System with Synchronized Data Flow

    CERN Document Server

    Astakhova, N V; Dikoussar, N D; Eremin, G I; Gerasimov, A V; Ivanov, A I; Kryukov, Yu S; Mazny, N G; Ryabchun, O V; Salamatin, I M

    2006-01-01

    New methods are devised to preserve the continuity of computer codes when the class of problems changes and to integrate the drivers of special-purpose devices into the application. The resulting scheme and methods for constructing automation systems are used to develop a distributed wireless system intended for registering the characteristics of pulse processes with a synchronized data flow transmitted over a radio channel. With a sampling frequency of 20 kHz, the equipment achieved a synchronization accuracy of up to ±50 μs. Modification of part of the equipment (sampling frequency) permits the accuracy to be improved to 0.1 μs. The obtained results can be applied to develop systems for monitoring various objects, as well as automation systems for experiments and automated process control systems.

  9. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) - instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, pros and cons of a cloud based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe as well the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  10. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, pros and cons of a cloud based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe as well the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  11. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution network with inverter based distributed generation. The similarity of equivalent model for inverter based distributed generation during normal and fault conditions of distribution network and the differences between power flow and short circuit calculation are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution network with inverte...

  12. Hexicon 2: automated processing of hydrogen-deuterium exchange mass spectrometry data with improved deuteration distribution estimation.

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L; Hamprecht, Fred A; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.

  13. Hexicon 2: Automated Processing of Hydrogen-Deuterium Exchange Mass Spectrometry Data with Improved Deuteration Distribution Estimation

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L.; Hamprecht, Fred A.; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.
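
    The deuteration distribution estimation described above can be illustrated, in a much simplified form, by a Richardson-Lucy-style iterative deconvolution: the measured isotope envelope of a deuterated peptide is modelled as the convolution of the undeuterated envelope with an unknown deuteration distribution, which is then recovered by multiplicative updates. The sketch below uses synthetic envelopes and is only a generic illustration of the idea; Hexicon 2's actual deconvolution algorithm and noise handling are more elaborate.

      import numpy as np

      def richardson_lucy_1d(observed, kernel, n_iter=200, eps=1e-12):
          """Recover a non-negative distribution d such that observed ~ convolve(kernel, d)."""
          kernel = kernel / kernel.sum()              # normalized undeuterated isotope envelope
          n = len(observed) - len(kernel) + 1         # length of the deuteration distribution
          est = np.full(n, 1.0 / n)                   # flat initial guess
          for _ in range(n_iter):
              pred = np.convolve(kernel, est, mode="full")         # forward model
              ratio = observed / np.maximum(pred, eps)             # data / model
              est *= np.correlate(ratio, kernel, mode="valid")     # multiplicative update
              est = np.maximum(est, 0.0)
          return est / est.sum()

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          undeuterated = np.array([0.55, 0.30, 0.11, 0.04])            # synthetic isotope envelope
          true_dist = np.array([0.05, 0.15, 0.40, 0.25, 0.10, 0.05])   # synthetic deuteration distribution
          observed = np.convolve(undeuterated, true_dist, mode="full")
          observed = np.maximum(observed + rng.normal(0, 0.002, observed.size), 0)
          print(np.round(richardson_lucy_1d(observed, undeuterated), 3))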

  14. PanDA: distributed production and distributed analysis system for ATLAS

    International Nuclear Information System (INIS)

    Maeno, T

    2008-01-01

    A new distributed software system was developed in the fall of 2005 for the ATLAS experiment at the LHC. This system, called PANDA, provides an integrated service architecture with late binding of jobs, maximal automation through layered services, tight binding with the ATLAS Distributed Data Management system [1], advanced error discovery and recovery procedures, and other features. In this talk, we will describe the PANDA software system. Special emphasis will be placed on the evolution of PANDA based on one and a half years of real experience in carrying out Computer System Commissioning data production [2] for ATLAS. The architecture of PANDA is well suited for the computing needs of the ATLAS experiment, which is expected to be one of the first HEP experiments to operate at the petabyte scale.

  15. Optimal distribution of integration time for intensity measurements in Stokes polarimetry.

    Science.gov (United States)

    Li, Xiaobo; Liu, Tiegen; Huang, Bingjing; Song, Zhanjie; Hu, Haofeng

    2015-10-19

    We consider the typical Stokes polarimetry system, which performs four intensity measurements to estimate a Stokes vector. We show that if the total integration time of the intensity measurements is fixed, the variance of the Stokes vector estimator depends on the distribution of the integration time among the four intensity measurements. Therefore, by optimizing the distribution of integration time, the variance of the Stokes vector estimator can be decreased. In this paper, we obtain the closed-form solution of the optimal distribution of integration time by employing the Lagrange multiplier method. According to the theoretical analysis and a real-world experiment, it is shown that the total variance of the Stokes vector estimator can be decreased by about 40% in the case discussed in this paper. The method proposed in this paper can effectively decrease the measurement variance and thus statistically improves the measurement accuracy of the polarimetric system.
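
    As a rough illustration of the optimization idea (not the authors' exact noise model), suppose the variance contributed by the i-th intensity measurement scales as c_i / t_i for positive coefficients c_i and that the total time T is fixed. Minimizing the sum of c_i / t_i subject to the time budget with a Lagrange multiplier gives t_i proportional to sqrt(c_i). The sketch below, with made-up coefficients, compares the resulting total variance against an equal split; the gain follows from the Cauchy-Schwarz inequality and vanishes only when all c_i are equal.

      import numpy as np

      def optimal_times(c, total_time):
          """Minimize sum(c_i / t_i) subject to sum(t_i) = total_time (Lagrange multiplier solution)."""
          c = np.asarray(c, dtype=float)
          return total_time * np.sqrt(c) / np.sqrt(c).sum()

      def total_variance(c, t):
          return float(np.sum(np.asarray(c) / np.asarray(t)))

      if __name__ == "__main__":
          c = [1.0, 0.4, 2.5, 0.8]          # hypothetical per-measurement variance coefficients
          T = 4.0                           # fixed total integration time (arbitrary units)
          t_opt = optimal_times(c, T)
          t_eq = np.full(4, T / 4)
          print("optimal split:", np.round(t_opt, 3), "-> variance", round(total_variance(c, t_opt), 3))
          print("equal split  :", np.round(t_eq, 3), "-> variance", round(total_variance(c, t_eq), 3))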

  16. SSTAC/ARTS review of the draft Integrated Technology Plan (ITP). Volume 8: Aerothermodynamics Automation and Robotics (A/R) systems sensors, high-temperature superconductivity

    Science.gov (United States)

    1991-01-01

    Viewgraphs of briefings presented at the SSTAC/ARTS review of the draft Integrated Technology Plan (ITP) on aerothermodynamics, automation and robotics systems, sensors, and high-temperature superconductivity are included. Topics covered include: aerothermodynamics; aerobraking; aeroassist flight experiment; entry technology for probes and penetrators; automation and robotics; artificial intelligence; NASA telerobotics program; planetary rover program; science sensor technology; direct detector; submillimeter sensors; laser sensors; passive microwave sensing; active microwave sensing; sensor electronics; sensor optics; coolers and cryogenics; and high temperature superconductivity.

  17. SSTAC/ARTS review of the draft Integrated Technology Plan (ITP). Volume 8: Aerothermodynamics Automation and Robotics (A/R) systems sensors, high-temperature superconductivity

    International Nuclear Information System (INIS)

    1991-06-01

    Viewgraphs of briefings presented at the SSTAC/ARTS review of the draft Integrated Technology Plan (ITP) on aerothermodynamics, automation and robotics systems, sensors, and high-temperature superconductivity are included. Topics covered include: aerothermodynamics; aerobraking; aeroassist flight experiment; entry technology for probes and penetrators; automation and robotics; artificial intelligence; NASA telerobotics program; planetary rover program; science sensor technology; direct detector; submillimeter sensors; laser sensors; passive microwave sensing; active microwave sensing; sensor electronics; sensor optics; coolers and cryogenics; and high temperature superconductivity

  18. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    Science.gov (United States)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automation of the problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of such systems in parallel mode at various degrees of detail.

  19. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    A sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investor interest in green-certified buildings due to their higher initial costs. It is therefore essential to attract investors' attention to further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation of green building rating tools, which leaves an essential gap to be filled by an automated computerized programming tool. This paper presents proposed research aiming to develop an integrated web-based automated computerized program that applies a green building rating assessment tool, green technology and life cycle cost analysis. It also emphasizes identifying the variables of MyCrest and LCC to be integrated and developed in a framework, then transformed into an automated computerized program. A mixed methodology of qualitative and quantitative surveys and tool development is planned to carry the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review provides a better understanding of Green Building Rating Tools (GBRT) integration with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.

  20. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades has been a time of major changes in marketing. Digitalization has become a permanent part of marketing and at the same time enabled efficient collection of data. Personalization and customization of content are playing a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation more information of the customers is gathered ...

  1. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize towards more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on it. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second best alternative is to use various computerized systems, capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company should run, or (2) selective system linking on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting the systems. 2 figs

  2. Automated estimation of choroidal thickness distribution and volume based on OCT images of posterior visual section.

    Science.gov (United States)

    Vupparaboina, Kiran Kumar; Nizampatnam, Srinath; Chhablani, Jay; Richhariya, Ashutosh; Jana, Soumya

    2015-12-01

    A variety of vision ailments are indicated by anomalies in the choroid layer of the posterior visual section. Consequently, choroidal thickness and volume measurements, usually performed by experts based on optical coherence tomography (OCT) images, have assumed diagnostic significance. Now, to save precious expert time, it has become imperative to develop automated methods. To this end, one requires choroid outer boundary (COB) detection as a crucial step, where difficulty arises as the COB divides the choroidal granularity and the scleral uniformity only notionally, without marked brightness variation. In this backdrop, we measure the structural dissimilarity between choroid and sclera by structural similarity (SSIM) index, and hence estimate the COB by thresholding. Subsequently, smooth COB estimates, mimicking manual delineation, are obtained using tensor voting. On five datasets, each consisting of 97 adult OCT B-scans, automated and manual segmentation results agree visually. We also demonstrate close statistical match (greater than 99.6% correlation) between choroidal thickness distributions obtained algorithmically and manually. Further, quantitative superiority of our method is established over existing results by respective factors of 27.67% and 76.04% in two quotient measures defined relative to observer repeatability. Finally, automated choroidal volume estimation, being attempted for the first time, also yields results in close agreement with that of manual methods. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Aviation Safety/Automation Program Conference

    Science.gov (United States)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange of current operational problems and program results to date. The Aviation Safety/Automation Program has as its primary goal to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  4. Coordination control of distributed systems

    CERN Document Server

    Villa, Tiziano

    2015-01-01

    This book describes how control of distributed systems can be advanced by an integration of control, communication, and computation. The global control objectives are met by judicious combinations of local and nonlocal observations taking advantage of various forms of communication exchanges between distributed controllers. Control architectures are considered according to increasing degrees of cooperation of local controllers: fully distributed or decentralized control, control with communication between controllers, coordination control, and multilevel control. The book also covers topics bridging computer science, communication, and control, like communication for control of networks, average consensus for distributed systems, and modeling and verification of discrete and of hybrid systems. Examples and case studies are introduced in the first part of the text and developed throughout the book. They include: control of underwater vehicles, automated-guided vehicles on a container terminal, contro...

  5. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  6. User-friendly Establishment of Trust in Distributed Home Automation Networks

    DEFF Research Database (Denmark)

    Solberg Hjorth, Theis; Torbensen, Rune; Madsen, Per Printz

    2014-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented...

  7. Test Automation Process Improvement A case study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

    This master's thesis research is a case study of improving the test automation process at BroadSoft Finland. A test automation project recently started at BroadSoft, but the project is not properly integrated into the existing process. The project is about converting manual test cases to automated test cases. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis different test automation processes are studied ...

  8. Optically induced dielectropheresis sorting with automated medium exchange in an integrated optofluidic device resulting in higher cell viability.

    Science.gov (United States)

    Lee, Gwo-Bin; Wu, Huan-Chun; Yang, Po-Fu; Mai, John D

    2014-08-07

    We demonstrated the integration of a microfluidic device with an optically induced dielectrophoresis (ODEP) device such that the critical medium replacement process was performed automatically and the cells could be subsequently manipulated by using digitally projected optical images. ODEP has been demonstrated to generate sufficient forces for manipulating particles/cells by projecting a light pattern onto photoconductive materials which creates virtual electrodes. The production of the ODEP force usually requires a medium that has a suitable electrical conductivity and an appropriate dielectric constant. Therefore, a 0.2 M sucrose solution is commonly used. However, this requires a complicated medium replacement process before one is able to manipulate cells. Furthermore, the 0.2 M sucrose solution is not suitable for the long-term viability of cells. In comparison to conventional manual processes, our automated medium replacement process only took 25 minutes. Experimental data showed that there was up to a 96.2% recovery rate for the manipulated cells. More importantly, the survival rate of the cells was greatly enhanced due to this faster automated process. This newly developed microfluidic chip provided a promising platform for the rapid replacement of the cell medium and this was also the first time that an ODEP device was integrated with other active flow control components in a microfluidic device. By improving cell viability after cell manipulation, this design may contribute to the practical integration of ODEP modules into other lab-on-a-chip devices and biomedical applications in the future.

  9. Towards data integration automation for the French rare disease registry.

    Science.gov (United States)

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. It is the case for the BNDMR project, the French rare disease registry, that aims to collect administrative and medical data of rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors with the existing systems to automatically retrieve data. Given the data heterogeneity and the large number of source systems, the automation of connectors creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration processes. The generated mappings are formalized in exploitable mapping expressions. Following this methodology, a process has been experimented on specific data types of a source system: Boolean and predefined lists. As a result, effectiveness of the used alignment approach has been enhanced and more good mappings have been detected. Nonetheless, further improvements could be done to deal with the semantic issue and process other data types.

  10. Co-creative design developments for accessibility and home automation

    OpenAIRE

    Taib, SM; De Coster, R; Sabri Tekantape, E

    2017-01-01

    The term “Home Automation” refers to a networked home, which provides electronically controlled security and convenience for its users. Home automation is also defined as the integration of home-based technology and services for a better quality of living (Quynh et al., 2012). The main purpose of home automation technologies is to enhance home comfort for everyone through the automation of higher security, domestic tasks and easy communication. Home automation should be able to enha...

  11. Integrating distributed Bayesian inference and reinforcement learning for sensor management

    NARCIS (Netherlands)

    Grappiolo, C.; Whiteson, S.; Pavlin, G.; Bakker, B.

    2009-01-01

    This paper introduces a sensor management approach that integrates distributed Bayesian inference (DBI) and reinforcement learning (RL). DBI is implemented using distributed perception networks (DPNs), a multiagent approach to performing efficient inference, while RL is used to automatically

  12. Integrated control and diagnostic system architectures for future installations

    International Nuclear Information System (INIS)

    Wood, R.; March-Leuba, J.

    2000-01-01

    Nuclear reactors of the 21st century will employ increasing levels of automation and fault tolerance to increase availability, reduce accident risk, and lower operating costs. Key developments in control algorithms, fault diagnostics, fault tolerance, and distributed communications are needed to implement the fully automated plant. It will be equally challenging to integrate developments in separate information and control fields into a cohesive system, which collectively achieves the overall goals of improved safety, reliability, maintainability, and cost-effectiveness. Under the Nuclear Energy Research Initiative (NERI), the US Department of Energy is sponsoring a project to address some of the technical issues involved in meeting the long-range goal of 21st century reactor control systems. This project involves researchers from Oak Ridge National Laboratory, the University of Tennessee, and North Carolina State University. The research tasks under this project focus on some of the first-level breakthroughs in control design, diagnostic techniques, and information system design that will provide a path to enable the design process to be automated in the future. This paper describes the conceptual development of an integrated nuclear plant control and information system architecture, which incorporates automated control system development that can be traced to a set of technical requirements. The expectation is that an integrated plant architecture with optimal control and efficient use of diagnostic information can reduce the potential for operational errors and minimize challenges to the plant safety systems

  13. Distributed finite-time containment control for double-integrator multiagent systems.

    Science.gov (United States)

    Wang, Xiangyu; Li, Shihua; Shi, Peng

    2014-09-01

    In this paper, the distributed finite-time containment control problem for double-integrator multiagent systems with multiple leaders and external disturbances is discussed. In the presence of multiple dynamic leaders, by utilizing the homogeneous control technique, a distributed finite-time observer is developed for the followers to estimate the weighted average of the leaders' velocities at first. Then, based on the estimates and the generalized adding a power integrator approach, distributed finite-time containment control algorithms are designed to guarantee that the states of the followers converge to the dynamic convex hull spanned by those of the leaders in finite time. Moreover, as a special case of multiple dynamic leaders with zero velocities, the proposed containment control algorithms also work for the case of multiple stationary leaders without using the distributed observer. Simulations demonstrate the effectiveness of the proposed control algorithms.
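
    For intuition only, the sketch below simulates a basic asymptotic containment protocol for one-dimensional double integrators with two stationary leaders, u_i = -sum_j a_ij [(x_i - x_j) + gamma (v_i - v_j)]. It is not the finite-time, observer-based algorithm of the paper, and the graph, gains and initial states are invented; it merely shows the followers settling inside the leaders' convex hull.

      import numpy as np

      # Agents 0 and 1 are stationary leaders at 0.0 and 1.0; agents 2-4 are double-integrator followers.
      A = np.array([[0, 0, 1, 1, 0],       # symmetric adjacency matrix of the interaction graph
                    [0, 0, 0, 1, 1],
                    [1, 0, 0, 1, 0],
                    [1, 1, 1, 0, 1],
                    [0, 1, 0, 1, 0]], dtype=float)

      x = np.array([0.0, 1.0, 3.0, -2.0, 5.0])     # positions (leaders first)
      v = np.zeros(5)                              # velocities (leaders stay at rest)
      gamma, dt = 1.5, 0.01

      for _ in range(20000):
          u = np.zeros(5)
          for i in range(2, 5):                    # control only the followers
              u[i] = -sum(A[i, j] * ((x[i] - x[j]) + gamma * (v[i] - v[j])) for j in range(5))
          v[2:] += dt * u[2:]                      # double-integrator dynamics, explicit Euler
          x[2:] += dt * v[2:]

      print("final follower positions:", np.round(x[2:], 3))   # expected to lie inside [0, 1]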

  14. Shot Automation for the National Ignition Facility

    International Nuclear Information System (INIS)

    Lagin, L J; Bettenhausen, R C; Beeler, R G; Bowers, G A; Carey, R.; Casavant, D.D.; Cline, B.D.; Demaret, R.D.; Domyancic, D.M.; Elko, S.D.; Fisher, J.M.; Hermann, M.R.; Krammen, J.E.; Kohut, T.R.; Marshall, C.D.; Mathisen, D.G.; Ludwigsen, A.P.; Patterson, Jr. R.W.; Sanchez, R.J.; Stout, E.A.; Van Arsdall, P.J.; Van Wonterghem, B.M.

    2005-01-01

    A shot automation framework has been developed and deployed during the past year to automate shots performed on the National Ignition Facility (NIF) using the Integrated Computer Control System. This framework automates a 4-8 hour shot sequence that includes inputting shot goals from a physics model, set up of the laser and diagnostics, automatic alignment of laser beams and verification of status. This sequence consists of a set of preparatory verification shots, leading to amplified system shots using a 4-minute countdown, triggering during the last 2 seconds using a high-precision timing system, followed by post-shot analysis and archiving. The framework provides for flexible, model-based execution of scriptable automation called macro steps. The framework is driven by high-level shot director software that provides a restricted set of shot life cycle state transitions to 25 collaboration supervisors that automate 8-beam laser bundles and a common set of shared resources. Each collaboration supervisor commands approximately 10 subsystem shot supervisors that perform automated control and status verification. Collaboration supervisors translate shot life cycle state commands from the shot director into sequences of macro steps to be distributed to each of their shot supervisors. Each shot supervisor maintains the order of macro steps for its subsystem and supports collaboration between macro steps. They also manage failures, restarts and rejoining into the shot cycle (if necessary), manage auto/manual macro step execution, and handle collaborations with other collaboration supervisors. Shot supervisors execute macro step shot functions commanded by collaboration supervisors. Each macro step has database-driven verification phases and a scripted perform phase. This provides a highly flexible methodology for performing a variety of NIF shot types. Database tables define the order of work and dependencies (workflow) of macro steps to be performed for a

  15. Distributed Energy Systems: Security Implications of the Grid of the Future

    Energy Technology Data Exchange (ETDEWEB)

    Stamber, Kevin L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelic, Andjelka [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Taylor, Robert A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Henry, Jordan M [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    Distributed Energy Resources (DER) are being added to the nation's electric grid, and as penetration of these resources increases, they have the potential to displace or offset large-scale, capital-intensive, centralized generation. Integration of DER into operation of the traditional electric grid requires automated operational control and communication of DER elements, from system measurement to control hardware and software, in conjunction with a utility's existing automated and human-directed control of other portions of the system. Implementation of DER technologies suggests a number of gaps from both a security and a policy perspective.

  16. An automated system for the correlation measurement of γ-quanta energy distribution

    International Nuclear Information System (INIS)

    Ofengenden, R.G.; Berezin, F.N.; Patlan', Yu.V.; Shalejko, A.M.; Shidlyk, A.M.; Shchur, A.M.

    1983-01-01

    Hardware and software are briefly described for an automated system that measures the energy and time distributions of gamma-quanta and ensures accumulation and preliminary processing of experimental data while realizing various physical techniques for investigation. The system is based on the SM-4 computer and electronic-physical equipment produced in the CAMAC standard. The SM-4 computer employs the RAFOS operating system, which has some advantages in solving the tasks of multidimensional data acquisition and analysis when a high response and real-time operation are required. Certain components of software are worked out and included in the system: an operating-system version with a larger set of drivers adapted to the equipment configuration used; a library of macro definitions and a service object library; a subsystem of tuning and testing; and a subsystem of data acquisition and initial processing.

  17. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  18. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    Energy Technology Data Exchange (ETDEWEB)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  19. Fuzzy comprehensive evaluation for grid-connected performance of integrated distributed PV-ES systems

    Science.gov (United States)

    Lv, Z. H.; Li, Q.; Huang, R. W.; Liu, H. M.; Liu, D.

    2016-08-01

    Based on a discussion of the topology of integrated distributed photovoltaic (PV) power generation and energy storage (ES) systems, in single or mixed configurations, this paper focuses on analyzing the grid-connected performance of integrated distributed photovoltaic and energy storage (PV-ES) systems and proposes a comprehensive evaluation index system. A multi-level fuzzy comprehensive evaluation method based on grey correlation degree is then proposed, and the calculations of the weight matrix and the fuzzy matrix are presented step by step. Finally, a distributed integrated PV-ES power generation system connected to a 380 V low-voltage distribution network is taken as an example, and some suggestions are made based on the evaluation results.
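
    The core arithmetic of a single-level fuzzy comprehensive evaluation is a weighted composition of a membership (fuzzy) matrix. The sketch below uses invented indices, weights and membership grades purely to show that mechanism; it does not reproduce the paper's grey-correlation-based weighting, its multi-level structure or its actual index system.

      import numpy as np

      # Rows: evaluation indices (e.g. voltage deviation, harmonics, flicker -- invented here).
      # Columns: appraisal grades (excellent, good, fair, poor).
      R = np.array([[0.5, 0.3, 0.2, 0.0],      # membership degrees of index 1 in each grade
                    [0.2, 0.5, 0.2, 0.1],
                    [0.1, 0.4, 0.4, 0.1]])

      W = np.array([0.5, 0.3, 0.2])            # index weights (set by hand here, not by grey correlation)

      B = W @ R                                 # weighted-average fuzzy operator
      B = B / B.sum()                           # normalized appraisal vector
      grades = ["excellent", "good", "fair", "poor"]
      print("appraisal vector:", np.round(B, 3))
      print("overall grade   :", grades[int(np.argmax(B))])    # maximum-membership principle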

  20. Workload Capacity: A Response Time-Based Measure of Automation Dependence.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2016-05-01

    An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, COR(t) and CAND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
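
    The OR workload capacity coefficient is conventionally computed from integrated hazard functions, C_OR(t) = H_AB(t) / (H_A(t) + H_B(t)) with H(t) = -ln S(t). The sketch below estimates it from simulated response-time samples; the parametric assumptions and numbers are invented, and this is not the authors' analysis code.

      import numpy as np

      def integrated_hazard(rts, grid):
          """H(t) = -ln S(t), with S(t) the empirical survivor function of the response times."""
          rts = np.asarray(rts)
          surv = np.array([(rts > t).mean() for t in grid])
          return -np.log(np.clip(surv, 1e-12, 1.0))

      def capacity_or(rt_redundant, rt_single_a, rt_single_b, grid):
          """C_OR(t) = H_AB(t) / (H_A(t) + H_B(t)); values above 1 suggest super-capacity."""
          h_ab = integrated_hazard(rt_redundant, grid)
          h_a = integrated_hazard(rt_single_a, grid)
          h_b = integrated_hazard(rt_single_b, grid)
          return h_ab / np.maximum(h_a + h_b, 1e-12)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          rt_a = rng.exponential(0.40, 2000) + 0.2      # synthetic single-source response times (s)
          rt_b = rng.exponential(0.45, 2000) + 0.2
          rt_ab = np.minimum(rng.exponential(0.40, 2000),
                             rng.exponential(0.45, 2000)) + 0.2   # independent parallel race
          grid = np.linspace(0.3, 1.0, 8)
          print(np.round(capacity_or(rt_ab, rt_a, rt_b, grid), 2))  # close to 1 for this race model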

  1. Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.

    Science.gov (United States)

    Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter

    2005-03-01

    The objective of this study was to design and implement prototype software for capturing field data and automating the process for reporting and analyzing the distribution of mercury. The four phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate transfer into user databases, followed by (2) a re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. Results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.

  2. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  3. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
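
    The modified-DBSCAN alternative mentioned above can be approximated with off-the-shelf pieces: a plain Dynamic Time Warping (DTW) distance and scikit-learn's DBSCAN on a precomputed distance matrix. The toy sequences, eps and min_samples below are invented, and CARL's actual modified algorithm is not reproduced; the sketch only shows how DTW can serve as the clustering metric.

      import numpy as np
      from sklearn.cluster import DBSCAN

      def dtw_distance(a, b):
          """Plain O(len(a)*len(b)) dynamic time warping distance between two 1-D sequences."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      if __name__ == "__main__":
          # Toy "sensor traces": two shape families with small length differences, plus one outlier.
          seqs = [np.sin(np.linspace(0, 2 * np.pi, 30 + k)) for k in (0, 2, 4)]
          seqs += [np.linspace(0, 1, 30 + k) for k in (0, 3)]
          seqs += [np.random.default_rng(0).normal(0, 1, 30)]
          n = len(seqs)
          dist = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  dist[i, j] = dist[j, i] = dtw_distance(seqs[i], seqs[j])
          labels = DBSCAN(eps=3.0, min_samples=2, metric="precomputed").fit_predict(dist)
          print("cluster labels:", labels)      # two clusters plus a noise point (-1) expected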

  4. Smart thermal grid with integration of distributed and centralized solar energy systems

    International Nuclear Information System (INIS)

    Yang, Libing; Entchev, Evgueniy; Rosato, Antonio; Sibilio, Sergio

    2017-01-01

    Smart thermal grids (STGs) are able to perform the same function as classical grids, but are developed in order to make better use of distributed, possibly intermittent, thermal energy resources and to provide the required energy when needed through efficient resources utilization and intelligent management. District heating (DH) plays a significant role in the implementation of future smart energy systems. To fulfil its role, DH technologies must be further developed to integrate renewable resources, create low-temperature networks, and consequently to make existing or new DH networks ready for integration into future STGs. Solar heating is a promising option for low-temperature DH systems. Thermal energy storage (TES) can make the availability of the energy supply match the demand. An integration of centralized seasonal and distributed short-term thermal storages would facilitate an efficient recovery of the solar energy. This study, through modelling and simulation, investigates the impacts of such integration on the overall performance of a community-level solar DH system. The performance analysis results show that the solar DH system with integration of distributed and centralized seasonal TESs improves system overall efficiency, and reduces DH network heat losses, primary energy consumption and greenhouse gas emissions, in comparison to the one without integration. - Highlights: • STG should be designed to store energy in the most efficient way at the most effective location. • Integration of centralized seasonal and distributed TESs in a solar DH system is proposed. • Performance of such integrated solar DH system is evaluated and compared to the one without. • The integration results in reduction of primary energy consumption and GHG emission. • The integration improves the overall efficiency of the total solar energy system.

  5. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    Science.gov (United States)

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculations of spiking solutions and the matrix solutions preparation scheme, the actual spiking and matrix solutions preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of assays of varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
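
    The curve-fitting step behind an IC50 determination is typically a four-parameter logistic regression. The hedged sketch below fits synthetic concentration-response data with SciPy and reads off the IC50; it is only an illustration of that step, not part of the ICECAP platform, and all numbers are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, bottom, top, ic50, hill):
          """Four-parameter logistic (Hill) model for a concentration-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

      if __name__ == "__main__":
          rng = np.random.default_rng(7)
          conc = np.logspace(-3, 2, 10)                       # synthetic concentrations (e.g. uM)
          resp = four_pl(conc, 5.0, 100.0, 0.8, 1.2) + rng.normal(0, 2.0, conc.size)  # noisy signal
          p0 = [resp.min(), resp.max(), float(np.median(conc)), 1.0]   # crude initial guesses
          params, _ = curve_fit(four_pl, conc, resp, p0=p0, maxfev=10000)
          bottom, top, ic50, hill = params
          print(f"IC50 ~ {ic50:.2f}, Hill slope ~ {hill:.2f}, response window {bottom:.1f}-{top:.1f}")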

  6. New automated pellet/powder assay system

    International Nuclear Information System (INIS)

    Olsen, R.N.

    1975-01-01

    This paper discusses an automated, high precision, pellet/ powder assay system. The system is an active assay system using a small isotopic neutron source and a coincidence detection system. The handling of the pellet powder samples has been automated and a programmable calculator has been integrated into the system to provide control and data analysis. The versatile system can assay uranium or plutonium in either active or passive modes

  7. Developing an Integration Infrastructure for Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Zinnecker, Alicia; Aretskin-Hariton, Eliot; Kratz, Jonathan

    2014-01-01

    Turbine engine control technology is poised to make the first revolutionary leap forward since the advent of full authority digital engine control in the mid-1980s. This change aims squarely at overcoming the physical constraints that have historically limited control system hardware on aero-engines to a federated architecture. Distributed control architecture allows complex analog interfaces existing between system elements and the control unit to be replaced by standardized digital interfaces. Embedded processing, enabled by high temperature electronics, provides for digitization of signals at the source and network communications resulting in a modular system at the hardware level. While this scheme simplifies the physical integration of the system, its complexity appears in other ways. In fact, integration now becomes a shared responsibility among suppliers and system integrators. While these are the most obvious changes, there are additional concerns about performance, reliability, and failure modes due to distributed architecture that warrant detailed study. This paper describes the development of a new facility intended to address the many challenges of the underlying technologies of distributed control. The facility is capable of performing both simulation and hardware studies ranging from component to system level complexity. Its modular and hierarchical structure allows the user to focus their interaction on specific areas of interest.

  8. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  9. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    Science.gov (United States)

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

    The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides indisputable benefits in health care, like superior health information quality, prevention of medical errors and cost saving. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high level design of the main elements for the proposed architecture is provided along with diagrams of execution and operation and data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach taking into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, for achieving EHR integration with acceptable implementation cost.

  10. Distributed and multi-core computation of 2-loop integrals

    International Nuclear Information System (INIS)

    De Doncker, E; Yuasa, F

    2014-01-01

    For an automatic computation of Feynman loop integrals in the physical region we rely on an extrapolation technique where the integrals of the sequence are obtained with iterated/repeated adaptive methods from the QUADPACK 1D quadrature package. The integration rule evaluations in the outer level, corresponding to independent inner integral approximations, are assigned to threads dynamically via the OpenMP runtime in the parallel implementation. Furthermore, multi-level (nested) parallelism enables an efficient utilization of hyperthreading or larger numbers of cores. For a class of loop integrals in the unphysical region, which do not suffer from singularities in the interior of the integration domain, we find that the distributed adaptive integration methods in the multivariate PARINT package are highly efficient and accurate. We apply these techniques without resorting to integral transformations and report on the capabilities of the algorithms and the parallel performance for a test set including various types of two-loop integrals
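
    The structure of the iterated scheme (outer-rule evaluations that each trigger an independent inner adaptive integration, farmed out in parallel) can be imitated in a few lines. The toy integrand, outer rule order and pool size below are arbitrary, and this sketch is unrelated to the QUADPACK and PARINT codes themselves.

      import numpy as np
      from scipy.integrate import quad
      from concurrent.futures import ProcessPoolExecutor

      def inner_integral(x):
          """Adaptive 1-D integration over y at a fixed outer node x (a smooth toy integrand)."""
          val, _ = quad(lambda y: 1.0 / (1.0 + x * x + y * y), 0.0, 1.0)
          return val

      def iterated_integral(n_outer=32, workers=4):
          """Outer Gauss-Legendre rule on [0, 1]; each node's inner integral runs in a separate process."""
          nodes, weights = np.polynomial.legendre.leggauss(n_outer)
          x = 0.5 * (nodes + 1.0)                 # map [-1, 1] onto [0, 1]
          w = 0.5 * weights
          with ProcessPoolExecutor(max_workers=workers) as pool:
              inner_vals = list(pool.map(inner_integral, x))
          return float(np.dot(w, inner_vals))

      if __name__ == "__main__":
          print("iterated 2-D estimate:", iterated_integral())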

  11. The Automated Aircraft Rework System (AARS): A system integration approach

    Science.gov (United States)

    Benoit, Michael J.

    1994-01-01

    The Mercer Engineering Research Center (MERC), under contract to the United States Air Force (USAF) since 1989, has been actively involved in providing the Warner Robins Air Logistics Center (WR-ALC) with a robotic workcell designed to perform rework automated defastening and hole location/transfer operations on F-15 wings. This paper describes the activities required to develop and implement this workcell, known as the Automated Aircraft Rework System (AARS). AARS is scheduled to be completely installed and in operation at WR-ALC by September 1994.

  12. Advanced Air Traffic Management Research (Human Factors and Automation): NASA Research Initiatives in Human-Centered Automation Design in Airspace Management

    Science.gov (United States)

    Corker, Kevin M.; Condon, Gregory W. (Technical Monitor)

    1996-01-01

    NASA has initiated a significant thrust of research and development focused on providing the flight crew and air traffic managers automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. The core processes of control and the distribution of decision making in that control are undergoing extensive analysis. From our perspective, the human operators and the procedures by which they interact are the fundamental determinants of the safe, efficient, and flexible operation of the system. In that perspective, we have begun to explore what our experience has taught us will be the most challenging aspects of designing and integrating human-centered automation in the advanced system. We have performed a full mission simulation looking at the role shift to self-separation on board the aircraft, with the rules of the air guiding behavior and the provision of a cockpit display of traffic information and an on-board traffic alert system that seamlessly integrates into TCAS operations. We have performed an initial investigation of the operational impact of "Dynamic Density" metrics on controllers relinquishing and reestablishing full separation authority. (We follow the assumption that responsibility at all times resides with the controller.) This presentation will describe those efforts as well as the process by which we will guide the development of error-tolerant systems that are sensitive to shifts in operator workload levels and dynamic shifts in the operating point of air traffic management.

  13. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition we need dedicated status lines for assessing the validity of the input for our black box and of the output for subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)

  14. Distribution grid reconfiguration reduces power losses and helps integrate renewables

    International Nuclear Information System (INIS)

    Lueken, Colleen; Carvalho, Pedro M.S.; Apt, Jay

    2012-01-01

    A reconfigurable network can change its topology by opening and closing switches on power lines. We use real wind, solar, load, and cost data and a model of a reconfigurable distribution grid to show that reconfiguration allows a grid operator to reduce operational losses as well as to accept more intermittent renewable generation than a static configuration can. Net present value analysis of automated switch technology shows that the return on investment is negative for this test network when considering only loss reduction, but that the investment is attractive under certain conditions when reconfiguration is used to minimize curtailment. - Highlights: ► Reconfiguration may reduce losses in grids with solar or wind distributed generation. ► Reconfigurable networks can accept more solar or wind DG than static ones. ► Using reconfiguration for loss reduction would not create a positive ROI. ► Using reconfiguration to reduce curtailment usually would create a positive ROI.
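
    The net-present-value comparison reported here can be reproduced in outline: discount an annual benefit stream (avoided losses, and optionally the value of avoided curtailment) against the capital cost of the automated switches. All figures below are placeholders rather than the paper's data; they merely illustrate how the sign of the NPV can flip when curtailment reduction is counted.

      def npv(capital_cost, annual_benefit, discount_rate, years):
          """Net present value of an up-front investment with a constant annual benefit."""
          discounted = sum(annual_benefit / (1.0 + discount_rate) ** t for t in range(1, years + 1))
          return discounted - capital_cost

      if __name__ == "__main__":
          # Hypothetical numbers only: switch capital cost, discount rate and horizon.
          cost, rate, years = 120_000.0, 0.08, 15
          print("loss reduction only       :", round(npv(cost, 9_000.0, rate, years)))
          print("loss + curtailment benefit:", round(npv(cost, 18_000.0, rate, years)))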

  15. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    Science.gov (United States)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  16. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
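
    The behaviour of such a resistor/NTC heating loop can be mimicked with a toy simulation: a first-order thermal model, a beta-model NTC read-out, and simple on/off control with hysteresis around the LAMP set point. All constants below are invented and the code is not the LabTube firmware; it only illustrates the control principle.

      import math

      # Invented constants: first-order thermal model and a beta-model NTC (10 kOhm at 25 C).
      C_TH, K_LOSS, P_HEAT = 4.0, 0.05, 3.0        # heat capacity (J/K), loss (W/K), heater power (W)
      T_AMB, SETPOINT, HYST = 22.0, 65.0, 0.5      # ambient, LAMP set point, hysteresis band (deg C)
      R0, BETA, T0 = 10_000.0, 3950.0, 298.15

      def ntc_resistance(temp_c):
          """Beta-model NTC resistance at a given temperature (what the ADC effectively measures)."""
          t_k = temp_c + 273.15
          return R0 * math.exp(BETA * (1.0 / t_k - 1.0 / T0))

      def simulate(seconds=600, dt=0.1):
          temp, heater_on = T_AMB, False
          for step in range(int(seconds / dt)):
              if temp < SETPOINT - HYST:           # bang-bang control with hysteresis
                  heater_on = True
              elif temp > SETPOINT + HYST:
                  heater_on = False
              power = P_HEAT if heater_on else 0.0
              temp += dt * (power - K_LOSS * (temp - T_AMB)) / C_TH   # explicit Euler step
              if step % int(60 / dt) == 0:
                  print(f"t={step * dt:5.0f} s  T={temp:5.1f} C  "
                        f"R_NTC={ntc_resistance(temp) / 1000:5.2f} kOhm  heater={'on' if heater_on else 'off'}")

      if __name__ == "__main__":
          simulate()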

  17. The use of software agents and distributed objects to integrate enterprises: Compatible or competing technologies?

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-04-01

    Distributed object and software agent technologies are two integration methods for connecting enterprises. The two technologies have overlapping goals--interoperability and architectural support for integrating software components--though to date little or no integration of the two technologies has been made at the enterprise level. The primary difference between these two technologies is that distributed object technologies focus on the problems inherent in connecting distributed heterogeneous systems whereas software agent technologies focus on the problems involved with coordination and knowledge exchange across domain boundaries. This paper addresses the integration of these technologies in support of enterprise integration across organizational and geographic boundaries. The authors discuss enterprise integration issues, review their experiences with both technologies, and make recommendations for future work. Neither technology is a panacea. Good software engineering techniques must be applied to integrate an enterprise because scalability and a distributed software development team are realities.

  18. Using system architecture, review entry criteria, and standard work package data to enable rapid development of integrated master schedules

    OpenAIRE

    Porter, Burton W., Jr.

    2016-01-01

    Approved for public release; distribution is unlimited. While engineers must participate in the construction of the Integrated Master Schedule, this thesis proposes a way to reduce that effort through automation. When standardized sub-processes exist, automated task name construction with a consistent action/object naming convention can be applied to multiple system artifacts. These repeating sub-processes also allow the derivation of task sequence and dependencies. The Architecture-Based Uti...
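
    The core idea, automated task-name construction from a consistent action/object convention applied to system artifacts, can be illustrated with a small sketch. The verbs, artifacts and finish-to-start chaining below are hypothetical examples, not the thesis's actual work packages.

```python
# Sketch of automated task-name construction from a consistent action/object
# naming convention, as the thesis proposes. The verbs, artifacts, and sequence
# below are hypothetical examples, not the author's actual work packages.

STANDARD_REVIEW_STEPS = ["Draft", "Review", "Update", "Approve"]  # assumed sub-process

def build_tasks(artifacts, steps=STANDARD_REVIEW_STEPS):
    """Expand each system artifact into an ordered action/object task list."""
    tasks = []
    for artifact in artifacts:
        previous = None
        for step in steps:
            name = f"{step} {artifact}"            # action/object task name
            tasks.append({"task": name, "depends_on": previous})
            previous = name                        # simple finish-to-start chain
    return tasks

for t in build_tasks(["Interface Control Document", "Verification Plan"]):
    print(t)
```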

  19. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    Science.gov (United States)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  20. GENIUS : An integrated environment for supporting the design of generic automated negotiators

    NARCIS (Netherlands)

    Lin, R.; Kraus, S.; Baarslag, T.; Tykhonov, D.; Hindriks, K.; Jonker, C.M.

    2012-01-01

    The design of automated negotiators has been the focus of abundant research in recent years. However, due to difficulties involved in creating generalized agents that can negotiate in several domains and against human counterparts, many automated negotiators are domain specific and their behavior

  1. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    Science.gov (United States)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. As an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production throughout LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and continuously evolves to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services keep evolving to fit newer requirements from the ADC community. In this note, we describe the evolution and recent developments of AGIS functionality related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore service integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declarations required for PanDA Pilot site movers. Improvements of the information model and general updates are also shown; in particular, we explain how collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  2. Final Technical Report: Integrated Distribution-Transmission Analysis for Very High Penetration Solar PV

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Hale, Elaine [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Hansen, Timothy M. [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Jones, Wesley [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Biagioni, David [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Baker, Kyri [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Wu, Hongyu [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Giraldez, Julieta [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Sorensen, Harry [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Lunacek, Monte [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Merket, Noel [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Jorgenson, Jennie [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States)); Hodge, Bri-Mathias [NREL (National Renewable Energy Laboratory (NREL), Golden, CO (United States))

    2016-01-29

    Transmission and distribution simulations have historically been conducted separately, echoing their division in grid operations and planning while avoiding inherent computational challenges. Today, however, rapid growth in distributed energy resources (DERs)--including distributed generation from solar photovoltaics (DGPV)--requires understanding the unprecedented interactions between distribution and transmission. To capture these interactions, especially for high-penetration DGPV scenarios, this research project developed a first-of-its-kind, high performance computer (HPC) based, integrated transmission-distribution tool, the Integrated Grid Modeling System (IGMS). The tool was then used in initial explorations of system-wide operational interactions of high-penetration DGPV.

  3. System Performance of an Integrated Airborne Spacing Algorithm with Ground Automation

    Science.gov (United States)

    Swieringa, Kurt A.; Wilson, Sara R.; Baxley, Brian T.

    2016-01-01

    The National Aeronautics and Space Administration's (NASA's) first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the Terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools to enable precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise spacing behind another aircraft. Recent simulations and IM algorithm development at NASA have focused on trajectory-based IM operations where aircraft equipped with IM avionics are expected to achieve a spacing goal, assigned by air traffic controllers, at the final approach fix. The recently published IM Minimum Operational Performance Standards describe five types of IM operations. This paper discusses the results and conclusions of a human-in-the-loop simulation that investigated three of those IM operations. The results presented in this paper focus on system performance and integration metrics. Overall, the IM operations conducted in this simulation integrated well with ground-based decision support tools, and certain types of IM operations were able to provide improved spacing precision at the final approach fix; however, some issues were identified that should be addressed prior to implementing IM procedures into real-world operations.

  4. Distributed optical fiber sensors for integrated monitoring of railway infrastructures

    Science.gov (United States)

    Minardo, Aldo; Coscetta, Agnese; Porcaro, Giuseppe; Giannetta, Daniele; Bernini, Romeo; Zeni, Luigi

    2014-05-01

    We propose the application of a distributed optical fiber sensor based on stimulated Brillouin scattering as an integrated system for safety monitoring of railway infrastructures. The strain distribution was measured dynamically along a 60-meter length of rail track, as well as along a 3-m stone arch bridge. The results indicate that distributed sensing technology is able to provide useful information in railway traffic and safety monitoring.
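
    In Brillouin-based distributed sensing, strain is commonly inferred from the measured Brillouin frequency shift through a linear strain coefficient. The sketch below shows that conversion; the coefficient is a typical literature value for standard single-mode fiber, not the calibration used in this rail-monitoring study.

```python
# Converting a measured Brillouin frequency shift into strain, the basic
# relation behind Brillouin-based distributed sensing. The strain coefficient
# below is a typical literature value for standard single-mode fiber, not the
# calibration used in this particular rail-monitoring study.

STRAIN_COEFF_MHZ_PER_MICROSTRAIN = 0.05   # ~0.05 MHz per microstrain (assumed)

def shift_to_strain(shift_mhz, coeff=STRAIN_COEFF_MHZ_PER_MICROSTRAIN):
    """Return strain in microstrain for a Brillouin frequency shift in MHz."""
    return shift_mhz / coeff

# Example: a 25 MHz shift relative to the unstrained baseline.
print(f"{shift_to_strain(25.0):.0f} microstrain")
```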

  5. INTEGRATING DISTRIBUTED WORK: COMPARING TASK DESIGN, COMMUNICATION, AND TACIT COORDINATION MECHANISMS

    DEFF Research Database (Denmark)

    Srikanth, K.; Puranam, P.

    2011-01-01

    We investigate coordination strategies in integrating distributed work. In the context of Business Process Offshoring (BPO), we analyze survey data from 126 offshored processes to understand both the sources of difficulty in integrating distributed work as well as how organizations overcome ... on tacit coordination, and theoretically articulate and empirically show that tacit coordination mechanisms are distinct from the well-known duo of coordination strategies: building communication channels or modularizing processes to minimize the need for communication. We discuss implications for the study...

  6. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    Liu Malin; Shao Youlin; Liu Bing

    2013-01-01

    With the trend towards large-scale production of HTR coated fuel particles, the original manual control system can no longer meet the requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive four-layer coating process of TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states. The establishment of a DCS-type (distributed control system) automated control system was proposed. According to the rigorous requirements of the preparation process for coated particles, design principles for the DCS were proposed, including coordinated control, safety and reliability, conformance to integration specifications, practicality and ease of use, and openness to future updates. A complete automated control system for the coated fuel particle preparation process was built in accordance with these principles. The automated control system was put into operation in the production of irradiated samples for the HTR-PM demonstration project. The experimental results prove that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)
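
    One common way to structure the sequence layer of such a DCS is as a guarded state machine over the process's operating states. The sketch below illustrates the idea; the state names and allowed transitions are assumptions for illustration, not the plant's actual design.

```python
# Minimal sketch of encoding coating-process operating states as a guarded
# state machine, one way a DCS sequence layer can be structured. The state
# names and transitions are illustrative, not the plant's actual design.

ALLOWED = {
    "idle":        {"heat_up"},
    "heat_up":     {"fluidize", "shutdown"},
    "fluidize":    {"coat_buffer", "shutdown"},
    "coat_buffer": {"coat_ipyc", "shutdown"},
    "coat_ipyc":   {"coat_sic", "shutdown"},
    "coat_sic":    {"coat_opyc", "shutdown"},
    "coat_opyc":   {"cool_down", "shutdown"},
    "cool_down":   {"idle"},
    "shutdown":    {"idle"},
}

class CoatingSequencer:
    def __init__(self):
        self.state = "idle"

    def transition(self, target):
        # Reject any step that is not explicitly allowed from the current state.
        if target not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {target}")
        self.state = target

seq = CoatingSequencer()
for step in ["heat_up", "fluidize", "coat_buffer", "coat_ipyc"]:
    seq.transition(step)
print(seq.state)
```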

  7. Optimal Real-time Dispatch for Integrated Energy Systems

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Guerrero, Josep M.; Rahimi-Kian, Ashkan

    2016-01-01

    With the emergence of small-scale integrated energy systems (IESs), there are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level distributed energy resources (DERs). By integrating DSM and DERs into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DER management. The optimal dispatch problem is formulated as a mixed integer nonlinear programming problem (MINLP) ...
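
    As a much-simplified stand-in for the dispatch idea, the sketch below meets a single-period load from the cheapest available resources in merit order. The actual paper formulates a mixed integer nonlinear program with demand response and comfort constraints; the resource list, capacities and prices here are invented for illustration.

```python
# A much-simplified stand-in for the dispatch idea: meet a building/microgrid
# load at one time step from the cheapest available resources (merit order).
# The real formulation is an MINLP with demand response and comfort
# constraints; the resources, capacities and prices here are made up.

RESOURCES = [                      # (name, capacity_kW, marginal_cost_per_kWh)
    ("pv",        40.0, 0.00),
    ("chp",       60.0, 0.09),
    ("battery",   20.0, 0.05),
    ("grid",    1000.0, 0.15),
]

def dispatch(load_kw):
    """Greedy merit-order dispatch of a single time step."""
    plan, remaining = {}, load_kw
    for name, cap, cost in sorted(RESOURCES, key=lambda r: r[2]):
        used = min(cap, remaining)
        if used > 0:
            plan[name] = used
            remaining -= used
    return plan

print(dispatch(load_kw=95.0))
```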

  8. Optimal Solar PV Arrays Integration for Distributed Generation

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL]; Li, Xueping [University of Tennessee, Knoxville (UTK)]

    2012-01-01

    Solar photovoltaic (PV) systems hold great potential for distributed energy generation by installing PV panels on rooftops of residential and commercial buildings. Yet challenges arise along with the variability and non-dispatchability of the PV systems that affect the stability of the grid and the economics of the PV system. This paper investigates the integration of PV arrays for distributed generation applications by identifying a combination of buildings that will maximize solar energy output and minimize system variability. Particularly, we propose mean-variance optimization models to choose suitable rooftops for PV integration based on Markowitz mean-variance portfolio selection model. We further introduce quantity and cardinality constraints to result in a mixed integer quadratic programming problem. Case studies based on real data are presented. An efficient frontier is obtained for sample data that allows decision makers to choose a desired solar energy generation level with a comfortable variability tolerance level. Sensitivity analysis is conducted to show the tradeoffs between solar PV energy generation potential and variability.
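
    The selection idea can be illustrated with a toy mean-variance calculation: choose a fixed number of rooftops (a cardinality constraint) to trade expected PV output against output variance. The sketch below brute-forces a small example instead of solving the paper's mixed integer quadratic program, and the hourly output profiles are random placeholders rather than real data.

```python
import itertools
import numpy as np

# Toy version of the rooftop-selection idea: pick a fixed number of rooftops
# (cardinality constraint) to trade off expected PV output against output
# variance, in the spirit of Markowitz mean-variance selection. The hourly
# output data are random placeholders; the paper solves a mixed-integer
# quadratic program rather than enumerating combinations.

rng = np.random.default_rng(0)
n_roofs, n_hours, k = 8, 500, 3
outputs = rng.gamma(shape=2.0, scale=1.0, size=(n_hours, n_roofs))  # fake kW profiles

mean = outputs.mean(axis=0)
cov = np.cov(outputs, rowvar=False)
risk_aversion = 0.5

best_score, best_set = -np.inf, None
for combo in itertools.combinations(range(n_roofs), k):
    w = np.zeros(n_roofs)
    w[list(combo)] = 1.0 / k                      # equal weights on chosen roofs
    score = w @ mean - risk_aversion * (w @ cov @ w)
    if score > best_score:
        best_score, best_set = score, combo

print("selected rooftops:", best_set, "score:", round(best_score, 3))
```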

  9. Developing an automated water emitting-sensing system, based on integral tensiometers placed in homogenous environment.

    Science.gov (United States)

    Dabach, Sharon; Shani, Uri

    2010-05-01

    As the population grows, irrigated agriculture is using more water and fertilizers to supply the growing food demand. However, the uptake by various plants is only 30 to 50% of the water applied. The remaining water flows to surface water and groundwater and causes their contamination by fertilizers or other toxins such as herbicides or pesticides. To improve the water use efficiency of crops and decrease the drainage below the root zone, irrigation water should be applied according to the plant demand. The aim of this work is to develop an automated irrigation system based on real-time feedback from an inexpensive and reliable integrated sensing system. This system will supply water to plants according to their demand, without any user interference during the entire growth season. To achieve this goal a sensor (Geo-Tensiometer) was designed and tested. This sensor has better contact with the surrounding soil, is more reliable and much cheaper than the ceramic cup tensiometer. A lysimeter experiment was conducted to evaluate a subsurface drip irrigation regime based on the Geo-Tensiometer and compare it to a daily irrigation regime. All of the drippers were wrapped in Geo-textile. By integrating the Geo-Tensiometer within the Geo-textile which surrounds the drippers, we created a homogenous media in the entire lysimeter in which the reading of the matric potential takes place. This media, the properties of which are set and known to us, encourages root growth therein. Root density in this media is very high; therefore most of the plant water uptake is from this area. The irrigation system in treatment A irrigated when the matric potential reached a threshold which was set every morning automatically by the system. The daily treatment included a single irrigation each morning that was set to return 120% of the evapotranspiration of the previous day. All Geo-Tensiometers were connected to an automated washing system that flushed air trapped in the Geo

  10. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
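
    The classification step can be sketched as follows: given per-session features that separate human from automated query behaviour, train a binary classifier and measure held-out accuracy. The feature names and the synthetic data are illustrative stand-ins for the physical-model and behavioural features described in the paper, and the classifier choice is arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Sketch of the classification step: given per-session features that separate
# human from automated query behaviour, train a binary classifier. The feature
# names and the synthetic data below are illustrative stand-ins for the
# physical-model and behavioural features described in the paper.

rng = np.random.default_rng(1)
n = 2000
# Assumed features: queries per minute, mean inter-query gap (s), click rate.
human = np.column_stack([rng.normal(2, 1, n), rng.normal(30, 10, n), rng.normal(0.6, 0.2, n)])
bot   = np.column_stack([rng.normal(40, 15, n), rng.normal(1.5, 1, n), rng.normal(0.05, 0.05, n)])

X = np.vstack([human, bot])
y = np.array([0] * n + [1] * n)                      # 0 = human, 1 = automated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```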

  11. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez , Matias; Monperrus , Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enab...

  12. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve the fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling...
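
    The matrix formulation of failure propagation can be sketched with a boolean adjacency matrix whose transitive closure lists all end effects reachable from each failure mode. The four-component system below is a made-up example, not the method's actual component model.

```python
import numpy as np

# Sketch of the matrix view of FMEA failure propagation: entry P[i, j] = 1
# means a failure of component i directly affects component j; the boolean
# transitive closure then lists all end effects reachable from each failure
# mode. The 4-component system below is a made-up example.

P = np.array([
    [0, 1, 0, 0],   # sensor fault  -> controller
    [0, 0, 1, 0],   # controller    -> actuator
    [0, 0, 0, 1],   # actuator      -> process output
    [0, 0, 0, 0],   # process output (terminal effect)
], dtype=bool)

def transitive_closure(adj):
    """Warshall-style closure: which effects each failure can eventually cause."""
    reach = adj.copy()
    for k in range(reach.shape[0]):
        reach |= np.outer(reach[:, k], reach[k, :])
    return reach

names = ["sensor", "controller", "actuator", "output"]
closure = transitive_closure(P)
for i, row in enumerate(closure):
    effects = [names[j] for j in range(len(names)) if row[j]]
    print(f"failure of {names[i]} propagates to: {effects}")
```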

  13. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve the fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling...

  14. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  15. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    Science.gov (United States)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  16. Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR).

    Science.gov (United States)

    Beller, Elaine; Clark, Justin; Tsafnat, Guy; Adams, Clive; Diehl, Heinz; Lund, Hans; Ouzzani, Mourad; Thayer, Kristina; Thomas, James; Turner, Tari; Xia, Jun; Robinson, Karen; Glasziou, Paul

    2018-05-19

    Systematic reviews (SR) are vital to health care, but have become complicated and time-consuming, due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR), to successfully put all the parts of automation of systematic review production together. The first meeting was held in Vienna in October 2015. We established a set of principles to enable tools to be developed and integrated into toolkits. This paper sets out the principles devised at that meeting, which cover the need for improvement in efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility of use and combining components, the need for a collaboration and varied skills, the desire for open source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation. Automation has great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.

  17. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  18. Integrated operation of electric vehicles and renewable generation in a smart distribution system

    International Nuclear Information System (INIS)

    Zakariazadeh, Alireza; Jadid, Shahram; Siano, Pierluigi

    2015-01-01

    Highlights: • The contribution of electric vehicles to providing reserve capacity is analyzed. • Decentralized energy and reserve scheduling in a distribution system is presented. • The integrated operation of renewable generation and electric vehicles is proposed. - Abstract: Distribution system complexity is increasing, mainly due to technological innovation, renewable Distributed Generation (DG) and responsive loads. This complexity makes the monitoring, control and operation of distribution networks difficult for Distribution System Operators (DSOs). In order to cope with this complexity, a novel method for the integrated operational planning of a distribution system is presented in this paper. The method introduces the figure of the aggregator, conceived as an intermediate agent between end-users and DSOs. In the proposed method, energy and reserve scheduling is carried out by both aggregators and the DSO. Moreover, Electric Vehicles (EVs) are considered as responsive loads that can participate in ancillary service programs by providing reserve to the system. The efficiency of the proposed method is evaluated on an 84-bus distribution test system. Simulation results show that the integrated scheduling of EVs and renewable generators can mitigate the negative effects related to the uncertainty of renewable generation

  19. A distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Verho, P.; Jaerventausta, P.; Kaerenlampi, M.; Paulasaari, H. [Tampere Univ. of Technology (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    The range of new distribution automation applications under development is considerable nowadays. One of the most interesting areas is the development of a distribution management system (DMS) as an expansion of the traditional SCADA system. At the power transmission level such a system is called an energy management system (EMS). The idea of these expansions is to provide supporting tools for control center operators in system analysis and operation planning. The data needed for the new applications are mainly available in existing systems. Thus the computer systems of utilities must be integrated. The main data sources for the new applications in the control center are the AM/FM/GIS (i.e. the network database system), the SCADA, and the customer information system (CIS). The new functions can be embedded in some existing computer system. This means a strong dependency on the vendor of the existing system. An alternative strategy is to develop an independent system which is integrated with other computer systems using well-defined interfaces. The latter approach makes it possible to use the new applications in various computer environments, having only a weak dependency on the vendors of the other systems. In the research project this alternative is preferred and used in developing an independent distribution management system.

  20. A distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Verho, P; Jaerventausta, P; Kaerenlampi, M; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    The range of new distribution automation applications under development is considerable nowadays. One of the most interesting areas is the development of a distribution management system (DMS) as an expansion of the traditional SCADA system. At the power transmission level such a system is called an energy management system (EMS). The idea of these expansions is to provide supporting tools for control center operators in system analysis and operation planning. The data needed for the new applications are mainly available in existing systems. Thus the computer systems of utilities must be integrated. The main data sources for the new applications in the control center are the AM/FM/GIS (i.e. the network database system), the SCADA, and the customer information system (CIS). The new functions can be embedded in some existing computer system. This means a strong dependency on the vendor of the existing system. An alternative strategy is to develop an independent system which is integrated with other computer systems using well-defined interfaces. The latter approach makes it possible to use the new applications in various computer environments, having only a weak dependency on the vendors of the other systems. In the research project this alternative is preferred and used in developing an independent distribution management system.

  1. Leadership in building automation aspired; Fuehrungsrolle in der Gebaeudeautomation angestrebt

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-04-01

    Siemens Building Technologies AG (SBT) aspires to global leadership in building automation and control. Building Automation, one of the six Divisions of Siemens Building Technologies, plans to increasingly offer concepts with comprehensive support for specific customer groups via its more than 500 branch offices worldwide. The basis for future building management systems is the new Desigo system, which includes not only integrated overall systems but also web-based services. The new building automation and control system is the culmination of the integration process for the systems of Landis and Gyr, Staefa Control System and Siemens GTA. (orig.) [Translated from German] Building Automation, one of the six Divisions of Siemens Building Technologies AG, will present the new Desigo building automation system at the upcoming international 'Light+Building 2002' trade fair. This new system, which builds on international standards, emerged from the merger of the building automation systems of Landis and Gyr, Staefa Control System and Siemens GTA. (orig.)

  2. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Munoz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesus; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaro, Miquel; Perez-Neira, Ana; Casellas, Ramon; Martinez, Ricardo; Nunez-Martinez, Jose; Requena Esteso, Manuel; Pubill, David; Font-Bach, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  3. Evaluation of Representative Smart Grid Investment Grant Project Technologies: Distributed Generation

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Ruchi; Vyakaranam, Bharat GNVSR

    2012-02-14

    This document is one of a series of reports estimating the benefits of deploying technologies similar to those implemented on the Smart Grid Investment Grant (SGIG) projects. Four technical reports cover the various types of technologies deployed in the SGIG projects: distribution automation, demand response, energy storage, and renewables integration. A fifth report in the series examines the benefits of deploying these technologies on a national level. This technical report examines the impacts of the addition of renewable resources (solar and wind) in the distribution system as deployed in the SGIG projects.

  4. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements gained in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of distributing functions between the operator and technical means are outlined. The processes subject to automation are enumerated. Developing optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shutdown or emergency situations, is becoming increasingly important [ru]

  5. Design automation, languages, and simulations

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    As the complexity of electronic systems continues to increase, the micro-electronic industry depends upon automation and simulations to adapt quickly to market changes and new technologies. Compiled from chapters contributed to CRC's best-selling VLSI Handbook, this volume covers a broad range of topics relevant to design automation, languages, and simulations. These include a collaborative framework that coordinates distributed design activities through the Internet, an overview of the Verilog hardware description language and its use in a design environment, hardware/software co-design, syst

  6. Effects of further integration of distributed generation on the electricity market

    NARCIS (Netherlands)

    Frunt, J.; Kling, W.L.; Myrzik, J.M.A.; Nobel, Frank; Klaar, D.A.M.

    2006-01-01

    Environmental concern leads to legislation to stimulate the further integration of renewable energy in the Dutch electricity supply system. Distributed generation is suited for the integration of renewable energy sources. Furthermore it can be used to generate both heat and electricity in a more

  7. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    Science.gov (United States)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Together they cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  8. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  9. Automated packing systems: review of industrial implementations

    Science.gov (United States)

    Whelan, Paul F.; Batchelor, Bruce G.

    1993-08-01

    A rich theoretical background to the problems that occur in the automation of material handling can be found in operations research, production engineering, systems engineering and automation, more specifically machine vision, literature. This work has contributed towards the design of intelligent handling systems. This paper will review the application of these automated material handling and packing techniques to industrial problems. The discussion will also highlight the systems integration issues involved in these applications. An outline of one such industrial application, the automated placement of shape templates on to leather hides, is also discussed. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize the leather waste, before they are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper will outline the problems involved in the full automation of such a procedure.

  10. Reliability assessment of distribution system with the integration of renewable distributed generation

    International Nuclear Information System (INIS)

    Adefarati, T.; Bansal, R.C.

    2017-01-01

    Highlights: • Addresses impacts of renewable DG on the reliability of the distribution system. • Multi-objective formulation for maximizing the cost saving with integration of DG. • Uses Markov model to study the stochastic characteristics of the major components. • The investigation is done using modified RBTS bus test distribution system. • Proposed approach is useful for electric utilities to enhance the reliability. - Abstract: Recent studies have shown that renewable energy resources will contribute substantially to future energy generation owing to the rapid depletion of fossil fuels. Wind and solar energy resources are major sources of renewable energy that have the ability to reduce the energy crisis and the greenhouse gases emitted by the conventional power plants. Reliability assessment is one of the key indicators to measure the impact of the renewable distributed generation (DG) units in the distribution networks and to minimize the cost that is associated with power outage. This paper presents a comprehensive reliability assessment of the distribution system that satisfies the consumer load requirements with the penetration of wind turbine generator (WTG), electric storage system (ESS) and photovoltaic (PV). A Markov model is proposed to access the stochastic characteristics of the major components of the renewable DG resources as well as their influence on the reliability of a conventional distribution system. The results obtained from the case studies have demonstrated the effectiveness of using WTG, ESS and PV to enhance the reliability of the conventional distribution system.
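
    The stochastic behaviour of a component is often captured with a two-state (up/down) Markov model whose steady-state solution gives the availability used in reliability indices. A minimal sketch follows; the failure and repair rates are assumed values, not parameters from the study.

```python
# Two-state Markov model (up/down) of the kind used to represent the stochastic
# behaviour of a renewable DG component. The failure and repair rates below are
# illustrative, not values from the study.

FAILURE_RATE = 0.5     # failures per year (lambda), assumed
REPAIR_RATE = 60.0     # repairs per year (mu), assumed

def steady_state_availability(lam, mu):
    """Long-run probability of finding the component in the 'up' state."""
    return mu / (lam + mu)

A = steady_state_availability(FAILURE_RATE, REPAIR_RATE)
print(f"availability: {A:.4f}, expected annual downtime: {(1 - A) * 8760:.1f} h")
```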

  11. Network-Based Real-time Integrated Fire Detection and Alarm (FDA) System with Building Automation

    Science.gov (United States)

    Anwar, F.; Boby, R. I.; Rashid, M. M.; Alam, M. M.; Shaikh, Z.

    2017-11-01

    Fire alarm systems have become an increasingly important lifesaving technology in many respects, with applications to detect, monitor and control any fire hazard. A large sum of money is spent annually to install and maintain fire alarm systems in buildings to protect property and lives from the unexpected spread of fire. Several methods have already been developed and are improving on a daily basis to reduce cost as well as increase quality. An integrated Fire Detection and Alarm (FDA) system with building automation was studied to reduce cost and improve reliability by preventing false alarms. This work proposes an improved framework for an FDA system to ensure a robust intelligent network of FDA control panels in real time. A shortest-path algorithm was chosen for a series of buildings connected by a fiber-optic network. The framework shares information and communicates with each fire alarm panel connected in a peer-to-peer configuration and declares the network state using network address declaration from any building connected to the network. The fiber-optic connection was proposed to reduce signal noise, thus enabling large area coverage, real-time communication and long-term safety. Based on this proposed method, an experimental setup was designed and a prototype system was developed to validate the performance in practice. The distributed network system was also proposed to connect with an optional remote monitoring terminal panel to validate the proposed network performance and ensure fire survivability where information is sequentially transmitted. The proposed FDA system differs from traditional fire detection and alarm systems in terms of topology, as it manages a group of buildings in an optimal and efficient manner.
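
    The shortest-path element can be illustrated with a standard Dijkstra computation over a small panel network. The topology, link costs and panel names below are invented for illustration; the paper's actual network and algorithm details may differ.

```python
import heapq

# Sketch of the shortest-path idea for routing alarm traffic between FDA panels
# connected over a fibre network. The topology and link weights are made up.

NETWORK = {                             # adjacency list: node -> [(neighbour, cost)]
    "panel_A": [("panel_B", 1.0), ("panel_C", 4.0)],
    "panel_B": [("panel_A", 1.0), ("panel_C", 1.5), ("panel_D", 5.0)],
    "panel_C": [("panel_A", 4.0), ("panel_B", 1.5), ("panel_D", 1.0)],
    "panel_D": [("panel_B", 5.0), ("panel_C", 1.0)],
}

def dijkstra(graph, source):
    """Return the cheapest known cost from source to every reachable panel."""
    dist = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for neighbour, cost in graph[node]:
            new_d = d + cost
            if new_d < dist.get(neighbour, float("inf")):
                dist[neighbour] = new_d
                heapq.heappush(queue, (new_d, neighbour))
    return dist

print(dijkstra(NETWORK, "panel_A"))
```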

  12. [Quality of buffy-coat-derived platelet concentrates prepared using automated system terumo automated centrifuge and separator integration (TACSI)].

    Science.gov (United States)

    Zebrowska, Agnieszka; Lipska, Alina; Rogowska, Anna; Bujno, Magdalena; Nedzi, Marta; Radziwon, Piotr

    2011-03-01

    Platelet recovery, viability, and function are strongly dependent on the method of preparation of the platelet concentrate (PC). Glucose consumption, pH decrease, and release of alpha granules during storage impair the clinical effectiveness of platelet concentrates. The aim was to compare the quality of buffy-coat-derived platelet concentrates prepared using the automated Terumo Automated Centrifuge and Separator Integration (TACSI) system and stored over 7 days. PCs were prepared from buffy coats using a manual method (group I) or the automated TACSI system (group II). Fifteen PCs, each prepared from 5 buffy coats, were stored over 7 days at 22-24 degrees C and tested. Samples were taken from the PC containers on days 1 and 7. The following laboratory tests were performed: number of platelets, platelet-derived microparticles, CD62P expression, platelet adhesion, pH, glucose, and lactate dehydrogenase activity. We observed higher expression of CD62P in PCs prepared using the manual method compared to those produced automatically. Platelet recovery was significantly higher in PCs prepared using the automated system than with the manual method. Compared to manual methods, the automated system for preparation of buffy-coat-derived PCs is more efficient and enables production of platelet concentrates of higher quality.

  13. Does Automation Improve Stock Market Efficiency? Evidence from Ghana

    OpenAIRE

    Mensah, Justice T.; Pomaa-Berko, Maame; Adom, Philip Kofi

    2012-01-01

    As a burgeoning capital market in an emerging economy, automation of the stock market is regarded as a major step towards integrating the financial market as a conduit for economic growth. The automation of the Ghana Stock Exchange (GSE) in 2008 is expected among other things to improve the efficiency of the market. This paper therefore investigates the impact of the automation on the efficiency of the GSE within the framework of the weak-form Efficient Market Hypothesis (EMH) using daily mar...

  14. An automated optofluidic biosensor platform combining interferometric sensors and injection moulded microfluidics.

    Science.gov (United States)

    Szydzik, C; Gavela, A F; Herranz, S; Roccisano, J; Knoerzer, M; Thurgood, P; Khoshmanesh, K; Mitchell, A; Lechuga, L M

    2017-08-08

    A primary limitation preventing practical implementation of photonic biosensors within point-of-care platforms is their integration with fluidic automation subsystems. For most diagnostic applications, photonic biosensors require complex fluid handling protocols; this is especially prominent in the case of competitive immunoassays, commonly used for detection of low-concentration, low-molecular weight biomarkers. For this reason, complex automated microfluidic systems are needed to realise the full point-of-care potential of photonic biosensors. To fulfil this requirement, we propose an on-chip valve-based microfluidic automation module, capable of automating such complex fluid handling. This module is realised through application of a PDMS injection moulding fabrication technique, recently described in our previous work, which enables practical fabrication of normally closed pneumatically actuated elastomeric valves. In this work, these valves are configured to achieve multiplexed reagent addressing for an on-chip diaphragm pump, providing the sample and reagent processing capabilities required for automation of cyclic competitive immunoassays. Application of this technique simplifies fabrication and introduces the potential for mass production, bringing point-of-care integration of complex automated microfluidics into the realm of practicality. This module is integrated with a highly sensitive, label-free bimodal waveguide photonic biosensor, and is demonstrated in the context of a proof-of-concept biosensing assay, detecting the low-molecular weight antibiotic tetracycline.

  15. System Integration of Distributed Power for Complete Building Systems: Phase 2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, R.

    2003-12-01

    This report describes NiSource Energy Technologies Inc.'s second year of a planned 3-year effort to advance distributed power development, deployment, and integration. Its long-term goal is to design ways to extend distributed generation into the physical design and controls of buildings. NET worked to meet this goal through advances in the implementation and control of combined heat and power systems in end-user environments and a further understanding of electric interconnection and siting issues. The specific objective of work under this subcontract is to identify the system integration and implementation issues of DG and develop and test potential solutions to these issues. In addition, recommendations are made to resolve identified issues that may hinder or slow the integration of integrated energy systems into the national energy picture.

  16. Automated Resource Classifier for agglomerative functional ...

    Indian Academy of Sciences (India)

    2007-06-16

    Jun 16, 2007 ... Automated resource; functional classification; integrative biology ... which is an open source software meeting the user requirements of flexibility. ... entries into any of the 7 basic non-overlapping functional classes: Cell wall, ...

  17. Thermal battery automated assembly station conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, D

    1988-08-01

    Thermal battery assembly involves many operations which are labor-intensive. In August 1986, a project team was formed at GE Neutron Devices to investigate and evaluate more efficient and productive battery assembly techniques through the use of automation. The result of this study was the acceptance of a plan to automate the piece-part pellet fabrication and battery stacking operations by using computerized pellet presses and robots which would be integrated by a main computer. This report details the conceptual design and development plan to be followed in the fabrication, development, and implementation of a thermal battery automated assembly station. 4 figs., 8 tabs.

  18. User-friendly establishment of trust in distributed home automation networks

    DEFF Research Database (Denmark)

    Hjorth, Theis Solberg; Madsen, Per Printz; Torbensen, Rune

    2012-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented ... of predefined pictograms. This method is designed to scale from smart-phones and tablets down to low-resource embedded systems. The presented approach is supported by an extensive literature study, and the ease of use and feasibility of the method has been indicated through a preliminary user study...

  19. Distribution automation and control support; Analysis and interpretation of DAC working group results for use in project planning

    Science.gov (United States)

    Klock, P.; Evans, D.

    1979-01-01

    The Executive Summary and Proceedings of the Working Group Meeting were analyzed to identify specific projects appropriate for Distribution Automation and Control (DAC) RD&D. Specific projects that should be undertaken in the DAC RD&D program were recommended. The projects are presented under broad categories of work selected based on ESC's interpretation of the results of the Working Group Meeting. Some of the projects are noted as utility industry projects. The ESC recommendations regarding program management are presented, and utility versus Government management responsibilities are noted.

  20. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.

    Science.gov (United States)

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2016-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that make integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.
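
    OEE is conventionally computed as Availability x Performance x Quality from the kind of counters an MTConnect agent can expose (planned time, downtime, cycle counts, good parts). The sketch below shows that arithmetic with fabricated shift numbers; it is not tied to the ISO 22400 data model or to any specific MTConnect implementation.

```python
# OEE in its usual Availability x Performance x Quality form, computed from
# the kind of counters an MTConnect agent can expose. The shift numbers below
# are fabricated for illustration.

def oee(planned_time_min, downtime_min, ideal_cycle_time_min, total_count, good_count):
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality, (availability, performance, quality)

value, parts = oee(planned_time_min=480, downtime_min=45,
                   ideal_cycle_time_min=1.2, total_count=320, good_count=308)
print(f"OEE = {value:.2%}  (A, P, Q) = {tuple(round(p, 3) for p in parts)}")
```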

  1. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation.   Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  2. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation. Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  3. Automated Podcasting System for Universities

    Directory of Open Access Journals (Sweden)

    Ypatios Grigoriadis

    2013-03-01

    This paper presents the results achieved at Graz University of Technology (TU Graz) in the field of automating the process of recording and publishing university lectures in a very new way. It outlines cornerstones of the development and integration of an automated recording system, such as the lecture hall setup, the recording hardware and software architecture, as well as the development of a text-based search for the final product by method of indexing video podcasts. Furthermore, the paper takes a look at didactical aspects, evaluations done in this context and a future outlook.

  3. Crew/Automation Interaction in Space Transportation Systems: Lessons Learned from the Glass Cockpit

    Science.gov (United States)

    Rudisill, Marianne

    2000-01-01

    The progressive integration of automation technologies in commercial transport aircraft flight decks - the 'glass cockpit' - has had a major, and generally positive, impact on flight crew operations. Flight deck automation has provided significant benefits, such as economic efficiency, increased precision and safety, and enhanced functionality within the crew interface. These enhancements, however, may have been accrued at a price, such as complexity added to crew/automation interaction that has been implicated in a number of aircraft incidents and accidents. This report briefly describes 'glass cockpit' evolution. Some relevant aircraft accidents and incidents are described, followed by a more detailed description of human/automation issues and problems (e.g., crew error, monitoring, modes, command authority, crew coordination, workload, and training). This paper concludes with example principles and guidelines for considering 'glass cockpit' human/automation integration within space transportation systems.

  4. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues
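
    The density calculation described above can be approximated with a Euclidean distance transform: each detected feature is assigned a normalized depth between the nuclear perimeter (0) and the nuclear center (1), and feature counts in each depth shell are divided by the shell area. The sketch below is only a schematic reconstruction of that step, not the authors' implementation, and assumes a binary nucleus mask and feature coordinates are already available.

```python
# Schematic radial-density step: bright-feature density vs. normalized distance
# from the nuclear perimeter. Segmentation and feature detection are assumed done.
import numpy as np
from scipy.ndimage import distance_transform_edt

def radial_density(nucleus_mask, features, n_bins=10):
    """nucleus_mask: 2-D bool array; features: iterable of (row, col) pixels."""
    depth = distance_transform_edt(nucleus_mask)      # distance to the perimeter
    norm_depth = depth / depth.max()                  # 0 at the edge, 1 at the center
    feat_depths = [norm_depth[r, c] for r, c in features]
    counts, edges = np.histogram(feat_depths, bins=n_bins, range=(0.0, 1.0))
    shell_areas, _ = np.histogram(norm_depth[nucleus_mask], bins=edges)
    return counts / np.maximum(shell_areas, 1)        # density per depth shell

# Toy example: a disc-shaped "nucleus" with one central and one peripheral feature.
yy, xx = np.mgrid[:64, :64]
mask = (yy - 32) ** 2 + (xx - 32) ** 2 < 30 ** 2
print(radial_density(mask, [(32, 32), (32, 10)]))
```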

  5. Distribution Grid Integration Costs Under High PV Penetrations Workshop

    Science.gov (United States)

    Residual web-page text from the workshop agenda; the recoverable topics include the utility business model and structure (policies and regulations, revenue requirements, and investment practices), Panel 3 on future directions in grid integration cost-benefit analysis, the determination of how distribution grid integration costs feed into utility planning, and notes on future needs. All speakers were asked to include their opinions on future needs.

  6. Automating ATLAS Computing Operations using the Site Status Board

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Campana, S; Di Girolamo, A; Espinal Curull, X; Gayazov, S; Magradze, E; Nowotka, MM; Rinaldi, L; Saiz, P; Schovancova, J; Stewart, GA; Wright, M

    2012-01-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The presentation will describe how SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in SSB. It will demonstrate the positive impact of the use of SS...
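
    As a hedged illustration of the usability idea only (the actual SSB metrics, thresholds, and exclusion policy are not reproduced here), a site's usability can be thought of as the OK fraction of its recent status history, with sites falling below a threshold flagged for automatic exclusion:

```python
# Illustrative only: compute a site "usability" figure from a status history and
# flag sites below a threshold. Not the actual SSB schema or algorithm.
from datetime import datetime, timedelta

def usability(history, window_end, window=timedelta(days=1)):
    """history: time-sorted list of (timestamp, status) with status "OK"/"ERROR".
    Returns the fraction of the trailing window spent in the OK state."""
    start = window_end - window
    ok = total = timedelta(0)
    for (t0, status), (t1, _) in zip(history, history[1:] + [(window_end, None)]):
        lo, hi = max(t0, start), min(t1, window_end)
        if hi > lo:
            total += hi - lo
            if status == "OK":
                ok += hi - lo
    return ok / total if total else 0.0

now = datetime(2012, 6, 1, 12, 0)
hist = [(now - timedelta(hours=24), "OK"), (now - timedelta(hours=3), "ERROR")]
u = usability(hist, now)
print(f"usability={u:.2f}", "EXCLUDE" if u < 0.90 else "KEEP")
```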

  7. Impact of automation on mass spectrometry.

    Science.gov (United States)

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, and multiplex capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains high in cost, labor intensive, and limited in throughput. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation, and proposes a step-wise, incremental approach to achieving total automation by reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. 23rd International Conference on Flexible Automation & Intelligent Manufacturing

    CERN Document Server

    2013-01-01

    The proceedings include the set of revised papers from the 23rd International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2013). This conference aims to provide an international forum for the exchange of leading edge scientific knowledge and industrial experience regarding the development and integration of the various aspects of Flexible Automation and Intelligent Manufacturing Systems covering the complete life-cycle of a company’s Products and Processes. Contents will include topics such as: Product, Process and Factory Integrated Design, Manufacturing Technology and Intelligent Systems, Manufacturing Operations Management and Optimization and Manufacturing Networks and MicroFactories.

  9. Survey and comparison of automated UT systems

    International Nuclear Information System (INIS)

    Neeley, V.I.; Avioli, M.J.

    1988-01-01

    In the past decade, the limitations of manual UT inspections have become more severe. Perhaps the best evidence of this has been the problem of intergranular stress corrosion cracking (IGSCC) in boiling water reactors (BWR). The onset of this problem clearly showed that better and more sophisticated UT inspection methods must be developed to assure the industry that an appropriate level of inspection integrity could be maintained. While automated UT inspection systems have been under development for some time, this event certainly spurred this activity and has resulted in a variety of commercial systems. The intent of this project, sponsored by EPRI, is to develop a utility engineer's 'buyer's guide' to automated UT systems. Comparisons of different automated UT systems, along with results of questionnaires on the costs and effectiveness of manual versus automated UT, are reviewed. (author)

  10. Automating Ontological Annotation with WordNet

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of each concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
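
    A toy approximation of the class-assignment step (not the platform described above) is to use WordNet's coarse lexicographer files as a manageable set of concept classes and to tag each word with the class of its most frequent sense; the sketch assumes the NLTK WordNet corpus is installed.

```python
# Toy ontological annotation: tag each word with the lexicographer file (coarse
# semantic class) of its most frequent WordNet sense.
# Setup assumed: pip install nltk && python -m nltk.downloader wordnet
from nltk.corpus import wordnet as wn

def concept_class(word):
    synsets = wn.synsets(word)
    return synsets[0].lexname() if synsets else None   # e.g. "noun.artifact"

for w in ["reactor", "annotation", "evidence", "run"]:
    print(w, "->", concept_class(w))
```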

  11. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  12. The nucleon-nucleon correlations and the integral characteristics of the potential distributions in nuclei

    International Nuclear Information System (INIS)

    Knyaz'kov, O.M.; Kukhtina, I.N.

    1989-01-01

    The integral characteristics of the potential distribution in nuclei, namely the volume integrals, moments and mean square radii, are studied in the framework of the semimicroscopic approach to the interaction of low-energy nucleons with nuclei, on the basis of the exchange nucleon-nucleon correlations and the density dependence of effective forces. The ratio of the normalized multipole moments of the potential and matter distributions is investigated. The energy dependence of the integral characteristics is analyzed. 15 refs.; 2 tabs
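
    For reference, the quantities named above are conventionally defined as follows for a spherically symmetric distribution V(r); the normalization actually used by the authors is not stated in the abstract, so these are the standard textbook forms:

```latex
% Standard (assumed) definitions for a spherically symmetric V(r):
\begin{aligned}
  J_V &= 4\pi \int_0^\infty V(r)\, r^2 \, dr
        && \text{(volume integral)} \\
  \langle r^k \rangle_V &= \frac{\int_0^\infty V(r)\, r^{k+2}\, dr}
                                {\int_0^\infty V(r)\, r^{2}\, dr}
        && \text{(radial moments; } k=2 \text{ gives the mean square radius)}
\end{aligned}
```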

  13. Conference on renewable energies integration to power grids

    International Nuclear Information System (INIS)

    Laffaille, Didier; Bischoff, Torsten; Merkel, Marcus; Rohrig, Kurt; Glatigny, Alain; Quitmann, Eckard; Lehec, Guillaume; Teirlynck, Thierry; Stahl, Oliver

    2014-01-01

    The French-German office for Renewable energies (OFAEnR) organised a conference on the integration of renewable energies into power grids. In the framework of this French-German exchange of experience, more than 150 participants exchanged views on the perspectives and possible solutions of this integration in order to guarantee security of supply and grid stability in a context of increasing injection and decentralization of renewable power sources. This document brings together the available presentations (slides) made during this event: 1 - French distribution grids - Overview and perspectives (Didier Laffaille); 2 - Distribution Grids in Germany - Overview and Perspective (Torsten Bischoff); 3 - Integration of renewable energies into distribution grids - a case example from Germany (Marcus Merkel); 4 - Regeneratives Kombikraftwerk Deutschland: System Services with 100 % Renewable energies (Kurt Rohrig); 5 - Overview of the different grid instrumentation-control and automation tools (Alain Glatigny); 6 - Which Ancillary Services needs the Power System? The contribution from Wind Power Plants (Eckard Quitmann); 7 - The Flexibility Aggregator - the example of the GreenLys Project (Guillaume Lehec); 8 - Energy Pool - Providing flexibility to the electric system. Consumption cut-off solutions in France (Thierry Teirlynck); 9 - Demand Response experiences from Germany (Oliver Stahl)

  14. Identifying Requirements for Effective Human-Automation Teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; John O' Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  15. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  16. Heterogeneous Wireless Networks for Smart Grid Distribution Systems: Advantages and Limitations.

    Science.gov (United States)

    Khalifa, Tarek; Abdrabou, Atef; Shaban, Khaled; Gaouda, A M

    2018-05-11

    Supporting a conventional power grid with advanced communication capabilities is a cornerstone of transforming it into a smart grid. A reliable communication infrastructure with a high throughput can lay the foundation towards the ultimate objective of a fully automated power grid with self-healing capabilities. In order to realize this objective, the communication infrastructure of a power distribution network needs to be extended to cover all substations including medium/low voltage ones. This shall enable information exchange among substations for a variety of system automation purposes with a low latency that suits time critical applications. This paper proposes the integration of two heterogeneous wireless technologies (such as WiFi and cellular 3G/4G) to provide reliable and fast communication among primary and secondary distribution substations. This integration allows the transmission of different data packets (not packet replicas) over two radio interfaces, making these interfaces act like one data pipe. Thus, the paper investigates the applicability and effectiveness of employing heterogeneous wireless networks (HWNs) in achieving the desired reliability and timeliness requirements of future smart grids. We study the performance of HWNs in a realistic scenario under different data transfer loads and packet loss ratios. Our findings reveal that HWNs can be a viable data transfer option for smart grids.
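
    The central mechanism here is that distinct packets, not replicas, are striped across the two radio interfaces so that they behave as a single logical pipe. A rough sketch of that idea, with a stand-in send() function rather than a real WiFi or cellular API, is a weighted round-robin splitter:

```python
# Sketch: stripe one packet stream over two heterogeneous interfaces without
# duplication. `send` is a placeholder, not a real driver API; the weights
# approximate the relative capacity of each link.
import itertools

def send(interface, packet):
    print(f"{interface}: {packet}")

def dual_pipe(packets, weights=(("wifi", 3), ("cellular", 1))):
    # Weighted round-robin schedule, e.g. wifi, wifi, wifi, cellular, ...
    schedule = [iface for iface, w in weights for _ in range(w)]
    for iface, packet in zip(itertools.cycle(schedule), packets):
        send(iface, packet)          # each packet goes out on exactly one link

dual_pipe([f"pkt-{i}" for i in range(8)])
```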

  17. Heterogeneous Wireless Networks for Smart Grid Distribution Systems: Advantages and Limitations

    Directory of Open Access Journals (Sweden)

    Tarek Khalifa

    2018-05-01

    Full Text Available Supporting a conventional power grid with advanced communication capabilities is a cornerstone of transforming it into a smart grid. A reliable communication infrastructure with a high throughput can lay the foundation towards the ultimate objective of a fully automated power grid with self-healing capabilities. In order to realize this objective, the communication infrastructure of a power distribution network needs to be extended to cover all substations including medium/low voltage ones. This shall enable information exchange among substations for a variety of system automation purposes with a low latency that suits time critical applications. This paper proposes the integration of two heterogeneous wireless technologies (such as WiFi and cellular 3G/4G) to provide reliable and fast communication among primary and secondary distribution substations. This integration allows the transmission of different data packets (not packet replicas) over two radio interfaces, making these interfaces act like one data pipe. Thus, the paper investigates the applicability and effectiveness of employing heterogeneous wireless networks (HWNs) in achieving the desired reliability and timeliness requirements of future smart grids. We study the performance of HWNs in a realistic scenario under different data transfer loads and packet loss ratios. Our findings reveal that HWNs can be a viable data transfer option for smart grids.

  18. Automated Subsystem Control for Life Support System (ASCLSS)

    Science.gov (United States)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum controls authority at the lowest or process control level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates the application of real-time process control and places accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), which moves the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  19. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, and was previously marketed by Rexnord Automation. It consists of three, fully redundant, distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73's and the three process controllers communicate over a fully redundant one megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73's. (author). 2 refs.; 2 figs

  1. The SSM/PMAD automated test bed project

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) autonomous subsystem project was initiated in 1984. The project's goal has been to design and develop an autonomous, user-supportive PMAD test bed simulating the SSF Hab/Lab module(s). An eighteen kilowatt SSM/PMAD test bed model with a high degree of automated operation has been developed. This advanced automation test bed contains three expert/knowledge based systems that interact with one another and with other more conventional software residing in up to eight distributed 386-based microcomputers to perform the necessary tasks of real-time and near real-time load scheduling, dynamic load prioritizing, and fault detection, isolation, and recovery (FDIR).

  2. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    OpenAIRE

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and s...

  3. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  4. Transfusion management using a remote-controlled, automated blood storage.

    Science.gov (United States)

    Pagliaro, Pasqualepaolo; Turdo, Rosalia

    2008-04-01

    Generally, the safety of transfusion therapies for patients depends in part on the distribution of the blood products. The prevention of adverse events can be aided by technological means, which, besides improving the traceability of the process, make errors less likely. In this context, the latest frontier in automation and computerisation is the remote-controlled, automated refrigerator for blood storage. Computer cross-matching is an efficient and safe method for assigning blood components, based on Information Technology applied to typing and screening. This method can be extended to the management of an automated blood refrigerator, the programme of which is interfaced with the Transfusion Service's information system. The connection we made in our Service between EmoNet and Hemosafe enables real-time, remote-controlled management of the following aspects of blood component distribution: a) release of autologous and allogeneic units already allocated to a patient, b) release of available units, which can be allocated by remote-control to known patients, in the presence of a valid computer cross-match, c) release of O-negative units of blood for emergencies. Our system combines an information database, which enables computer cross-matching, with an automated refrigerator for blood storage with controlled access managed remotely by the Transfusion Service. The effectiveness and safety of the system were validated during the 4 months of its routine use in the Transfusion Service's outpatient department. The safety and efficiency of the distribution of blood products can and must be increased by the use of technological innovations. With the EmoNet/Hemosafe system, the responsibility for the remote-controlled distribution of red blood cell concentrates remains with the chief of the Transfusion Services, through the use of automated computer procedures and supported by continuous training of technicians and nursing staff.

  5. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with an automated tool that enables the implementation of Software Engineering techniques oriented to achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  6. Automated Operations Development for Advanced Exploration Systems

    Science.gov (United States)

    Haddock, Angie T.; Stetson, Howard

    2012-01-01

    Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, recovery capabilities and also provide "single button" intelligent functions for the crew. Development, operations and safety approval experience with the Timeliner system onboard the International Space Station (ISS), which provided autonomous monitoring with response and single command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system to be in continuous execution (6 years), 24 hours a day, 7 days a week within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System, along with the execution component design from within the HAL 9000 Space Operating System, this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed and is the first step in verifying the effectiveness of the HAL 9000 Integrated Test-Bed Component [2] designs. This design paper will conclude with a summary of the current development status and future development goals as it pertains to automated command and control for the HDU.

  7. Microseismic event location using global optimization algorithms: An integrated and automated workflow

    Science.gov (United States)

    Lagos, Soledad R.; Velis, Danilo R.

    2018-02-01

    We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the different steps that lead to the microseismic events location from raw 3C data. First, we carry out the automatic detection, denoising and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for 2D and 3D usual scenarios of hydraulic fracturing processes. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
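
    To make the optimization step concrete, the fragment below is a compact, generic particle swarm search that minimizes a P-arrival travel-time misfit for a 2-D location in a synthetic homogeneous-velocity medium. It is only a schematic stand-in for the location step; the authors' workflow additionally covers detection, denoising, backazimuth estimation, and search-space restriction.

```python
# Generic PSO sketch for 2-D event location: minimize the misfit between
# observed and predicted P arrival times in a homogeneous medium (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
receivers = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
true_src, vp = np.array([420.0, 730.0]), 3000.0            # m, m/s
t_obs = np.linalg.norm(receivers - true_src, axis=1) / vp  # origin time taken as 0

def misfit(xy):
    return np.sum((np.linalg.norm(receivers - xy, axis=1) / vp - t_obs) ** 2)

n, w, c1, c2 = 30, 0.7, 1.5, 1.5                 # swarm size and PSO constants
pos = rng.uniform(0.0, 1000.0, size=(n, 2))      # 1 km x 1 km search box
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([misfit(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]
for _ in range(200):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1000.0)
    f = np.array([misfit(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]
print("estimated source:", gbest, " true source:", true_src)
```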

  8. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovation Research (SBIR) program is explained.

  9. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  10. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application of integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from variations of design features

  11. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweighs almost every advantage of using glass for high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  12. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  13. Review of Integration of Distributed Energy Resources (DERs) into Power Systems

    DEFF Research Database (Denmark)

    Wu, Qiuwei; Xu, Zhao

    2011-01-01

    An overview of the integration of distributed energy resources (DER) into power systems is presented in this report. Different aspects of DER integration into power systems are reviewed and discussed, including the needs of DER integration, various state-of-the-art DER integration concepts, and the relation of existing DER integration concepts to the electric vehicle (EV) system. The power balancing challenges brought to power systems by a high penetration of intermittent DER are discussed, especially wind power integration in the Danish context. The relevance of the integration of electric vehicles (EVs) to the DER integration concepts is also analyzed, based on the energy storage potential of EVs. Two main concepts for DER integration, the virtual power plant (VPP) and microgrids, are described and compared.

  14. Optimal distribution of integration time for intensity measurements in degree of linear polarization polarimetry.

    Science.gov (United States)

    Li, Xiaobo; Hu, Haofeng; Liu, Tiegen; Huang, Bingjing; Song, Zhanjie

    2016-04-04

    We consider the degree of linear polarization (DOLP) polarimetry system, which performs two intensity measurements at orthogonal polarization states to estimate DOLP. We show that if the total integration time of intensity measurements is fixed, the variance of the DOLP estimator depends on the distribution of integration time for two intensity measurements. Therefore, by optimizing the distribution of integration time, the variance of the DOLP estimator can be decreased. In this paper, we obtain the closed-form solution of the optimal distribution of integration time in an approximate way by employing Delta method and Lagrange multiplier method. According to the theoretical analyses and real-world experiments, it is shown that the variance of the DOLP estimator can be decreased for any value of DOLP. The method proposed in this paper can effectively decrease the measurement variance and thus statistically improve the measurement accuracy of the polarimetry system.
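
    The dependence on the time split can be reproduced numerically with a simple scan. The sketch below assumes shot-noise (Poisson) limited measurements and uses the first-order Delta-method variance of P = (I1 - I2)/(I1 + I2); the paper itself derives a closed-form optimum with the Delta method and Lagrange multipliers, which is not reproduced here.

```python
# Numerical sketch: variance of the DOLP estimator versus the split of a fixed
# total integration time T between two orthogonal intensity measurements.
# Assumes Poisson (shot-noise) statistics and a first-order Delta-method variance.
import numpy as np

def dolp_variance(i1, i2, t1, t2):
    """Delta-method variance of P = (I1 - I2)/(I1 + I2) for count rates i1, i2
    measured over integration times t1, t2 (variance of each intensity = rate/time)."""
    s = (i1 + i2) ** 2
    return (2 * i2 / s) ** 2 * i1 / t1 + (2 * i1 / s) ** 2 * i2 / t2

T, i1, i2 = 1.0, 1000.0, 400.0                 # arbitrary example rates (counts/s)
fractions = np.linspace(0.01, 0.99, 99)        # fraction of T given to channel 1
variances = [dolp_variance(i1, i2, f * T, (1.0 - f) * T) for f in fractions]
best = fractions[int(np.argmin(variances))]
print(f"optimal fraction of T on channel 1: {best:.2f}")
print(f"variance(equal split) / variance(optimal): "
      f"{dolp_variance(i1, i2, 0.5 * T, 0.5 * T) / min(variances):.3f}")
```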

  15. Rules-based analysis with JBoss Drools: adding intelligence to automation

    International Nuclear Information System (INIS)

    Ley, E. de; Jacobs, D.

    2012-01-01

    Rule engines are specialized software systems for applying conditional actions (if/then rules) on data. They are also known as 'production rule systems'. Rule engines are less well known as a software technology than the traditional procedural, object-oriented, scripting or dynamic development languages. This is a pity, as they can be a valuable addition to a development toolbox. JBoss Drools is an open-source rules engine that can easily be embedded in any Java application. Through an integration in our Passerelle process automation suite, we have been able to provide advanced solutions for intelligent process automation, complex event processing, system monitoring and alarming, automated repair, etc. This platform has been proven for many years as an automated diagnosis and repair engine for Belgium's largest telecom provider, and it is being piloted at Synchrotron Soleil for device monitoring and alarming. After an introduction to rules engines in general and JBoss Drools in particular, we will present its integration in a solution platform, some important principles and a practical use case. (authors)
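
    To give a flavor of what a production rule system does, the toy below evaluates condition/action rules against a working memory of facts until no rule adds anything new. It is a generic Python illustration of forward chaining, not Drools DRL syntax and not the Passerelle integration described above.

```python
# Toy forward-chaining production rule engine: fire every rule whose condition
# matches the working memory until a full pass adds no new facts.
# Generic illustration only; real Drools rules are written in DRL, not Python.

def run_rules(facts, rules, max_cycles=100):
    facts = set(facts)
    for _ in range(max_cycles):
        fired = False
        for condition, action in rules:
            if condition(facts):
                new_facts = action(facts) - facts
                if new_facts:
                    facts |= new_facts
                    fired = True
        if not fired:
            break
    return facts

rules = [
    (lambda f: {"link_down", "power_ok"} <= f, lambda f: {"suspect_line_card"}),
    (lambda f: "suspect_line_card" in f,       lambda f: {"open_repair_ticket"}),
]
print(run_rules({"link_down", "power_ok"}, rules))
```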

  16. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  17. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  18. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  19. Synthesis of tracers using automated radiochemistry and robotics

    International Nuclear Information System (INIS)

    Dannals, R.F.

    1992-07-01

    Synthesis of high specific activity radiotracers labeled with short-lived positron-emitting radionuclides for positron emission tomography (PET) often requires handling large initial quantities of radioactivity. High specific activities are required when preparing tracers for use in PET studies of neuroreceptors. A fully automated approach for tracer synthesis is highly desirable. This proposal involves the development of a system for the Synthesis of Tracers using Automated Radiochemistry and Robotics (STARR) for this purpose. While the long range objective of the proposed research is the development of a totally automated radiochemistry system for the production of major high specific activity 11C-radiotracers for use in PET, the specific short range objectives are the automation of 11C-methyl iodide (11CH3I) production via an integrated approach using both radiochemistry modular labstations and robotics, and the extension of this automated capability to the production of several radiotracers for PET (initially, 11C-methionine, 3-N-[11C-methyl]spiperone, and [11C]-carfentanil)

  20. Towards intelligent automation of power plant design and operations: The role of interactive simulations and distributed expert systems

    International Nuclear Information System (INIS)

    Otaduy, P.J.

    1992-01-01

    The design process of a power plant can be viewed as machine-chromosome engineering: When the final layout is implemented, the lifetime operating characteristics, constraints, strengths, and weaknesses of the resulting power-plant-specimen are durably determined. Hence, the safety, operability, maneuverability, availability, maintenance requirements, and costs of a power plant are directly related to the goodness of its electromechanical-genes. This paper addresses the desirability of incorporating distributed computing, distributed object management, and multimedia technologies to power plant engineering, in particular, to design and operations. The promise these technologies have for enhancing the quality and amount of engineering knowledge available, concurrently, online, to plant designers, maintenance crews, and operators is put into perspective. The role that advanced interactive simulations and expert systems will play in the intelligent automation of power plant design and operations is discussed

  1. A Multiagent System-Based Protection and Control Scheme for Distribution System With Distributed-Generation Integration

    DEFF Research Database (Denmark)

    Liu, Z.; Su, Chi; Hoidalen, Hans

    2017-01-01

    In this paper, a multi agent system (MAS) based protection and control scheme is proposed to deal with diverse operation conditions in distribution systems due to distributed generation (DG) integration. Based on cooperation between the DG controller and relays, an adaptive protection and control algorithm is designed for converter-based wind turbine DG to limit the influence of infeed fault current. With the consideration of DG control modes, an adaptive relay setting strategy is developed to help protective relays adapt suitable settings to the different operation conditions caused by the variations...

  2. A generalization information management system applied to electrical distribution

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, K.I.; Neumann, S.A.; Nielsen, T.D.; Bower, P.K. (Empros Systems International (US)); Hughes, B.A.

    1990-07-01

    This article presents a system solution approach that meets the requirements being imposed by industry trends and the electric utility customer. Specifically, the solution addresses electric distribution management systems. Electrical distribution management is a particularly well suited area of application because it involves a high diversity of tasks, which are currently supported by a proliferation of automated islands. Islands of automation which currently exist include (among others) distribution operations, load management, automated mapping, facility management, work order processing, and planning.

  3. Applying machine learning to pattern analysis for automated in-design layout optimization

    Science.gov (United States)

    Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh

    2018-04-01

    Building on previous work for cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk scoring methodology is used to rank patterns based on manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk. The higher risk patterns are then replaced in the design with their lower risk equivalents. The pattern selection and replacement is fully automated and suitable for use for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with quantifiable positive impact on the risk score distribution after replacement.
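
    A schematic version of the replace step (illustrative data only; the paper's risk scores are learned from manufacturing data, and its equivalence mapping comes from the pattern catalog) can be written in a few lines:

```python
# Schematic pattern replacement: score each layout pattern, then swap high-risk
# patterns for a functionally equivalent, lower-risk alternative.
# Risk scores and the equivalence map below are invented for illustration.
RISK = {"P17": 0.92, "P03": 0.15, "P88": 0.74, "P42": 0.05}
EQUIVALENTS = {"P17": "P03", "P88": "P42"}     # same function, lower risk

def replace_risky(design_patterns, threshold=0.7):
    out = []
    for p in design_patterns:
        if RISK.get(p, 0.0) >= threshold and p in EQUIVALENTS:
            out.append(EQUIVALENTS[p])         # substitute the safer pattern
        else:
            out.append(p)
    return out

print(replace_risky(["P17", "P42", "P88"]))    # -> ['P03', 'P42', 'P42']
```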

  4. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  5. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time and labor consuming process. In this research work, this challenge is picked up and a methodology and algorithms for automated design of intelligent integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid-prototyping of such systems under realization constraints and, additionally, includes features of system instance specific self-correction for sustained operation of a large volume and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform renders so called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  6. Automated Search Method for Statistical Test Probability Distribution Generation

    Institute of Scientific and Technical Information of China (English)

    周晓莹; 高建华

    2013-01-01

    A strategy based on automated search for probability distribution construction is proposed, which comprises the design of a representation format and an evaluation function for the probability distribution. Combined with a simulated annealing algorithm, an indicator is defined to formalize the automated search process based on a Markov model. Experimental results show that the method effectively improves the accuracy of the automated search and can reduce the expense of statistical testing by providing fairly efficient test data, since it successfully finds a near-optimal probability distribution within a certain time.
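
    As a generic illustration of the search loop (the representation, Markov usage model, and evaluation function from the paper are not reproduced; the fitness below is a placeholder), a simulated-annealing search over a discrete probability distribution might look like this:

```python
# Generic simulated annealing over a discrete probability distribution.
# The fitness function is a placeholder, not the paper's evaluation function.
import math
import random

random.seed(1)
TARGET = [0.5, 0.3, 0.2]                 # stand-in for the desired usage profile

def fitness(p):                          # placeholder: closeness to TARGET
    return -sum((a - b) ** 2 for a, b in zip(p, TARGET))

def neighbour(p, step=0.05):             # randomly perturb the distribution, renormalize
    q = [max(1e-6, x + random.uniform(-step, step)) for x in p]
    s = sum(q)
    return [x / s for x in q]

def anneal(n_classes=3, t0=1.0, cooling=0.995, iters=2000):
    current = [1.0 / n_classes] * n_classes
    best, t = current, t0
    for _ in range(iters):
        cand = neighbour(current)
        delta = fitness(cand) - fitness(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = cand
            if fitness(current) > fitness(best):
                best = current
        t *= cooling
    return best

print([round(x, 3) for x in anneal()])
```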

  7. Nuon integrates fragmented IT environment with Converge 'meter-to-bill' enterprise system

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, J.; Richmond, P. [Siemens Metering, Inc (Netherlands)

    2002-10-01

    Integration of the IT services of three large utility companies that merged into one is described. Prior to the merger each company owned its own data acquisition and billing applications and followed its own information processing and business rules. The newly formed company had the unenviable task of integrating these disparate applications into one streamlined and efficient company-wide system. A complicating factor was the earlier deregulation of the Dutch energy market which already forced vertically structured utilities to split into separate functional companies, with the result that meter and other system and logistic information was widely dispersed across multiple systems and locations. To bring order out of this chaos and to create a single company-wide integrated system Nuon partnered with Siemens to evaluate the existing system and to design a new one that would improve the situation. The new system that emerged is a cohesive, robust IT environment with many benefits. Some of these are: consolidated metering data in one open architecture environment; a fully automated interface to the SAP billing system; fully flexible and automated data reporting and validation functionality; data distribution capability throughout the organization and to energy consumers via the Internet; and fully automated data exchange capability with other market players and the System Operator.

  8. The Automated DC Parameter Testing of GaAs MESFETs Using the Singer Automatic Integrated Circuit Test System.

    Science.gov (United States)

    1980-09-01

    The Automated DC Parameter Testing of GaAs MESFETs Using the Singer Automatic Integrated Circuit Test System. Thomas L. Harper, 1st Lt, USAF, AFIT/EE/GE/80-7, Graduate Electrical Engineering, September 1980. Preface: "This report is in support of the ongoing effort in..." (the remainder of the scanned abstract is illegible).

  9. Integrated automation of the New Waddell Dam performance data acquisition system

    International Nuclear Information System (INIS)

    Welch, L.R.; Fields, P.E.

    1999-01-01

    New Waddell Dam, a key feature of the US Bureau of Reclamation's Central Arizona Project, had elements of its dam safety data acquisition system incorporated into the design and construction. The instrumentation array is a reflection of the dam's large size and foundation complexity. Much of the instrumentation is automated. This automation was accomplished while maintaining independent communication connections to major divisions of the instrument array. Fiber optic cables are used to provide high-quality data, free from voltage surges that could originate in a nearby powerplant switchyard or from lightning. The system has been working well but there are concerns with a lack of continued equipment manufacturer support

  10. A Framework for the Automation of Air Defence Systems

    NARCIS (Netherlands)

    Choenni, R.S.; Leijnse, C.

    The need for more efficiency in military organizations is growing. It is expected that a significant increase in efficiency can be obtained by an integration of communication and information technology. This integration may result in (sub)systems that are fully automated, i.e., systems that are

  11. Enhancing Cooperative Loan Scheme Through Automated Loan ...

    African Journals Online (AJOL)

    The concept of automation has been variously applied in most computing fields. ... competent capabilities to eliminate data inconsistency and redundancy as well as ensuring data integrity and security, ...

  12. Automated Blood Sample Preparation Unit (ABSPU) for Portable Microfluidic Flow Cytometry.

    Science.gov (United States)

    Chaturvedi, Akhil; Gorthi, Sai Siva

    2017-02-01

    Portable microfluidic diagnostic devices, including flow cytometers, are being developed for point-of-care settings, especially in conjunction with inexpensive imaging devices such as mobile phone cameras. However, two pervasive drawbacks of these have been the lack of automated sample preparation processes and cells settling out of sample suspensions, leading to inaccurate results. We report an automated blood sample preparation unit (ABSPU) to prevent blood samples from settling in a reservoir during loading of samples in flow cytometers. This apparatus automates the preanalytical steps of dilution and staining of blood cells prior to microfluidic loading. It employs an assembly with a miniature vibration motor to drive turbulence in a sample reservoir. To validate performance of this system, we present experimental evidence demonstrating prevention of blood cell settling, cell integrity, and staining of cells prior to flow cytometric analysis. This setup is further integrated with a microfluidic imaging flow cytometer to investigate cell count variability. With no need for prior sample preparation, a drop of whole blood can be directly introduced to the setup without premixing with buffers manually. Our results show that integration of this assembly with microfluidic analysis provides a competent automation tool for low-cost point-of-care blood-based diagnostics.

  13. A Software Architecture for Simulation Support in Building Automation

    Directory of Open Access Journals (Sweden)

    Sergio Leal

    2014-07-01

    Full Text Available Building automation integrates the active components in a building and, thus, has to connect components of different industries. The goal is to provide reliable and efficient operation. This paper describes how simulation can support building automation and how the deployment process of simulation-assisted building control systems can be structured. We look at the process as a whole and map it to a set of formally described workflows that can partly be automated. A workbench environment supports the process execution by means of improved planning, collaboration and deployment. This framework allows integration of existing tools, as well as manual tasks, and is, therefore, much more intricate than regular software deployment tools. The complex environment of building commissioning requires expertise in different domains, especially lighting, heating, ventilation, air conditioning, measurement and control technology, as well as energy efficiency; therefore, we present a framework for building commissioning and describe a deployment process that is capable of supporting the various phases of this approach.

  14. NextGen Technologies on the FAA's Standard Terminal Automation Replacement System

    Science.gov (United States)

    Witzberger, Kevin; Swenson, Harry; Martin, Lynne; Lin, Melody; Cheng, Jinn-Hwei

    2014-01-01

    This paper describes the integration, evaluation, and results from a high-fidelity human-in-the-loop (HITL) simulation of key NASA Air Traffic Management Technology Demonstration - 1 (ATD- 1) technologies implemented in an enhanced version of the FAA's Standard Terminal Automation Replacement System (STARS) platform. These ATD-1 technologies include: (1) a NASA enhanced version of the FAA's Time-Based Flow Management, (2) a NASA ground-based automation technology known as controller-managed spacing (CMS), and (3) a NASA advanced avionics airborne technology known as flight-deck interval management (FIM). These ATD-1 technologies have been extensively tested in large-scale HITL simulations using general-purpose workstations to study air transportation technologies. These general purpose workstations perform multiple functions and are collectively referred to as the Multi-Aircraft Control System (MACS). Researchers at NASA Ames Research Center and Raytheon collaborated to augment the STARS platform by including CMS and FIM advisory tools to validate the feasibility of integrating these automation enhancements into the current FAA automation infrastructure. NASA Ames acquired three STARS terminal controller workstations, and then integrated the ATD-1 technologies. HITL simulations were conducted to evaluate the ATD-1 technologies when using the STARS platform. These results were compared with the results obtained when the ATD-1 technologies were tested in the MACS environment. Results collected from the numerical data show acceptably minor differences, and, together with the subjective controller questionnaires showing a trend towards preferring STARS, validate the ATD-1/STARS integration.

  15. Can pilots still fly? Role distribution and hybrid interaction in advanced automated aircraft

    OpenAIRE

    Weyer, Johannes

    2015-01-01

    Recent accidents of commercial airplanes have raised the question once more whether pilots can rely on automation in order to fly advanced aircraft safely. Although the issue of human-machine interaction in aviation has been investigated frequently, profound knowledge about pilots’ perceptions and attitudes is fragmentary and partly out-dated. The paper at hand presents the results of a pilot survey, which has been guided by a collaborative perspective of human-automation decision-making. It ...

  16. Automated bar coding of air samples at Hanford (ABCASH)

    International Nuclear Information System (INIS)

    Troyer, G.L.; Brayton, D.D.; McNeece, S.G.

    1992-10-01

    This article describes the basis, main features and benefits of an automated system for tracking and reporting radioactive air particulate samples. The system was developed due to a recognized need for improving the quality and integrity of air sample data related to personnel and environmental protection. The capture, storage, and retrieval of air sample data are described. The automation of the associated data input eliminates a large potential for human error. The system utilizes personal computers, handheld computers, a commercial personal computer database package, commercial programming languages, and complete documentation to satisfy the system's automation objective.

  17. A Gordeyev integral for electrostatic waves in a magnetized plasma with a kappa velocity distribution

    International Nuclear Information System (INIS)

    Mace, R.L.

    2003-01-01

    A Gordeyev-type integral for the investigation of electrostatic waves in magnetized plasma having a kappa or generalized Lorentzian velocity distribution is derived. The integral readily reduces, in the unmagnetized and parallel propagation limits, to simple expressions involving the Z κ function. For propagation perpendicular to the magnetic field, it is shown that the Gordeyev integral can be written in closed form as a sum of two generalized hypergeometric functions, which permits easy analysis of the dispersion relation for electrostatic waves. Employing the same analytical techniques used for the kappa distribution, it is further shown that the well-known Gordeyev integral for a Maxwellian distribution can be written very concisely as a generalized hypergeometric function in the limit of perpendicular propagation. This expression, in addition to its mathematical conciseness, has other advantages over the traditional sum over modified Bessel functions form. Examples of the utility of these generalized hypergeometric series, especially how they simplify analyses of electrostatic waves propagating perpendicular to the magnetic field, are given. The new expression for the Gordeyev integral for perpendicular propagation is solved numerically to obtain the dispersion relations for the electrostatic Bernstein modes in a plasma with a kappa distribution
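
    For reference, a hedged sketch of the isotropic kappa (generalized Lorentzian) velocity distribution assumed in this line of work; the normalization below is the commonly used textbook form and is stated here as an assumption, not a formula taken from the record.

```latex
% Standard isotropic kappa (generalized Lorentzian) velocity distribution (assumed form):
% n_0 = number density, \theta = effective thermal speed, \kappa > 3/2.
f_\kappa(v) \;=\; \frac{n_0}{\left(\pi \kappa \theta^2\right)^{3/2}}
\,\frac{\Gamma(\kappa + 1)}{\Gamma\!\left(\kappa - \tfrac{1}{2}\right)}
\left(1 + \frac{v^2}{\kappa \theta^2}\right)^{-(\kappa + 1)},
\qquad
f_\kappa(v) \;\xrightarrow[\ \kappa \to \infty\ ]{}\;
\frac{n_0}{\left(\pi \theta^2\right)^{3/2}}\, e^{-v^2/\theta^2}.
```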

  18. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study...

  19. Automated pipe handling systems for new and retrofit applications in shallow drilling markets

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, P.; Fikowski, L.M. [Blackbird Well Servicing Inc., Calgary, AB (Canada)

    2003-07-01

    This presentation discussed the importance of the human interface as the main element in the development of automated mechanical systems on drilling rigs. Improvements in drilling rig designs are meant to improve manpower efficiencies and performance. The goal for Blackbird Well Servicing is to design automated and integrated processes that can be controlled manually at any point during an operation. Although some drilling operations can be fully automated and fully integrated, certain steps in the process are intentionally left open ended for human intervention. It was concluded that the consistency of performance is the most significant feature of integrated systems and that all drilling contractors should strive for smooth, steady performance rather than brute labour. Speed and efficiency increases with consistent performance. Reliability results in better performance, thereby lowering operating costs and more work for drilling contractors.

  20. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel approach to determine the magnetic vector potential via volume integration of the measured field. Results: The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well...

  1. Frameworks for Performing on Cloud Automated Software Testing Using Swarm Intelligence Algorithm: Brief Survey

    Directory of Open Access Journals (Sweden)

    Mohammad Hossain

    2018-04-01

    Full Text Available This paper surveys cloud-based automated testing software that is able to perform black-box testing, white-box testing, as well as unit and integration testing as a whole. In this paper, we discuss a few of the available automated software testing frameworks on the cloud. These frameworks are found to be more efficient and cost effective because they execute test suites over a distributed cloud infrastructure. One framework's effectiveness was attributed to having a module that accepts manual test cases from users and prioritizes them accordingly. Software testing, in general, accounts for as much as 50% of the total effort of a software development project. To lessen this effort, one of the frameworks discussed in this paper used swarm intelligence algorithms: the Ant Colony Algorithm for complete path coverage to minimize time, and Bee Colony Optimization (BCO) for regression testing to ensure backward compatibility.

  2. Automating ASW fusion

    OpenAIRE

    Pabelico, James C.

    2011-01-01

    Approved for public release; distribution is unlimited. This thesis examines ASW eFusion, an anti-submarine warfare (ASW) tactical decision aid (TDA) that utilizes Kalman filtering to improve battlespace awareness by simplifying and automating the track management process involved in anti-submarine warfare (ASW) watchstanding operations. While this program can currently help the ASW commander manage uncertainty and make better tactical decisions, the program has several limitations. Comman...

  3. Distributed Energy Resources and Dynamic Microgrid: An Integrated Assessment

    Science.gov (United States)

    Shang, Duo Rick

    The overall goal of this thesis is to improve understanding in terms of the benefit of DERs to both utility and to electricity end-users when integrated in power distribution system. To achieve this goal, a series of two studies was conducted to assess the value of DERs when integrated with new power paradigms. First, the arbitrage value of DERs was examined in markets with time-variant electricity pricing rates (e.g., time of use, real time pricing) under a smart grid distribution paradigm. This study uses a stochastic optimization model to estimate the potential profit from electricity price arbitrage over a five-year period. The optimization process involves two types of PHEVs (PHEV-10, and PHEV-40) under three scenarios with different assumptions on technology performance, electricity market and PHEV owner types. The simulation results indicate that expected arbitrage profit is not a viable option to engage PHEVs in dispatching and in providing ancillary services without more favorable policy and PHEV battery technologies. Subsidy or change in electricity tariff or both are needed. Second, it examined the concept of dynamic microgrid as a measure to improve distribution resilience, and estimates the prices of this emerging service. An economic load dispatch (ELD) model is developed to estimate the market-clearing price in a hypothetical community with single bid auction electricity market. The results show that the electricity market clearing price on the dynamic microgrid is predominantly decided by power output and cost of electricity of each type of DGs. At circumstances where CHP is the only source, the electricity market clearing price in the island is even cheaper than the on-grid electricity price at normal times. Integration of PHEVs in the dynamic microgrid will increase electricity market clearing prices. It demonstrates that dynamic microgrid is an economically viable alternative to enhance grid resilience.
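
    The single-bid-auction market clearing used in the dynamic microgrid study can be illustrated with a minimal merit-order sketch. The generator names, marginal costs, and capacities below are hypothetical and are not taken from the thesis.

```python
# Minimal merit-order market clearing sketch (hypothetical data, not the thesis model).
# Generators on the islanded microgrid are dispatched from cheapest to most expensive
# until demand is met; the clearing price is the marginal cost of the last unit used.

def clear_market(offers, demand_kw):
    """offers: list of (name, marginal_cost_per_kwh, capacity_kw)."""
    dispatched, remaining, price = [], demand_kw, None
    for name, cost, cap in sorted(offers, key=lambda o: o[1]):
        if remaining <= 0:
            break
        take = min(cap, remaining)
        dispatched.append((name, take))
        remaining -= take
        price = cost                      # the marginal unit sets the clearing price
    if remaining > 0:
        raise ValueError("insufficient generation to cover demand")
    return price, dispatched

offers = [("CHP", 0.06, 400), ("PV", 0.00, 150), ("PHEV_fleet", 0.18, 100)]
price, schedule = clear_market(offers, demand_kw=480)
print(price, schedule)   # CHP sets the price here; PHEVs only clear at higher demand
```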

  4. Evaluating Maximum Photovoltaic Integration in District Distribution Systems Considering Optimal Inverter Dispatch and Cloud Shading Conditions

    DEFF Research Database (Denmark)

    Ding, Tao; Kou, Yu; Yang, Yongheng

    2017-01-01

    As photovoltaic (PV) integration increases in distribution systems, investigating the maximum allowable PV integration capacity for a district distribution system becomes necessary in the planning phase; an optimization model is thus proposed to evaluate the maximum PV integration capacity while... However, the intermittency of solar PV energy (e.g., due to passing clouds) may affect the PV generation in the district distribution network. To address this issue, the voltage magnitude constraints under the cloud shading conditions should be taken into account in the optimization model, which can...

  5. Distribution of costs induced by the integration of RES-E power

    International Nuclear Information System (INIS)

    Barth, Ruediger; Weber, Christoph; Swider, Derk J.

    2008-01-01

    This article focuses on the distribution of costs induced by the integration of electricity generation from renewable energy sources (RES-E). The treatment to distribute these costs on different market actors is crucial for its development. For this purpose, individual actors of electricity markets and several cost categories are identified. According to the defined cost structure, possible treatments to distribute the individual cost categories on different relevant actors are described. Finally, an evaluation of the cost distribution treatments based on an economic analysis is given. Economic efficiency recommends that clearly attributable (shallow) grid connection as well as (deep) grid costs are charged to the corresponding RES-E producer and that the RES-E producers are also charged the regulating power costs. However, deep grid integration costs should be updated to reflect evolving scarcities. Also regulating power costs should reflect actual scarcity and thus be symmetric and based on real-time prices, taking into account the overall system imbalance. Moreover, the time span between the closure of the spot market and actual delivery should be chosen as short as possible to enable accurate RES-E production forecasts

  6. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk

    2016-08-01

    Full Text Available The wine industry has successfully automated grape receiving points, crushing and pressing departments, continuous fermentation installations, blending tanks, production lines for ordinary Madeira, continuously operating ethyl alcohol plants, installations for champagne wine in continuous flow, etc. As automation technology progresses, the productivity of the winemaking process is being developed in the following areas: organization of the complex automation of grape processing sites with bulk transportation of the grapes; improving the quality and durability of wines through the wide application of cold and heat treatment of wine, together with technical and microbiological control using more powerful automation equipment; the introduction of automated continuous production processes for champagne, sherry wine, cognac alcohol and Madeira; the use of complex automation at auxiliary production sites (boilers, air conditioners, refrigeration units and others); and the complex automation of entire enterprises and wine bottling sites. More sophisticated automation schemes and devices are being developed in the wine industry which enable the transition to integrated production automation and the creation of model automated enterprises that serve as laboratories for studying the main problems of automating winemaking production processes.

  7. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  8. Integrated Cost-Benefit Assessment of Customer-Driven Distributed Generation

    Directory of Open Access Journals (Sweden)

    Čedomir Zeljković

    2014-06-01

    Full Text Available Distributed generation (DG has the potential to bring respectable benefits to electricity customers, distribution utilities and community in general. Among the customer benefits, the most important are the electricity bill reduction, reliability improvement, use of recovered heat, and qualifying for financial incentives. In this paper, an integrated cost-benefit methodology for assessment of customer-driven DG is presented. Target customers are the industrial and commercial end-users that are critically dependent on electricity supply, due to high consumption, high power peak demand or high electricity supply reliability requirements. Stochastic inputs are represented by the appropriate probability models and then the Monte Carlo simulation is employed for each investment alternative. The obtained probability distributions for the prospective profit are used to assess the risk, compare the alternatives and make decisions.
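
    A minimal sketch of the Monte Carlo step of such an integrated cost-benefit assessment, assuming hypothetical probability models for avoided energy cost, outage savings, heat recovery and O&M; it only illustrates how a profit distribution per investment alternative could be built, not the authors' actual model or data.

```python
# Monte Carlo sketch of a DG cost-benefit assessment (hypothetical inputs, not the paper's data).
import random

def simulate_profit(n_trials=10_000, seed=1):
    random.seed(seed)
    profits = []
    for _ in range(n_trials):
        bill_savings   = random.gauss(mu=40_000, sigma=8_000)    # yearly avoided energy cost
        outage_savings = random.expovariate(1 / 5_000)           # avoided interruption cost
        heat_recovery  = random.uniform(2_000, 6_000)            # value of recovered heat
        o_and_m        = random.gauss(mu=15_000, sigma=2_000)    # operating & maintenance cost
        profits.append(bill_savings + outage_savings + heat_recovery - o_and_m)
    return profits

profits = sorted(simulate_profit())
mean = sum(profits) / len(profits)
p5, p95 = profits[int(0.05 * len(profits))], profits[int(0.95 * len(profits))]
print(f"expected yearly benefit {mean:,.0f}, 90% interval [{p5:,.0f}, {p95:,.0f}]")
# The resulting profit distribution can be compared across alternatives to assess risk.
```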

  9. Appropriate Automation-Integrating Technical, Human, Organisational, Economic and Cultural Factors

    NARCIS (Netherlands)

    Martin, T.; Kiwinen, J.; Rijnsdorp, J.E.; Rijnsdorp, J.E.; Rodd, M.G.; Rouse, W.B.

    1991-01-01

    Automation technology, including digital computer and communication techniques, is being applied in an ever-increasing range of private and public spheres, and reaching third world cultures not previously exposed to such technology. It is engineers' responsibility to consider the direct and indirect

  10. Power electronics for renewable and distributed energy systems a sourcebook of topologies, control and integration

    CERN Document Server

    Chakraborty, Sudipta; Kramer, William E

    2013-01-01

    While most books approach power electronics and renewable energy as two separate subjects, Power Electronics for Renewable and Distributed Energy Systems takes an integrative approach; discussing power electronic converters topologies, controls and integration that are specific to the renewable and distributed energy system applications. An overview of power electronic technologies is followed by the introduction of various renewable and distributed energy resources that includes photovoltaics, wind, small hydroelectric, fuel cells, microturbines and variable speed generation. Energy storage s

  11. Automation of an energy-autarkic manufacturing plant following IEC 61499; Automatisierung einer energieautarken Fertigungsanlage nach IEC 61499

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Christian; Hirsch, Martin; Hanisch, Hans-Michael [Halle-Wittenberg Univ., Halle (Saale) (Germany). Lehrstuhl Automatisierungstechnik

    2009-07-01

    The requirements for future manufacturing plants include, among others, seamless reconfiguration, the highest possible degree of autonomy, and ease of use and maintenance for the end user. Within the EnAS project (Energy-Autarkic Actuators and Sensors), the Automation Technology Lab in Halle has taken up the challenge of fulfilling these requirements. IEC 61499 compliant distributed controllers have been developed for the demonstrator plant with particular attention to reconfigurability. These controllers have been integrated into the process sequences of the demonstrator, and several reconfiguration scenarios have subsequently been designed. Building a human-machine interface for visualization and reconfiguration of the plant was an essential task as well. The result is a highly flexible, easily reconfigurable system, which can be regarded as a prototype for a new generation of automated manufacturing plants. (orig.)

  12. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits for utilising Radio Frequency Identification (RFID) technology. Through readers to RFID middleware systems, the information and the movements of tagged objects can be used to trigger business transactions. These features change the way of business applications for dealing with the physical world from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logics into RFID edge systems from an object-oriented perspective with emphasises on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
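
    The record does not specify the paper's "two-block buffering mechanism"; the sketch below shows one plausible reading as a generic double buffer, in which RFID read events accumulate in an active block while queries are served from the sealed block. All names and behavior here are assumptions made for illustration.

```python
# Hedged sketch of a double ("two-block") buffer for RFID edge events.
# Reads accumulate in an active block; queries are answered from the sealed block,
# so query processing never contends with the incoming event stream.
import threading

class TwoBlockBuffer:
    def __init__(self):
        self._active, self._sealed = [], []
        self._lock = threading.Lock()

    def on_rfid_event(self, tag_id, reader_id, timestamp):
        """Called by the RFID middleware for every tag read."""
        with self._lock:
            self._active.append((tag_id, reader_id, timestamp))

    def swap(self):
        """Periodically seal the active block and expose it to queries."""
        with self._lock:
            self._sealed, self._active = self._active, []

    def query(self, predicate):
        """Answer business-rule queries against the sealed block only."""
        with self._lock:
            return [e for e in self._sealed if predicate(e)]

buf = TwoBlockBuffer()
buf.on_rfid_event("EPC:123", "dock-door-1", 1700000000.0)
buf.swap()
print(buf.query(lambda e: e[1] == "dock-door-1"))
```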

  13. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  14. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  15. A Toolset for Supporting Iterative Human-Automation Interaction in Design

    Science.gov (United States)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety critical operations (e.g. transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  16. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    Science.gov (United States)

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation and for identifying the specific requirements of a laboratory and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  17. Distributional, differential and integral problems: Equivalence and existence results

    Czech Academy of Sciences Publication Activity Database

    Monteiro, Giselle Antunes; Satco, B. R.

    2017-01-01

    Vol. 2017, No. 7 (2017), pp. 1-26 ISSN 1417-3875 Institutional support: RVO:67985840 Keywords: derivative with respect to functions * distribution * Kurzweil-Stieltjes integral Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.926, year: 2016 http://www.math.u-szeged.hu/ejqtde/periodica.html?periodica=1&paramtipus_ertek=publication&param_ertek=4753

  18. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems that are largely proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task especially, if more complex functionality shall be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model based technology to achieve rich functionality and usability was implemented. (orig.)

  19. Realtime Automation Networks in moVing industrial Environments

    Directory of Open Access Journals (Sweden)

    Rafael Leidinger

    2012-04-01

    Full Text Available The radio-based wireless data communication has made the realization of new technical solutions possible in many fields of the automation technology (AT). For about ten years, a constant disproportionate growth of wireless technologies can be observed in the automation technology. However, it shows that especially for the AT, conventional technologies of office automation are unsuitable and/or not manageable. The employment of mobile services in the industrial automation technology has the potential of significant cost and time savings. This leads to an increased productivity in various fields of the AT, for example in the factory and process automation or in production logistics. In this paper technologies and solutions for an automation-suited supply of mobile wireless services will be introduced under the criteria of real time suitability, IT-security and service orientation. Emphasis will be put on the investigation and development of wireless convergence layers for different radio technologies, on the central provision of support services for an easy-to-use, central, backup enabled management of combined wired / wireless networks and on the study on integrability in a Profinet real-time Ethernet network [1].

  20. A Case Study of Reverse Engineering Integrated in an Automated Design Process

    Science.gov (United States)

    Pescaru, R.; Kyratsis, P.; Oancea, G.

    2016-11-01

    This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a product from the consumer goods industry, namely a footwear-type product with a complex shape and many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and the preferences for the chosen model, which leads to the development of customized products.

  1. The Political Economy of Automation: Occupational Automatability and Preferences for Redistribution

    OpenAIRE

    van Hoorn, Andre

    2018-01-01

    Although the importance of technological change for increasing prosperity is undisputed and economists typically deem it unlikely that labor-saving technology causes long-term employment losses, people’s anxiety about automation and its distributive consequences can be an important shaper of economic and social policies. This paper considers the political economy of automation, proposing that individuals in occupations that are more at risk of losing their job to automation have stronger pref...

  2. Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1

    Energy Technology Data Exchange (ETDEWEB)

    Nurizzo, Didier, E-mail: Didier.nurizzo@esrf.fr; Guichard, Nicolas; McSweeney, Sean; Theveneau, Pascal; Guijarro, Matias; Svensson, Olof; Mueller-Dieckmann, Christoph; Leonard, Gordon [ESRF, The European Synchrotron, 71, Avenue des Martyrs,CS 40220, 38043 Grenoble (France); Bowler, Matthew W. [EMBL Grenoble Outstation, 71 Avenue des Martyrs, CS90181, 38042 Grenoble Cedex 9 (France)

    2016-07-27

    The European Synchrotron Radiation Facility has a long standing history in the automation of experiments in Macromolecular Crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique completely automated data collection service to both academic and industrial structural biologists.

  3. 75 FR 5244 - Pipeline Safety: Integrity Management Program for Gas Distribution Pipelines; Correction

    Science.gov (United States)

    2010-02-02

    ... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...

  4. An integrated drug prescription and distribution system: challenges and opportunities.

    Science.gov (United States)

    Lanssiers, R; Everaert, E; De Win, M; Van De Velde, R; De Clercq, H

    2002-01-01

    Using the hospital's drug prescription and distribution system as a guide, benefits and drawbacks of a medical activity management system that is tightly integrated with the supply chain management of a hospital will be discussed from the point of view of various participating healthcare actors.

  5. Gauss-Kronrod-Trapezoidal Integration Scheme for Modeling Biological Tissues with Continuous Fiber Distributions

    Science.gov (United States)

    Hou, Chieh; Ateshian, Gerard A.

    2015-01-01

    Fibrous biological tissues may be modeled using a continuous fiber distribution (CFD) to capture tension-compression nonlinearity, anisotropic fiber distributions, and load-induced anisotropy. The CFD framework requires spherical integration of weighted individual fiber responses, with fibers contributing to the stress response only when they are in tension. The common method for performing this integration employs the discretization of the unit sphere into a polyhedron with nearly uniform triangular faces (finite element integration or FEI scheme). Although FEI has proven to be more accurate and efficient than integration using spherical coordinates, it presents three major drawbacks: First, the number of elements on the unit sphere needed to achieve satisfactory accuracy becomes a significant computational cost in a finite element analysis. Second, fibers may not be in tension in some regions on the unit sphere, where the integration becomes a waste. Third, if tensed fiber bundles span a small region compared to the area of the elements on the sphere, a significant discretization error arises. This study presents an integration scheme specialized to the CFD framework, which significantly mitigates the first drawback of the FEI scheme, while eliminating the second and third completely. Here, integration is performed only over the regions of the unit sphere where fibers are in tension. Gauss-Kronrod quadrature is used across latitudes and the trapezoidal scheme across longitudes. Over a wide range of strain states, fiber material properties, and fiber angular distributions, results demonstrate that this new scheme always outperforms FEI, sometimes by orders of magnitude in the number of computational steps and relative accuracy of the stress calculation. PMID:26291492
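
    As an illustration of the quadrature layout described above (Gauss-Kronrod across latitudes, trapezoidal across longitudes), here is a small sketch with a toy tension-only fiber response. The strain measure and material constant are placeholders, and unlike the paper's scheme the tension region is handled by simply zeroing compressed directions rather than by restricting the integration bounds.

```python
# Sketch: integrate a tension-only fiber response over the unit sphere.
# Adaptive Gauss-Kronrod (scipy.integrate.quad) across latitude, trapezoidal rule
# across longitude. Toy constitutive choice; not the paper's CFD implementation.
import numpy as np
from scipy.integrate import quad, trapezoid

def fiber_energy_density(theta, phi, C, k1=1.0):
    n = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])                    # fiber direction on the unit sphere
    strain = 0.5 * (n @ C @ n - 1.0)                 # Green-Lagrange fiber strain, C = F^T F
    return k1 * strain**2 if strain > 0.0 else 0.0   # fibers contribute only in tension

def integrate_sphere(C, n_phi=72):
    phis = np.linspace(0.0, 2.0 * np.pi, n_phi + 1)
    lat = [quad(lambda th: fiber_energy_density(th, phi, C) * np.sin(th),
                0.0, np.pi, limit=200)[0]
           for phi in phis]                          # Gauss-Kronrod across latitude
    return trapezoid(lat, phis)                      # trapezoidal rule across longitude

C = np.diag([1.2**2, 0.9**2, 0.9**2])                # uniaxial stretch along x
print(integrate_sphere(C))
```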

  6. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management...

  7. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving infers that they can become completely 'hands and feet free'. This is a common misconception, however, one that has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system implicating the need for designers to ensure they are provided with the tools necessary to remain actively in-the-loop despite giving increasing opportunities to delegate their control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
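
    A hedged illustration of the kind of network metric computation the paper applies: the tiny task network below (nodes and links) is invented for demonstration only and is not the authors' Cruise Assist model.

```python
# Illustrative sketch: represent driver/automation tasks as a network and compare
# how central the driver remains under manual vs. partially automated driving.
# Nodes and links are invented for demonstration purposes.
import networkx as nx

def build_network(automated: bool) -> nx.Graph:
    g = nx.Graph()
    g.add_edges_from([
        ("driver", "steering"), ("driver", "speed_control"),
        ("driver", "hazard_monitoring"), ("driver", "navigation"),
    ])
    if automated:
        # Cruise-assist subsystems take over parts of the lateral/longitudinal task,
        # but the driver stays linked through monitoring and mode management.
        g.add_edges_from([
            ("automation", "steering"), ("automation", "speed_control"),
            ("driver", "automation"), ("driver", "mode_management"),
        ])
    return g

for label, auto in (("manual", False), ("level 2", True)):
    g = build_network(auto)
    print(label, round(nx.degree_centrality(g)["driver"], 2))
```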

  8. Integration of renewable generation and elastic loads into distribution grids

    CERN Document Server

    Ardakanian, Omid; Rosenberg, Catherine

    2016-01-01

    This brief examines the challenges of integrating distributed energy resources and high-power elastic loads into low-voltage distribution grids, as well as the potential for pervasive measurement. It explores the control needed to address these challenges and achieve various system-level and user-level objectives. A mathematical framework is presented for the joint control of active end-nodes at scale, and extensive numerical simulations demonstrate that proper control of active end-nodes can significantly enhance reliable and economical operation of the power grid.

  9. Maximized integration of photovoltaics into distribution grids using smart charging strategies for electric vehicles; Maximierte PV-Integration in Niederspannungsnetzen durch intelligente Nutzung von Elektrofahrzeugen

    Energy Technology Data Exchange (ETDEWEB)

    Troeschel, Martin; Scherfke, Stefan; Schuette, Steffen; Appelrath, H Juergen; Sonnenschein, Michael [OFFIS - Institut fuer Informatik, Oldenburg (Germany). Bereich Energie

    2011-07-01

    An ICT-based integration of electric vehicles (EV) offers a promising potential on the distribution grid level, especially regarding synergies with renewable and distributed energy systems. Of special interest are (a) the charging of EV with electric power from renewable energy sources and (b) a preferentially local usage of feed-in from distributed energy systems. Regarding a systemic analysis of electromobility, a stable and reliable operation of the electric power grid is a major constraint for the integration of EV and maximized usage of renewable energy. In the model project GridSurfer, we conducted simulation-based analyses on the integration of EV. In this contribution, we present results from an analysis of future Smart-Grid-scenarios with special regard to rural areas and distribution grids in north-western Germany. (orig.)

  10. MannDB – A microbial database of automated protein sequence analyses and evidence integration for protein characterization

    Directory of Open Access Journals (Sweden)

    Kuczmarski Thomas A

    2006-10-01

    Full Text Available Abstract Background MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. Description MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. Conclusion MannDB comprises a large number of genomes and comprehensive protein

  11. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    Science.gov (United States)

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
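
    To make the idea of executing a formalized indicator as a query concrete, here is a minimal rdflib sketch. The vocabulary (ex:DiabetesPatient, ex:hasHbA1c) and the indicator are hypothetical, and the SPARQL below is not a query generated by the CLIF tool.

```python
# Minimal sketch: a quality indicator ("share of diabetes patients with HbA1c recorded")
# executed as SPARQL over a toy RDF patient graph. Vocabulary is hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/ehr#")
g = Graph()
g.bind("ex", EX)
g.add((EX.p1, RDF.type, EX.DiabetesPatient))
g.add((EX.p1, EX.hasHbA1c, Literal(6.8)))
g.add((EX.p2, RDF.type, EX.DiabetesPatient))        # no HbA1c recorded for p2

numerator = g.query("""
    PREFIX ex: <http://example.org/ehr#>
    SELECT (COUNT(DISTINCT ?p) AS ?n) WHERE {
        ?p a ex:DiabetesPatient ; ex:hasHbA1c ?v .
    }""")
denominator = g.query("""
    PREFIX ex: <http://example.org/ehr#>
    SELECT (COUNT(DISTINCT ?p) AS ?n) WHERE { ?p a ex:DiabetesPatient . }""")

num = int(next(iter(numerator))[0])
den = int(next(iter(denominator))[0])
print(f"indicator score: {num}/{den}")
```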

  12. Changing technology in transportation : automated vehicles in freight.

    Science.gov (United States)

    2017-06-27

    The world of transportation is on the verge of undergoing an impactful transformation. Over the past decade, automotive computing technology has progressed far more rapidly than anticipated. Most major auto manufacturers integrated automated features...

  13. EpHLA: an innovative and user-friendly software automating the HLAMatchmaker algorithm for antibody analysis.

    Science.gov (United States)

    Sousa, Luiz Cláudio Demes da Mata; Filho, Herton Luiz Alves Sales; Von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; Neto, Pedro de Alcântara dos Santos; de Castro, José Adail Fonseca; do Monte, Semíramis Jamil Hadad

    2011-12-01

    The global challenge for solid organ transplantation programs is to distribute organs to the highly sensitized recipients. The purpose of this work is to describe and test the functionality of the EpHLA software, a program that automates the analysis of acceptable and unacceptable HLA epitopes on the basis of the HLAMatchmaker algorithm. HLAMatchmaker considers small configurations of polymorphic residues referred to as eplets as essential components of HLA-epitopes. Currently, the analyses require the creation of temporary files and the manual cut and paste of laboratory tests results between electronic spreadsheets, which is time-consuming and prone to administrative errors. The EpHLA software was developed in Object Pascal programming language and uses the HLAMatchmaker algorithm to generate histocompatibility reports. The automated generation of reports requires the integration of files containing the results of laboratory tests (HLA typing, anti-HLA antibody signature) and public data banks (NMDP, IMGT). The integration and the access to this data were accomplished by means of the framework called eDAFramework. The eDAFramework was developed in Object Pascal and PHP and it provides data access functionalities for software developed in these languages. The tool functionality was successfully tested in comparison to actual, manually derived reports of patients from a renal transplantation program with related donors. We successfully developed software, which enables the automated definition of the epitope specificities of HLA antibodies. This new tool will benefit the management of recipient/donor pairs selection for highly sensitized patients. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. SLAE-CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies.

    Science.gov (United States)

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-06-28

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE-CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to design and implement a CPS-based smart Jidoka system. A set of comprehensive architecture and standardized key technologies should be presented to achieve the above-mentioned goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE-CPS. Then, several standardized key techniques are proposed under this architecture. The first one is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second one is a set of Jidoka scene rules, which is abstracted based on the analysis of the operator, machine, material, quality, and other factors in different time dimensions. These Jidoka rules can support executive FBs in performing different Jidoka functions. Finally, supported by the integrated and standardized approach of our proposed engine, a case study is conducted to verify the current research results. The proposed SLAE-CPS can serve as an important reference value for combining the benefits of innovative technology and proper methodology.
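
    The record does not give the concrete format of the "Jidoka scene rules"; the following sketch merely illustrates an event-condition-action reading of such a rule, with all field names and thresholds invented for the example.

```python
# Hedged sketch of a Jidoka-style rule: detect an abnormality from machine/quality
# signals and trigger a stop ("andon") action. Names and thresholds are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class SceneRule:
    name: str
    condition: Callable[[dict], bool]   # evaluated against the latest scene snapshot
    action: Callable[[dict], None]      # executed when the condition holds

def stop_line(scene: dict) -> None:
    print(f"ANDON: stopping station {scene['station']} ({scene})")

rules = [
    SceneRule("spindle_overload", lambda s: s["spindle_torque"] > 9.0, stop_line),
    SceneRule("defect_streak",    lambda s: s["defects_in_row"] >= 3, stop_line),
]

scene = {"station": "OP-30", "spindle_torque": 9.4, "defects_in_row": 1}
for rule in rules:
    if rule.condition(scene):
        rule.action(scene)
```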

  15. SLAE–CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies

    Science.gov (United States)

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-01-01

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE–CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to design and implement a CPS-based smart Jidoka system. A set of comprehensive architecture and standardized key technologies should be presented to achieve the above-mentioned goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE–CPS. Then, several standardized key techniques are proposed under this architecture. The first one is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second one is a set of Jidoka scene rules, which is abstracted based on the analysis of the operator, machine, material, quality, and other factors in different time dimensions. These Jidoka rules can support executive FBs in performing different Jidoka functions. Finally, supported by the integrated and standardized approach of our proposed engine, a case study is conducted to verify the current research results. The proposed SLAE–CPS can serve as an important reference value for combining the benefits of innovative technology and proper methodology. PMID:28657577

  16. Start up testing for the secure automated fabrication line

    International Nuclear Information System (INIS)

    Gerber, E.W.; Benson, E.M.; Dahl, R.E.

    1987-01-01

    The secure automated fabrication (SAF) line is a remotely operated, liquid metal reactor fuel fabrication process being built by Westinghouse Hanford Company for the Department of Energy. All process and control equipment is installed and start up testing has been initiated. Start up testing is comprised of five phases, each incorporating higher degrees of equipment integration, automation, and remote control. Testing methodology for SAF line start up is described in this report

  17. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
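
    As a rough illustration of the kind of event-driven state program such a translator targets: the actual toolchain described above generates C from the state notation language, so the Python below is only a stand-in, and the states, events and actions are invented.

```python
# Illustrative event-driven state machine, the kind of structure a state transition
# diagram could be translated into. States, events, and actions are invented examples.
TRANSITIONS = {
    ("idle",    "start_cmd"):   ("ramping", lambda: print("action: open valve")),
    ("ramping", "at_setpoint"): ("running", lambda: print("action: close loop")),
    ("running", "fault"):       ("idle",    lambda: print("action: shut down")),
}

class StateProgram:
    def __init__(self, initial="idle"):
        self.state = initial

    def dispatch(self, event):
        """Called by the runtime whenever an external event arrives."""
        key = (self.state, event)
        if key in TRANSITIONS:                 # unknown events are ignored in this state
            next_state, action = TRANSITIONS[key]
            action()
            self.state = next_state

sp = StateProgram()
for ev in ["start_cmd", "at_setpoint", "fault"]:
    sp.dispatch(ev)
print(sp.state)   # back to "idle"
```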

  18. Definition, analysis and development of an optical data distribution network for integrated avionics and control systems. Part 2: Component development and system integration

    Science.gov (United States)

    Yen, H. W.; Morrison, R. J.

    1984-01-01

    Fiber optic transmission is emerging as an attractive concept in data distribution onboard civil aircraft. Development of an optical data distribution network for integrated avionics and control systems for commercial aircraft will provide a data distribution network that gives freedom from EMI-RFI and ground-loop problems, eliminates crosstalk and short circuits, provides protection and immunity from lightning-induced transients, and gives a large-bandwidth data transmission capability. In addition, there is a potential for significantly reducing the weight and increasing the reliability over conventional data distribution networks. Wavelength division multiplexing (WDM) is a candidate method for data communication between the various avionic subsystems. With WDM, all systems could conceptually communicate with each other without time sharing and without requiring complicated coding schemes for each computer and subsystem to recognize a message. However, the state of the art of optical technology limits the application of fiber optics in advanced integrated avionics and control systems. Therefore, it is necessary to address the architecture for a fiber optic data distribution system for integrated avionics and control systems as well as to develop prototype components and systems.

  19. Some classes of multivariate infinitely divisible distributions admitting stochastic integral representations

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Maejima, M.; Sato, K.

    2006-01-01

    The class of distributions on R generated by convolutions of Γ-distributions and the class generated by convolutions of mixtures of exponential distributions are generalized to higher dimensions and denoted by T(Rd) and B(Rd). From the Lévy process {Xt(μ)} on Rd with distribution μ at t=1, Υ(μ) is defined as the distribution of the stochastic integral ∫₀¹ log(1/t) dXt(μ). This mapping is a generalization of the mapping Υ introduced by Barndorff-Nielsen and Thorbjørnsen in one dimension. It is proved that Υ(ID(Rd)) = B(Rd) and Υ(L(Rd)) = T(Rd), where ID(Rd) and L(Rd) are the classes of infinitely divisible distributions and of self-decomposable distributions on Rd, respectively. The relations with the mapping Φ from μ to the distribution at each time of the stationary process of Ornstein-Uhlenbeck type with background driving Lévy process {Xt(μ)} are studied. Developments of these results …

  20. Controller Design Automation for Aeroservoelastic Design Optimization of Wind Turbines

    NARCIS (Netherlands)

    Ashuri, T.; Van Bussel, G.J.W.; Zaayer, M.B.; Van Kuik, G.A.M.

    2010-01-01

    The purpose of this paper is to integrate the controller design of wind turbines with structure and aerodynamic analysis and use the final product in the design optimization process (DOP) of wind turbines. To do that, the controller design is automated and integrated with an aeroelastic simulation

  1. Performance of an Additional Task During Level 2 Automated Driving: An On-Road Study Comparing Drivers With and Without Experience With Partial Automation.

    Science.gov (United States)

    Solís-Marcos, Ignacio; Ahlström, Christer; Kircher, Katja

    2018-05-01

    To investigate the influence of prior experience with Level 2 automation on additional task performance during manual and Level 2 partially automated driving. Level 2 automation is now on the market, but its effects on driver behavior remain unclear. Based on previous studies, we could expect an increase in drivers' engagement in secondary tasks during Level 2 automated driving, but it is yet unknown how drivers will integrate all the ongoing demands in such situations. Twenty-one drivers (12 without, 9 with Level 2 automation experience) drove on a highway manually and with Level 2 automation (exemplified by Volvo Pilot Assist generation 2; PA2) while performing an additional task. In half of the conditions, the task could be interrupted (self-paced), and in the other half, it could not (system-paced). Drivers' visual attention, additional task performance, and other compensatory strategies were analyzed. Driving with PA2 led to decreased scores in the additional task and more visual attention to the dashboard. In the self-paced condition, all drivers looked more to the task and perceived a lower mental demand. The drivers experienced with PA2 used the system and the task more than the novice group and performed more overtakings. The additional task interfered more with Level 2 automation than with manual driving. The drivers, particularly the automation novice drivers, used some compensatory strategies. Automation designers need to consider these potential effects in the development of future automated systems.

  2. Distributed XQuery-Based Integration and Visualization of Multimodality Brain Mapping Data.

    Science.gov (United States)

    Detwiler, Landon T; Suciu, Dan; Franklin, Joshua D; Moore, Eider B; Poliakov, Andrew V; Lee, Eunjung S; Corina, David P; Ojemann, George A; Brinkley, James F

    2009-01-01

    This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too "heavyweight" for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a "lightweight" distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP) accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts.
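
    The "ship subqueries, integrate XML" pattern can be pictured with a toy mediator: each source returns an XML fragment for its part of the query, and the mediator joins the fragments on a shared identifier. The two source functions below merely stand in for remote web services; DXQP's actual query handling, endpoints and schemas are not modelled.

      # Toy mediator joining XML fragments from two stand-in "sources" on a shared id.
      import xml.etree.ElementTree as ET

      def imaging_source(subquery: str) -> str:      # placeholder for one remote service
          return "<sites><site id='L1' x='42' y='18' z='7'/></sites>"

      def language_source(subquery: str) -> str:     # placeholder for a second service
          return "<sites><site id='L1' task='naming' effect='slowing'/></sites>"

      def integrate(responses):
          """Merge per-source XML results on the shared site id, as a mediator would."""
          merged = {}
          for xml_text in responses:
              for site in ET.fromstring(xml_text):
                  merged.setdefault(site.get("id"), {}).update(site.attrib)
          return merged

      print(integrate([imaging_source("//site"), language_source("//site")]))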

  3. Distributed XQuery-based integration and visualization of multimodality brain mapping data

    Directory of Open Access Journals (Sweden)

    Landon T Detwiler

    2009-01-01

    Full Text Available This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too “heavyweight” for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a “lightweight” distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP) accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts.

  4. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00291854; The ATLAS collaboration; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific used resources and physical distributed computing capabilities. Being in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing ...

  5. Active integration of electric vehicles in the distribution network - theory, modelling and practice

    DEFF Research Database (Denmark)

    Knezovic, Katarina

    an attractive asset for the distribution system operator (DSO). This thesis investigates how EVs can mitigate the self-induced adverse effects and actively help the distribution grid operation, either autonomously or in coordination, e.g., with an EV aggregator. The general framework for EV integration...

  6. Remote fabrication of nuclear fuel: a secure automated fabrication overview

    International Nuclear Information System (INIS)

    Nyman, D.H.; Benson, E.M.; Yatabe, J.M.; Nagamoto, T.T.

    1981-01-01

    An automated line for the fabrication of breeder reactor fuel pins is being developed. The line will be installed in the Fuels and Materials Examination Facility (FMEF), presently under construction at the Hanford site near Richland, Washington. The application of automation and remote operations to fuel processing technology is needed to meet the program requirements of reduced personnel exposure, enhanced safeguards, improved product quality, and increased productivity. Commercially available robots are being integrated into operations such as the handling of radioactive material within a process operation. These robots, other automated equipment, and chemical analysis systems under development are described.

  7. Altering user' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  8. Automated Patent Searching in the EPO: From Online Searching to Document Delivery.

    Science.gov (United States)

    Nuyts, Annemie; Jonckheere, Charles

    The European Patent Office (EPO) has recently implemented the last part of its ambitious automation project aimed at creating an automated search environment for approximately 1200 EPO patent search examiners. The examiners now have at their disposal an integrated set of tools offering a full range of functionalities from online searching, via…

  9. Automated Structure Solution with the PHENIX Suite

    Energy Technology Data Exchange (ETDEWEB)

    Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R.; McCoy, A.J.; McKee, Eric; Moriarty, Nigel; Read, Randy J.; Sacchettini, James C.; Sauter, Nicholas K.; Storoni, L.C.; Terwilliger, Thomas C.; Adams, Paul D.

    2008-06-09

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  10. Automated structure solution with the PHENIX suite

    Energy Technology Data Exchange (ETDEWEB)

    Terwilliger, Thomas C [Los Alamos National Laboratory; Zwart, Peter H [LBNL; Afonine, Pavel V [LBNL; Grosse - Kunstleve, Ralf W [LBNL

    2008-01-01

    Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution, and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template- and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.

  11. Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation

    Science.gov (United States)

    Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, Timothy; Parker, Jeffrey; Bradley, David

    2006-01-01

    NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetime and have a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial Message Oriented Middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and increase the level of interaction within the system. For SMEX, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allow the FOT to further reduce the involvement of the console operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this ...

  12. An integrated DEA-COLS-SFA algorithm for optimization and policy making of electricity distribution units

    International Nuclear Information System (INIS)

    Azadeh, A.; Ghaderi, S.F.; Omrani, H.; Eivazy, H.

    2009-01-01

    This paper presents an integrated data envelopment analysis (DEA)-corrected ordinary least squares (COLS)-stochastic frontier analysis (SFA)-principal component analysis (PCA)-numerical taxonomy (NT) algorithm for performance assessment, optimization and policy making of electricity distribution units. Previous studies have generally used input-output DEA models for benchmarking and evaluation of electricity distribution units. However, this study proposes an integrated flexible approach to measure the rank and choose the best version of the DEA method for optimization and policy making purposes. It covers both static and dynamic aspects of the information environment due to the involvement of SFA, which is finally compared with the best DEA model through the Spearman correlation technique. The integrated approach would yield improved ranking and optimization of electricity distribution systems. To illustrate the usability and reliability of the proposed algorithm, 38 electricity distribution units in Iran have been considered, ranked and optimized by the proposed algorithm of this study.
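
    The final comparison step described above, checking whether the SFA ranking agrees with the ranking from the best DEA model, reduces to a Spearman rank correlation between the two sets of efficiency scores. The scores in the sketch are made up for illustration and are not the study's results.

      # Rank agreement between two hypothetical efficiency scorings of six distribution units.
      from scipy.stats import spearmanr

      dea_scores = [0.91, 0.78, 0.84, 0.66, 0.95, 0.72]   # hypothetical DEA efficiencies
      sfa_scores = [0.88, 0.74, 0.80, 0.70, 0.97, 0.69]   # hypothetical SFA efficiencies

      rho, p_value = spearmanr(dea_scores, sfa_scores)
      print(f"Spearman rho = {rho:.3f} (p = {p_value:.3f})")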

  13. JTst - An Automated Unit Testing Tool for Java Program

    OpenAIRE

    Kamal Z.  Zamli; Nor A. M.  Isa

    2008-01-01

    Software testing is an integral part of the software development lifecycle. Lack of testing can often lead to disastrous consequences including loss of data, fortunes, and even lives. Despite its importance, current software testing practice lacks automation and is still primarily based on highly manual processes, from the generation of test cases up to the actual execution of the test. Although helpful automated testing tools are emerging in the market, their adoption is lacking ...

  14. Models of Automation Surprise: Results of a Field Survey in Aviation

    Directory of Open Access Journals (Sweden)

    Robert De Boer

    2017-09-01

    Full Text Available Automation surprises in aviation continue to be a significant safety concern and the community’s search for effective strategies to mitigate them are ongoing. The literature has offered two fundamentally divergent directions, based on different ideas about the nature of cognition and collaboration with automation. In this paper, we report the results of a field study that empirically compared and contrasted two models of automation surprises: a normative individual-cognition model and a sensemaking model based on distributed cognition. Our data prove a good fit for the sense-making model. This finding is relevant for aviation safety, since our understanding of the cognitive processes that govern human interaction with automation drive what we need to do to reduce the frequency of automation-induced events.

  15. Control and Automation Systems at the TSO/DSO interface

    DEFF Research Database (Denmark)

    Silvestro, F.; Pilo, F.; Mauri, G.

    2017-01-01

    (Distribution Network Operators) have to assure secure, reliable and good power quality, without taking into consideration any real-time operation of the active components present in their systems. In order to accomplish their missions, DNOs will have to exploit the support of control and automation systems … and protection systems, but also “external inputs” coming from the transmission networks (operated by the Transmission System Operator) and the forthcoming “smart world” (i.e. smart cities, smart transports, smart industries, smart customers, etc.). The processing of all such inputs will still have … to be subordinated to the possibility for distribution companies to operate their network under their ultimate responsibility (DSO – Distribution System Operators). This paper presents an overview of the activities of the CIGRE C6.25 Working Group (JWG), focusing on the control and automation systems for the future …

  16. Coordinated Demand Response and Distributed Generation Management in Residential Smart Microgrids

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Mokhtari, Ghassem; Guerrero, Josep M.

    2016-01-01

    Nowadays, with the emergence of small-scale integrated energy systems (IESs) in the form of residential smart microgrids (SMGs), a large portion of energy can be saved through coordinated scheduling of smart household devices and management of distributed energy resources (DERs). There are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy, and typical implementation of building-level DERs, by integrating them into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and an integrated communications architecture to efficiently manage energy and comfort at the end-use location. With the aid of such technologies, residential consumers also have the capability to mitigate their energy costs and satisfy their own requirements, paying less attention to the configuration of the energy …

  17. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  18. High-dimensional quantum key distribution based on multicore fiber using silicon photonic integrated circuits

    DEFF Research Database (Denmark)

    Ding, Yunhong; Bacco, Davide; Dalgaard, Kjeld

    2017-01-01

    … is intrinsically limited to 1 bit/photon. Here we propose and experimentally demonstrate, for the first time, a high-dimensional quantum key distribution protocol based on space division multiplexing in multicore fiber using silicon photonic integrated lightwave circuits. We successfully realized three mutually … -dimensional quantum states, and enables breaking the information efficiency limit of traditional quantum key distribution protocols. In addition, the silicon photonic circuits used in our work integrate variable optical attenuators, highly efficient multicore fiber couplers, and Mach-Zehnder interferometers, enabling …

  19. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  20. Virtual model of an automated system for the storage of collected waste

    Directory of Open Access Journals (Sweden)

    Enciu George

    2017-01-01

    Full Text Available One of the problems identified in integrated waste collection systems is the storage space. The design process of an automated system for the storage of collected waste includes finding solutions for the optimal exploitation of the limited storage space, given that the equipment for the loading, identification, transport and transfer of the waste covers most of the available space inside the integrated collection system. In the present paper, a three-dimensional model of an automated storage system designed by the authors for a business partner is presented. The storage system can be used for the following types of waste: plastic and glass containers, aluminium cans, paper, cardboard and WEEE (waste electrical and electronic equipment). Special attention has been given to the transfer subsystem, specific to the storage system, which must be able to transfer waste of different types and shapes. The described virtual model of the automated system for the storage of collected waste will be part of the virtual model of the entire integrated waste collection system, as requested by the beneficiary.

  1. Generative Adversarial Networks Based Heterogeneous Data Integration and Its Application for Intelligent Power Distribution and Utilization

    Directory of Open Access Journals (Sweden)

    Yuanpeng Tan

    2018-01-01

    Full Text Available Heterogeneous characteristics of a big data system for intelligent power distribution and utilization have already become more and more prominent, which brings new challenges for traditional data analysis technologies and restricts the comprehensive management of distribution network assets. In order to solve the problem that heterogeneous data resources of power distribution systems are difficult to utilize effectively, a novel generative adversarial networks (GANs) based heterogeneous data integration method for intelligent power distribution and utilization is proposed. In the proposed method, GAN theory is introduced to expand the distribution of complete data samples. Then, a so-called peak clustering algorithm is proposed to realize a finite open coverage of the expanded sample space and to repair the incomplete samples so as to eliminate the heterogeneous characteristics. Finally, in order to realize the integration of the heterogeneous data for intelligent power distribution and utilization, the well-trained discriminator model of the GAN is employed to check the restored data samples. Simulation experiments verified the validity and stability of the proposed heterogeneous data integration method, which provides a novel perspective for the further data quality management of power distribution systems.

  2. Secure Automated Microgrid Energy System

    Science.gov (United States)

    2016-12-01

    Report EW-201340, Secure Automated Microgrid Energy System, December 2016; cleared for public release (Distribution Statement A). The available record text consists only of front-matter fragments: an acronym list (O&M, PSO, PV, RAID, RBAC, ...) and a reference to elements of the initial study and operational power system model (feeder size, protective devices, generation sources, controllable loads, transformers).

  3. HAUTO: Automated composition of convergent services based in HTN planning

    Directory of Open Access Journals (Sweden)

    Armando Ordoñez

    2014-01-01

    Full Text Available This paper presents HAUTO, a framework able to compose convergent services automatically. HAUTO is based on HTN (hierarchical task network) automated planning and is composed of three modules: a request processing module that transforms natural language and context information into a planning instance, the automated composition module based on HTN planning, and the execution environment for convergent (web and telecom) services. The integration of a planning component provides two basic functionalities: the possibility of customizing the composition of services using the user's context information, and a middleware level that integrates the execution of services in high-performance telecom environments. Finally, a prototype in environmental early warning management is presented as a test case.
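
    HTN planning composes a plan by recursively decomposing compound tasks into primitive ones through methods. The sketch below shows only that decomposition idea with invented task names; HAUTO's planner, service bindings and natural-language front end are not reproduced.

      # Minimal HTN-style decomposition with invented tasks and methods.
      PRIMITIVE = {"geocode_location", "lookup_weather", "send_sms"}

      METHODS = {
          # compound task -> ordered subtasks
          "notify_flood_warning": ["assess_risk", "send_sms"],
          "assess_risk": ["geocode_location", "lookup_weather"],
      }

      def decompose(task):
          """Recursively expand compound tasks into a sequence of primitive service calls."""
          if task in PRIMITIVE:
              return [task]
          plan = []
          for subtask in METHODS[task]:
              plan.extend(decompose(subtask))
          return plan

      print(decompose("notify_flood_warning"))
      # ['geocode_location', 'lookup_weather', 'send_sms']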

  4. Distribution transformer lifetime analysis in the presence of demand response and rooftop PV integration

    Directory of Open Access Journals (Sweden)

    Behi Behnaz

    2017-01-01

    Full Text Available Many distribution transformers have already exceeded half of their expected service life of 35 years in the infrastructure of Western Power, the electric distribution company supplying the southwest of Western Australia. Therefore, it is anticipated that high investment in transformer replacement will be required in the near future. However, high renewable integration and demand response (DR) are promising resources to defer investment in infrastructure upgrades and extend the lifetime of transformers. This paper investigates the impact of rooftop photovoltaic (PV) integration and customer engagement through DR on the lifetime of transformers in electric distribution networks. To this aim, first, time series modelling of load, DR and PV is utilised for each year over a planning period. This load model is applied to a typical distribution transformer for which the hot-spot temperature rise is modelled based on the relevant standard. Using this calculation platform, the loss of life and the actual age of the distribution transformer are obtained. Then, various scenarios including different levels of PV penetration and DR contribution are examined, and their impacts on the age of the transformer are reported. Finally, the equivalent loss of net present value of the distribution transformer is formulated and discussed. This formulation gives major benefits to distribution network planners for analysing the contribution of PV and DR to lifetime extension of the distribution transformer. In addition, the provided model can be utilised in optimal investment analysis to find the best time for the transformer replacement and the associated cost, considering PV penetration and DR. The simulation results show that integration of PV and DR within a feeder can significantly extend the lifetime of transformers.
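
    The loss-of-life calculation sketched below follows the commonly used Arrhenius-type aging acceleration factor referenced to a 110 °C hot spot (as in IEEE C57.91-style loading guides); the abstract only states that "the relevant standard" is used, so the exact relation and parameters may differ, and the hourly hot-spot profile here is invented.

      # Illustrative equivalent-aging calculation for one day of hypothetical hot-spot temperatures.
      import math

      def aging_acceleration(theta_hotspot_c: float) -> float:
          """Aging acceleration factor relative to a 110 degC reference hot spot."""
          return math.exp(15000.0 / 383.0 - 15000.0 / (theta_hotspot_c + 273.0))

      # hypothetical hourly hot-spot temperatures (degC): midday loading offset by rooftop PV,
      # evening peak softened by demand response
      hotspot_profile = [78, 76, 75, 74, 75, 80, 88, 95, 98, 96, 92, 90,
                         89, 91, 94, 99, 105, 112, 118, 115, 108, 98, 88, 82]

      equivalent_aging_h = sum(aging_acceleration(t) for t in hotspot_profile)
      print(f"equivalent aging over the day: {equivalent_aging_h:.1f} h")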

  5. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. Firstly, regions of interest (membrane fragments) are identified in confocal microscopy images. Further, densitometric intensity profiles are extracted orthogonally to membrane fragments, following the direction from the plasma membrane to cytoplasm. Finally, several different quantitative descriptors were derived from the densitometric profiles and were compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for the translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.

  6. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  7. Integrated production-distribution planning optimization models: A review in collaborative networks context

    Directory of Open Access Journals (Sweden)

    Beatriz Andres

    2017-01-01

    Full Text Available Researchers in the area of collaborative networks are increasingly proposing collaborative approaches to address planning processes, due to the advantages obtained when enterprises adopt integrated planning models. Collaborative production-distribution planning among the supply network actors is considered a proper mechanism to help enterprises deal with the uncertainty and dynamics of current markets. Enterprises, and especially SMEs, should be able to cope with continuous market changes by increasing their agility. Carrying out collaborative planning allows enterprises to enhance their readiness and agility for facing market turbulence. However, SMEs have limited access to optimization tools for collaborative planning, reducing their ability to respond to the competition. The problem to solve is to provide SMEs with affordable solutions to support collaborative planning. In this regard, new optimization algorithms are required in order to improve the collaboration among supply network partners. As part of the H2020 Cloud Collaborative Manufacturing Networks (C2NET) research project, this paper presents a study on integrated production and distribution plans. The main objective of the research is to identify gaps in current optimization models proposed to address integrated planning, taking into account the requirements and needs of the industry. Thus, the needs of the companies belonging to the industrial pilots defined in the C2NET project are identified, analysing how these needs are covered by the optimization models proposed in the literature to deal with integrated production-distribution planning.

  8. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun

    2014-01-01

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, as well as a negative effect known as the out-of-the-loop (OOTL) problem. Thus, before introducing automation in the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method is suggested to express the reduced amount of human cognitive load, and the level of ostracism is suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. This approach is expected to derive an appropriate proportion of automation that avoids the OOTL problem while providing maximum efficacy.

  9. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    Science.gov (United States)

    2017-12-01

    Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment, by Justin K. Davis (Lieutenant). The available record text consists only of repeated title-page and report-documentation fragments (title, author and funding-number fields); no abstract is provided.

  10. A New Automated Instrument Calibration Facility at the Savannah River Site

    International Nuclear Information System (INIS)

    Polz, E.; Rushton, R.O.; Wilkie, W.H.; Hancock, R.C.

    1998-01-01

    The Health Physics Instrument Calibration Facility at the Savannah River Site in Aiken, SC was expressly designed and built to calibrate portable radiation survey instruments. The facility incorporates recent advances in automation technology, building layout and construction, and computer software to improve the calibration process. Nine new calibration systems automate instrument calibration and data collection. The building is laid out so that instruments are moved from one area to another in a logical, efficient manner. New software and hardware integrate all functions such as shipping/receiving, work flow, calibration, testing, and report generation. Benefits include a streamlined and integrated program, improved efficiency, reduced errors, and better accuracy

  11. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  12. Using geospatial solutions to meet distribution integrity management requirements

    Energy Technology Data Exchange (ETDEWEB)

    McElroy, Robert A. [New Century Software, Inc., Fort Collins, CO (United States)

    2010-07-01

    In the United States, incidents on gas distribution pipelines kill on average 10 persons per year, in addition to causing 40 serious injuries and millions of dollars of property damage. In order to remedy this situation, the US Department of Transportation/Pipeline and Hazardous Materials Safety Administration enacted new regulations requiring operators to develop distribution integrity management programs (DIMP), which must include: knowledge and identification of threats, evaluation of risk, identification and implementation of measures to address risks, performance measuring, periodic evaluation and improvement, and results reporting. The aim of this paper is to show how geographic information systems (GIS) can help operators meet each requirement of the DIMP regulations. The discussion showed that GIS can help in identifying and quantifying the threats to the distribution system and in assessing the consequences of an incident. Investing in GIS will not only help operators comply with the regulations but will also help them make economically sound, risk-based decisions.

  13. Automated integration of genomic physical mapping data via parallel simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: fluorescence in situ hybridization (FISH) at 3 levels of resolution, ECO restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, AC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects, by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps on the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
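
    Stripped of the FISH constraints and the parallel server/client machinery, the ordering problem has the flavour of the toy below: rearrange clone "columns" so that the members of each object "row" become contiguous, scoring a permutation by counting internal gaps and improving it by simulated annealing. The data are random and the cooling schedule arbitrary; this shows only the shape of the optimization, not the LLNL implementation.

      # Toy column-ordering by simulated annealing: minimize gaps within object rows.
      import math, random

      random.seed(1)
      n_clones, n_objects = 20, 6
      membership = [set(random.sample(range(n_clones), k=5)) for _ in range(n_objects)]

      def gaps(order):
          """Internal gaps summed over all object rows for a given column order."""
          pos = {clone: i for i, clone in enumerate(order)}
          total = 0
          for row in membership:
              idx = sorted(pos[c] for c in row)
              total += sum(1 for a, b in zip(idx, idx[1:]) if b - a > 1)
          return total

      order, temperature = list(range(n_clones)), 2.0
      for _ in range(20000):
          i, j = random.sample(range(n_clones), 2)
          candidate = order[:]
          candidate[i], candidate[j] = candidate[j], candidate[i]
          delta = gaps(candidate) - gaps(order)
          if delta <= 0 or random.random() < math.exp(-delta / temperature):
              order = candidate                      # accept improving or occasionally worsening swaps
          temperature *= 0.9995                      # simple geometric cooling

      print("remaining gaps:", gaps(order))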

  14. Integrating distributed generation: Regulation and trends in three leading countries

    International Nuclear Information System (INIS)

    Anaya, Karim L.; Pollitt, Michael G.

    2015-01-01

    This paper explores the trends in the deployment and integration of distributed generation in Germany, Denmark and Sweden. The study concentrates on the regulation of renewable energy generation with a focus on grid access and connection mechanisms. The high rate of distributed generation penetration is mainly based on the early support that these countries gave to the expansion of renewable energy generation – mainly wind and solar – within their respective national policies. Germany and Denmark are the ones with the most sophisticated support schemes, which have shown a dynamic design over time. In terms of connections, Germany has the most favorable connection regime, which provides not only priority connection but also priority grid access for generation units that produce electricity from renewable energy sources. Sweden guarantees equal treatment among different technologies (i.e. a non-discrimination principle). High connection costs have been observed especially in Germany and Denmark. The costs of network upgrades are usually socialised across demand customers. However, integration issues should be taken into consideration in order to avoid expansion of distributed generation in a way which unnecessarily raises total system costs, via high connection costs. -- Highlights: •Examination of the DG connection arrangements in Denmark, Germany and Sweden. •Sophisticated subsidy schemes for DG contrast with socialization of connection costs. •No evidence of novel business models for connecting DG units smartly

  15. A Gauss-Kronrod-Trapezoidal integration scheme for modeling biological tissues with continuous fiber distributions.

    Science.gov (United States)

    Hou, Chieh; Ateshian, Gerard A

    2016-01-01

    Fibrous biological tissues may be modeled using a continuous fiber distribution (CFD) to capture tension-compression nonlinearity, anisotropic fiber distributions, and load-induced anisotropy. The CFD framework requires spherical integration of weighted individual fiber responses, with fibers contributing to the stress response only when they are in tension. The common method for performing this integration employs the discretization of the unit sphere into a polyhedron with nearly uniform triangular faces (finite element integration or FEI scheme). Although FEI has proven to be more accurate and efficient than integration using spherical coordinates, it presents three major drawbacks: First, the number of elements on the unit sphere needed to achieve satisfactory accuracy becomes a significant computational cost in a finite element (FE) analysis. Second, fibers may not be in tension in some regions on the unit sphere, where the integration becomes a waste. Third, if tensed fiber bundles span a small region compared to the area of the elements on the sphere, a significant discretization error arises. This study presents an integration scheme specialized to the CFD framework, which significantly mitigates the first drawback of the FEI scheme, while eliminating the second and third completely. Here, integration is performed only over the regions of the unit sphere where fibers are in tension. Gauss-Kronrod quadrature is used across latitudes and the trapezoidal scheme across longitudes. Over a wide range of strain states, fiber material properties, and fiber angular distributions, results demonstrate that this new scheme always outperforms FEI, sometimes by orders of magnitude in the number of computational steps and relative accuracy of the stress calculation.
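
    The division of labour described above, a high-order one-dimensional rule across latitudes combined with a trapezoidal rule across longitudes, can be sketched with scipy's quad (an adaptive Gauss-Kronrod rule) for the latitude sweep. The strain state and the simple quadratic fiber law are invented, and unlike the paper's scheme the sketch does not restrict the integration domain analytically; it merely zeroes the contribution of fibers that are not in tension.

      # Gauss-Kronrod in latitude, trapezoid in longitude, tensed fibers only.
      import numpy as np
      from scipy.integrate import quad

      lam = np.array([0.9, 0.95, 1.2])            # hypothetical principal stretches (tension along z)

      def fiber_density(theta, phi):
          n = np.array([np.sin(theta) * np.cos(phi),
                        np.sin(theta) * np.sin(phi),
                        np.cos(theta)])
          e = 0.5 * (np.sum((lam * n) ** 2) - 1.0)             # fiber strain measure
          return (e ** 2) * np.sin(theta) if e > 0.0 else 0.0  # only tensed fibers contribute

      phis = np.linspace(0.0, 2.0 * np.pi, 73)
      lat_integrals = [quad(fiber_density, 0.0, np.pi, args=(phi,), limit=200)[0] for phi in phis]
      print(f"integrated tensile fiber response: {np.trapz(lat_integrals, phis):.4f}")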

  16. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially … -site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed physically-based, integrated hydrological models. A revised version of the generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining …
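
    A toy version of the idea, a GLUE-type informal likelihood driving a Metropolis sampler over a model parameter, is sketched below. The one-parameter runoff "model", the synthetic data and the Nash-Sutcliffe-based likelihood are all invented; the revised GLUE procedure of the study and the underlying distributed hydrological model are not reproduced.

      # GLUE-style likelihood weighting driven by a simple Metropolis sampler (toy example).
      import numpy as np

      rng = np.random.default_rng(0)
      rain = rng.gamma(2.0, 2.0, size=100)

      def model(k):                                   # toy runoff: exponentially filtered rainfall
          q = np.zeros_like(rain)
          for t in range(1, len(rain)):
              q[t] = q[t - 1] * np.exp(-1.0 / k) + rain[t]
          return q

      obs = model(8.0) + rng.normal(0.0, 0.5, size=100)    # synthetic "observations", true k = 8

      def glue_likelihood(k):
          sim = model(k)
          nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
          return max(nse, 1e-6)                       # Nash-Sutcliffe efficiency as informal likelihood

      samples, k = [], 4.0
      for _ in range(5000):
          k_new = k + rng.normal(0.0, 0.5)            # random-walk proposal
          if k_new > 0 and rng.random() < glue_likelihood(k_new) / glue_likelihood(k):
              k = k_new                               # accept by likelihood ratio (Metropolis rule)
          samples.append(k)

      print(f"posterior-like mean of k: {np.mean(samples[1000:]):.2f}")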

  17. Control strategies for power distribution networks with electric vehicles integration

    DEFF Research Database (Denmark)

    Hu, Junjie

    Demand side resources, like electric vehicles (EVs), can become integral parts of smart grids because, instead of just consuming power, they are capable of providing valuable services to power systems. EVs can be used to balance intermittent renewable energy resources such as wind and solar … of electrical energy. A smart grid can also be defined as an electricity network that can intelligently integrate the actions of all users connected to it - generators, consumers and those that do both - in order to efficiently deliver sustainable, economic and secure electricity supplies. This thesis focuses … of the market. To build a complete solution for the integration of EVs into the distribution network, a price-coordinated hierarchical scheduling system is proposed which can well characterize the involved actors in the smart grid. With this system, we demonstrate that it is possible to schedule the charging …

  18. Analog integrated circuit design automation placement, routing and parasitic extraction techniques

    CERN Document Server

    Martins, Ricardo; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for analog layout design automation. After discussing the placement and routing problem in electronic design automation (EDA), the authors overview a variety of automatic layout generation tools, as well as the most recent advances in analog layout-aware circuit sizing. The discussion includes different methods for automatic placement (a template-based Placer and an optimization-based Placer), a fully-automatic Router and an empirical-based Parasitic Extractor. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the quality of their designs, or use them as starting point for a new tool. All the methods described are applied to practical examples for a 130nm design process, as well as placement and routing benchmark sets. Introduces readers to hierarchical combination of Pareto fronts of placements; Presents electromigration-aware routing with multilayer multiport terminal structures...

  19. DISCO - A concept of a system for integrated data base management in distributed data processing systems

    International Nuclear Information System (INIS)

    Holler, E.

    1980-01-01

    The development of data processing technology favors the trend towards distributed data processing systems: the large-scale integration of semiconductor devices has led to very efficient (approx. 10^6 operations per second) and relatively cheap low-end computers being offered today, which allow distributed data processing systems to be installed with a total capacity coming near to that of large-scale data processing plants at a tolerable investment expenditure. The technologies of communication and of data banks, each by itself, have reached a state of development justifying their routine application. This is made evident by the present efforts for standardization in both areas. The integration of both technologies in the development of systems for integrated distributed data bank management, however, is new territory for engineering. (orig.) [de]

  20. The NASA automation and robotics technology program

    Science.gov (United States)

    Holcomb, Lee B.; Montemerlo, Melvin D.

    1986-01-01

    The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.

  1. Attention, spatial integration, and the tail of response time distributions in Stroop task performance

    NARCIS (Netherlands)

    Roelofs, A.P.A.

    2012-01-01

    A few studies have examined selective attention in Stroop task performance through ex-Gaussian analyses of response time (RT) distributions. It has remained unclear whether the tail of the RT distribution in vocal responding reflects spatial integration of relevant and irrelevant attributes, as
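
    For readers unfamiliar with ex-Gaussian analyses of RT distributions, the short sketch below fits an ex-Gaussian to synthetic response times using SciPy's exponnorm distribution (the ex-Gaussian with shape K = tau/sigma); the parameter values are invented, and the snippet only shows how the tail parameter tau is recovered, not the study's actual analysis.

```python
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(11)
mu, sigma, tau = 0.55, 0.06, 0.20                # seconds; invented "Stroop-like" values
# exponnorm is the ex-Gaussian: shape K = tau/sigma, loc = mu, scale = sigma.
rts = exponnorm.rvs(tau / sigma, loc=mu, scale=sigma, size=2000, random_state=rng)

# Fit the ex-Gaussian and recover the tail parameter tau = K * scale.
K_hat, loc_hat, scale_hat = exponnorm.fit(rts)
tau_hat = K_hat * scale_hat
print(f"fitted mu={loc_hat:.3f}s  sigma={scale_hat:.3f}s  tau={tau_hat:.3f}s")
```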

  2. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    Science.gov (United States)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  3. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  4. FULLY AUTOMATED IMAGE ORIENTATION IN THE ABSENCE OF TARGETS

    Directory of Open Access Journals (Sweden)

    C. Stamatopoulos

    2012-07-01

    Full Text Available Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargeted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).

  5. Mockup of an automated material transport system for remote handling

    International Nuclear Information System (INIS)

    Porter, M.L.

    1992-01-01

    An Automated Material Transport System (AMTS) was identified for transport of samples within a Material and Process Control Laboratory (MPCL). The MPCL was designed with a dry sample handling laboratory and a wet chemistry analysis laboratory. Each laboratory contained several processing gloveboxes. The function of the AMTS was to automate the handling of materials, multiple process samples, and bulky items between process stations with a minimum of operator intervention and a minimum of waiting periods and nonproductive activities. This paper discusses the system design features, capabilities and results of initial testing. The overall performance of the AMTS is very good. No major problems or concerns were identified. System commands are simple and logical, making the system user friendly. The operating principle and design of individual components are simple. With the addition of various track modules, the system can be arranged in almost any configuration. The AMTS lends itself very well to integration with other automated systems or products. The AMTS is suited for applications involving light payloads which require multiple sample and material handling, lot tracking, and system integration with other products.

  6. Robotics and automation for oil sands bitumen production and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Lipsett, M.G. [Alberta Univ., Edmonton, AB (Canada). Dept. of Mechanical Engineering

    2008-07-01

    This presentation examined technical challenges and commercial challenges related to robotics and automation processes in the mining and oil sands industries. The oil sands industry faces ongoing cost pressures. Challenges include the depths to which miners must travel, as well as problems related to equipment reliability and safety. Surface mines must operate in all weather conditions with a variety of complex systems. Barriers to new technologies include high capital and operating expenses. It has also proven difficult to integrate new technologies within established mining practices. However, automation has the potential to improve mineral processing, production, and maintenance processes. Step changes can be placed in locations that are hazardous or inaccessible. Automated sizing, material, and ventilation systems can also be implemented, as well as tele-operated equipment. Prototypes currently being developed include advanced systems for cutting; rock bolting; loose rock detection; lump size estimation; unstructured environment sensing; environment modelling; and automatic task execution. Enabling technologies are now being developed for excavation, haulage, material handling systems, mining and reclamation methods, and integrated control and reliability. tabs., figs.

  7. Integration of quantum key distribution and private classical communication through continuous variable

    Science.gov (United States)

    Wang, Tianyi; Gong, Feng; Lu, Anjiang; Zhang, Damin; Zhang, Zhengping

    2017-12-01

    In this paper, we propose a scheme that integrates quantum key distribution and private classical communication via continuous variables. The integrated scheme employs both quadratures of a weak coherent state, with encrypted bits encoded on the signs and Gaussian random numbers encoded on the values of the quadratures. The integration enables quantum and classical data to share the same physical and logical channel. Simulation results based on practical system parameters demonstrate that both classical communication and quantum communication can be implemented over distances of tens of kilometers, thus providing a potential solution for simultaneous transmission of quantum communication and classical communication.
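
    The following toy numeric sketch (my own illustration, not the authors' implementation) shows the encoding idea described above: encrypted classical bits set the signs of the X and P quadratures, while Gaussian random values carry the quantum-key modulation on their magnitudes; the channel here is ideal and lossless.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
classical_bits = rng.integers(0, 2, size=(n, 2))             # two bits per pulse (X sign, P sign)
gaussian_values = np.abs(rng.normal(0, 2.0, size=(n, 2)))    # CV-QKD modulation amplitudes

# Transmitted quadratures: sign carries the classical bit, magnitude carries the Gaussian value.
quadratures = np.where(classical_bits == 1, 1.0, -1.0) * gaussian_values

# Receiver side (ideal, lossless channel for illustration only):
recovered_bits = (quadratures > 0).astype(int)
recovered_gauss = np.abs(quadratures)

assert np.array_equal(recovered_bits, classical_bits)
print(quadratures[:3])
```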

  8. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  9. Automated Classification of Seedlings Using Computer Vision

    DEFF Research Database (Denmark)

    Dyrmann, Mads; Christiansen, Peter

    The objective of this project is to investigate the possibilities of recognizing plant species at multiple growth stages based on RGB images. Plants and leaves are initially segmented from a database through a partly automated procedure providing samples of 2438 plants and 4767 leaves distributed...

  10. Superconducting power distribution structure for integrated circuits

    International Nuclear Information System (INIS)

    Ruby, R.C.

    1991-01-01

    This patent describes a superconducting power distribution structure for an integrated circuit. It comprises a first superconducting capacitor plate; a second superconducting capacitor plate provided with electrical isolation means within the second capacitor plate; dielectric means separating the first capacitor plate from the second capacitor plate; first via means coupled at a first end to the first capacitor plate and extending through the dielectric and the electrical isolation means of the second capacitor plate; first contact means coupled to a second end of the first via means; and second contact means coupled to the second capacitor plate such that the first contact means and the second contact means are accessible from the same side of the second capacitor plate

  11. Quantification of the heterogeneity of prognostic cellular biomarkers in ewing sarcoma using automated image and random survival forest analysis.

    Directory of Open Access Journals (Sweden)

    Claudia Bühnemann

    Full Text Available Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. Through elimination of bias, the evaluation of high-dimensionality biomarker distribution within cell populations of a tumour using random forest analysis in quality

  12. Quantification of the heterogeneity of prognostic cellular biomarkers in ewing sarcoma using automated image and random survival forest analysis.

    Science.gov (United States)

    Bühnemann, Claudia; Li, Simon; Yu, Haiyue; Branford White, Harriet; Schäfer, Karl L; Llombart-Bosch, Antonio; Machado, Isidro; Picci, Piero; Hogendoorn, Pancras C W; Athanasou, Nicholas A; Noble, J Alison; Hassan, A Bassim

    2014-01-01

    Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. Through elimination of bias, the evaluation of high-dimensionality biomarker distribution within cell populations of a tumour using random forest analysis in quality controlled tumour
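
    As a simplified illustration of one feature type mentioned in these two records, the snippet below computes a per-cell cytoplasmic/nuclear intensity ratio on synthetic values and summarises its within-tumour distribution as a feature vector of the kind that would be passed to a random survival forest; the intensities and cut-offs are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n_cells = 200
# Synthetic mean CD99 intensities for the nuclear and cytoplasmic compartments of each cell.
nuclear_intensity = rng.gamma(shape=4.0, scale=20.0, size=n_cells)
cytoplasm_intensity = rng.gamma(shape=3.0, scale=25.0, size=n_cells)

ratio = cytoplasm_intensity / nuclear_intensity

# Summarise the within-tumour distribution as features (deciles plus a low-ratio fraction),
# i.e. the kind of vector that would be handed to a random survival forest.
features = np.quantile(ratio, np.linspace(0.1, 0.9, 9))
low_ratio_fraction = np.mean(ratio < 1.0)
print("decile features:", np.round(features, 2), "low-ratio fraction:", low_ratio_fraction)
```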

  13. Automating with SIMATIC S7-400 inside TIA portal configuring, programming and testing with STEP 7 Professional

    CERN Document Server

    Berger, Hans

    2014-01-01

    This book presents a comprehensive description of the configuration of devices and network for the S7-400 components inside the engineering framework TIA Portal. You learn how to formulate and test a control program with the programming languages LAD, FBD, STL, and SCL. The book is rounded off by configuring the distributed I/O with PROFIBUS DP and PROFINET IO using SIMATIC S7-400 and data exchange via Industrial Ethernet. SIMATIC is the globally established automation system for implementing industrial controllers for machines, production plants and processes. SIMATIC S7-400 is the most powerful automation system within SIMATIC. This process controller is ideal for data-intensive tasks that are especially typical for the process industry. With superb communication capability and integrated interfaces it is optimized for larger tasks such as the coordination of entire systems. Open-loop and closed-loop control tasks are formulated with the STEP 7 Professional V11 engineering software in the field-proven progr...

  14. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

    Full Text Available The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Meanwhile, considering its advantage in reflecting actual conditions, the triangle fuzzy number (TFN) is adopted in DIPPS to represent the machine processing and transportation times. In order to solve this problem and obtain the optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm process. Experimental verification shows that EGA achieves satisfactory results in a very short period of time and demonstrates its powerful performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).
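
    A minimal genetic-algorithm loop over job orderings is sketched below with a toy fitness function; it only illustrates the generic selection/crossover/mutation cycle and does not reproduce the paper's three-class encoding, fuzzy (TFN) processing times or local enhancement strategy.

```python
import random

random.seed(42)
PROC = [5, 3, 8, 2, 7, 4]               # toy processing times for six jobs
N_JOBS = len(PROC)

def fitness(order):
    """Total completion time summed over jobs (lower is better)."""
    t, total = 0, 0
    for job in order:
        t += PROC[job]
        total += t
    return total

def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(N_JOBS), 2))
    hole = p1[a:b]
    rest = [j for j in p2 if j not in hole]
    return rest[:a] + hole + rest[a:]

def mutate(order, rate=0.2):
    """Swap two positions with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(N_JOBS), 2)
        order[i], order[j] = order[j], order[i]
    return order

population = [random.sample(range(N_JOBS), N_JOBS) for _ in range(30)]
for _ in range(100):
    population.sort(key=fitness)
    parents = population[:10]            # simple truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

best = min(population, key=fitness)
print("best order:", best, "fitness:", fitness(best))
```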

  15. The complete two-loop integrated jet thrust distribution in soft-collinear effective theory

    International Nuclear Information System (INIS)

    Manteuffel, Andreas von; Schabinger, Robert M.; Zhu, Hua Xing

    2014-01-01

    In this work, we complete the calculation of the soft part of the two-loop integrated jet thrust distribution in e⁺e⁻ annihilation. This jet mass observable is based on the thrust cone jet algorithm, which involves a veto scale for out-of-jet radiation. The previously uncomputed part of our result depends in a complicated way on the jet cone size, r, and at intermediate stages of the calculation we actually encounter a new class of multiple polylogarithms. We employ an extension of the coproduct calculus to systematically exploit functional relations and represent our results concisely. In contrast to the individual contributions, the sum of all global terms can be expressed in terms of classical polylogarithms. Our explicit two-loop calculation enables us to clarify the small r picture discussed in earlier work. In particular, we show that the resummation of the logarithms of r that appear in the previously uncomputed part of the two-loop integrated jet thrust distribution is inextricably linked to the resummation of the non-global logarithms. Furthermore, we find that the logarithms of r which cannot be absorbed into the non-global logarithms in the way advocated in earlier work have coefficients fixed by the two-loop cusp anomalous dimension. We also show that in many cases one can straightforwardly predict potentially large logarithmic contributions to the integrated jet thrust distribution at L loops by making use of analogous contributions to the simpler integrated hemisphere soft function

  16. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    Science.gov (United States)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and as a realistic test bed for Exploration automation technology research and development.

  17. Network Regulation and Support Schemes - How Policy Interactions Affect the Integration of Distributed Generation

    DEFF Research Database (Denmark)

    Ropenus, Stephanie; Jacobsen, Henrik; Schröder, Sascha Thorsten

    2011-01-01

    This article seeks to investigate the interactions between the policy dimensions of support schemes and network regulation and how they affect distributed generation. Firstly, the incentives of distributed generators and distribution system operators are examined. Frequently there exists a trade-off between the incentives for these two market agents to facilitate the integration of distributed generation. Secondly, the interaction of these policy dimensions is analyzed, including case studies based on five EU Member States. Aspects of operational nature and investments in grid and distributed...

  18. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

    Full Text Available Background: Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting genes' position and structure and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous different software, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still require an important contribution from biologists for supervising and controlling the results at various steps. Results: Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows it, for example, to make key decisions, check intermediate results or refine the dataset). The quality of the results produced by FIGENIX is comparable to those obtained by expert biologists, with a drastic gain in terms of time costs and avoidance of errors due to the human manipulation of data. Conclusion: The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could be easily adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, annotation of regulatory elements and other genomic features of interest.

  19. The environmental control and life support system advanced automation project

    Science.gov (United States)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing its usage by key personnel.

  20. Test Results for the Automated Rendezvous and Capture System

    Science.gov (United States)

    Cruzen, Craig; Dabney, Richard; Lomas, James

    1999-01-01

    The Automated Rendezvous and Capture (AR&C) system was designed and tested at NASA's Marshall Space Flight Center (MSFC) to demonstrate technologies and mission strategies for automated rendezvous and docking of spacecraft in Earth orbit. The system incorporates some of the latest innovations in Global Positioning System space navigation, laser sensor technologies and automated mission sequencing algorithms. The system's initial design and integration was completed in 1998 and has undergone testing at MSFC. This paper describes the major components of the AR&C system and presents results from the official system tests performed in MSFC's Flight Robotics Laboratory with digital simulations and hardware-in-the-loop tests. The results show that the AR&C system can safely and reliably perform automated rendezvous and docking missions in the absence of system failures with 100 percent success. When system failures are included, the system uses its automated collision avoidance maneuver logic to recover in a safe manner. The primary objective of the AR&C project is to prove that by designing a safe and robust automated system, mission operations cost can be reduced by decreasing the personnel required for mission design, preflight planning and training required for crewed rendezvous and docking missions.

  1. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  2. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)
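
    The generic acquisition-plus-database pattern described in these two records can be pictured with the sketch below, which simulates timed voltage-clamp sweeps, reduces each trace on-line and files the summaries in SQLite; the WaveManager and DiscoverySheet(TM) components themselves are not reproduced, and all numbers are synthetic.

```python
import sqlite3
import numpy as np

rng = np.random.default_rng(9)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE traces (sweep INTEGER, mean_pA REAL, std_pA REAL)")

def acquire_trace(n_samples=5000):
    """Stand-in for one voltage-clamp sweep read from the instrument."""
    return rng.normal(loc=-12.0, scale=1.5, size=n_samples)   # current in pA

for sweep in range(10):                  # long-time-series acquisition loop
    trace = acquire_trace()
    db.execute("INSERT INTO traces VALUES (?, ?, ?)",
               (sweep, float(trace.mean()), float(trace.std())))
db.commit()

# Off-line processing step: query the stored per-sweep summaries.
for row in db.execute("SELECT sweep, mean_pA FROM traces ORDER BY sweep"):
    print(row)
```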

  3. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for the automated processing of complex α-spectra of actinides were implemented on an Ehlektronika D3-28 computer connected to an ICA-070 multichannel amplitude pulse analyzer. The developed program makes it possible to calculate peak intensities and the relative isotope content, to perform energy calibration of spectra, to calculate peak centres of gravity and energy resolution, and to perform integral counting in a particular part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within 1-12%. 8 refs.; 4 figs.; 2 tabs
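
    Two of the quantities mentioned above, the peak centre of gravity and the integral count in a region of interest, are illustrated below on a synthetic alpha spectrum; the spectrum and ROI limits are invented, and the snippet is not the described D3-28 program.

```python
import numpy as np

rng = np.random.default_rng(3)
channels = np.arange(1024)
# Synthetic spectrum: one Gaussian alpha peak on a flat background, with Poisson counting noise.
spectrum = rng.poisson(5 + 400 * np.exp(-0.5 * ((channels - 612) / 8.0) ** 2))

roi = slice(580, 650)                            # region of interest around the peak
counts = spectrum[roi]
chan = channels[roi]

integral = counts.sum()                          # integral counting in the ROI
centroid = (chan * counts).sum() / counts.sum()  # peak centre of gravity

print(f"ROI integral: {integral} counts, centroid: channel {centroid:.1f}")
```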

  4. Effects of age condition on the distribution and integrity of inorganic fillers in dental resin composites.

    Science.gov (United States)

    D'Alpino, Paulo Henrique Perlatti; Svizero, Nádia da Rocha; Bim Júnior, Odair; Valduga, Claudete Justina; Graeff, Carlos Frederico de Oliveira; Sauro, Salvatore

    2016-06-01

    The aim of this study is to evaluate the distribution of filler sizes, together with the zeta potential and the integrity of the silane-bonded filler surfaces, in different types of restorative dental composites as a function of the material age condition. Filtek P60 (hybrid composite), Filtek Z250 (small-particle filled composite), Filtek Z350XT (nanofilled composite), and Filtek Silorane (silorane composite) (3M ESPE) were tested at different age conditions (i.e., fresh/new, aged, and expired). Composites were submitted to an accelerated aging protocol (Arrhenius model). Specimens were obtained by first diluting each composite specimen in ethanol and then dispersing it in potassium chloride solution (0.001 mol%). Composite fillers were characterized for their zeta potential, mean particle size, and size distribution via poly-dispersion dynamic light scattering. The integrity of the silane-bonded surface of the fillers was characterized by FTIR. The material age significantly influenced the outcomes: zeta potential, filler characteristics, and silane integrity varied both after aging and after expiration. Silorane presented the broadest filler distribution and lowest zeta potential. Nanofilled and silorane composites exhibited decreased peak intensities in the FTIR analysis, indicating a deficiency in silane integrity after aging or expiry. Regardless of the material condition, the hybrid and the small-particle-filled composites were more stable over time, as no significant alteration in filler size distribution, diameter, or zeta potential occurred. A deficiency in the silane integrity of the nanofilled and silorane composites appears to be affected by the material age condition. The material conditions tested in this study influenced the filler size distribution, the zeta potential, and the integrity of the silane adsorbed on fillers in the nanofilled and silorane composites. Thus, this may result in a decrease of the clinical performance of the aforementioned composites, in

  5. Large-scale integration of renewable and distributed generation of electricity in Spain: Current situation and future needs

    International Nuclear Information System (INIS)

    Cossent, Rafael; Gómez, Tomás; Olmos, Luis

    2011-01-01

    Similar to other European countries, mechanisms for the promotion of electricity generation from renewable energy sources (RESs) and combined heat and power (CHP) production have caused a significant growth in distributed generation (DG) in Spain. Low DG/RES penetration levels do not have a major impact on electricity systems. However, several problems arise as DG shares increase. Smarter distribution grids are deemed necessary to facilitate DG/RES integration. This involves modifying the way distribution networks are currently planned and operated. Furthermore, DG and demand should also adopt a more active role. This paper reviews the current situation of DG/RES in Spain including penetration rates, support payments for DG/RES, level of market integration, economic regulation of Distribution System Operators (DSOs), smart metering implementation, grid operation and planning, and incentives for DSO innovation. This paper identifies several improvements that could be made to the treatment of DG/RES. Key aspects of an efficient DG/RES integration are identified and several regulatory changes specific to the Spanish situation are recommended. - Highlights: ► Substantial DG/RES penetration levels are foreseen for the coming years in Spain. ► Integrating such amount of DG/RES in electricity markets and networks is challenging. ► We review key regulatory aspects that may affect DG/RES integration in Spain. ► Several recommendations aimed at easing DG/RES integration in Spain are provided. ► Market integration and the transition towards smarter grids are deemed key issues.

  6. MOD control center automated information systems security evolution

    Science.gov (United States)

    Owen, Rich

    1991-01-01

    The role of the technology infusion process in future Control Center Automated Information Systems (AIS) is highlighted. The following subject areas are presented in the form of viewgraphs: goals, background, threat, MOD's AISS program, TQM, SDLC integration, payback, future challenges, and bottom line.

  7. Metro-access integrated network based on optical OFDMA with dynamic sub-carrier allocation and power distribution.

    Science.gov (United States)

    Zhang, Chongfu; Zhang, Qiongli; Chen, Chen; Jiang, Ning; Liu, Deming; Qiu, Kun; Liu, Shuang; Wu, Baojian

    2013-01-28

    We propose and demonstrate a novel optical orthogonal frequency-division multiple access (OFDMA)-based metro-access integrated network with dynamic resource allocation. It consists of a single-fiber OFDMA ring and many single-fiber OFDMA trees, which transparently integrate metropolitan area networks with optical access networks. The single-fiber OFDMA ring connects the core network and the central nodes (CNs); the CNs are reconfigurable on demand and use multiple orthogonal sub-carriers to realize parallel data transmission and dynamic resource allocation, and they can also implement flexible power distribution. The remote nodes (RNs) distributed on the user side are connected to the corresponding CN by the single-fiber OFDMA trees. The obtained results indicate that our proposed metro-access integrated network is feasible and that the power distribution is agile.
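
    A toy allocation sketch is given below to make the dynamic resource-allocation idea concrete: sub-carriers are split among central nodes in proportion to an assumed traffic demand, and the power budget is spread over each node's grant; the numbers and the proportional rule are illustrative only, not the scheme evaluated in the paper.

```python
import numpy as np

N_SUBCARRIERS = 64
TOTAL_POWER = 1.0                        # arbitrary normalised power budget
demand = np.array([10.0, 30.0, 20.0])    # hypothetical traffic demand per CN (Mb/s)

# Proportional integer split of the orthogonal sub-carriers among the CNs.
shares = np.floor(N_SUBCARRIERS * demand / demand.sum()).astype(int)
shares[np.argmax(demand)] += N_SUBCARRIERS - shares.sum()   # hand leftovers to the busiest CN

# Power distribution: each CN gets a demand-proportional slice, spread evenly over its sub-carriers.
power_per_subcarrier = TOTAL_POWER * demand / demand.sum() / shares

for cn, (s, p) in enumerate(zip(shares, power_per_subcarrier)):
    print(f"CN{cn}: {s} sub-carriers, {p:.4f} power units each")
```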

  8. A bench-top automated workstation for nucleic acid isolation from clinical sample types.

    Science.gov (United States)

    Thakore, Nitu; Garber, Steve; Bueno, Arial; Qu, Peter; Norville, Ryan; Villanueva, Michael; Chandler, Darrell P; Holmberg, Rebecca; Cooney, Christopher G

    2018-04-18

    Systems that automate extraction of nucleic acid from cells or viruses in complex clinical matrices have tremendous value even in the absence of an integrated downstream detector. We describe our bench-top automated workstation that integrates our previously-reported extraction method - TruTip - with our newly-developed mechanical lysis method. This is the first report of this method for homogenizing viscous and heterogeneous samples and lysing difficult-to-disrupt cells using "MagVor": a rotating magnet that rotates a miniature stir disk amidst glass beads confined inside of a disposable tube. Using this system, we demonstrate automated nucleic acid extraction from methicillin-resistant Staphylococcus aureus (MRSA) in nasopharyngeal aspirate (NPA), influenza A in nasopharyngeal swabs (NPS), human genomic DNA from whole blood, and Mycobacterium tuberculosis in NPA. The automated workstation yields nucleic acid with comparable extraction efficiency to manual protocols, which include commercially-available Qiagen spin column kits, across each of these sample types. This work expands the scope of applications beyond previous reports of TruTip to include difficult-to-disrupt cell types and automates the process, including a method for removal of organics, inside a compact bench-top workstation. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Systematic integration of experimental data and models in systems biology.

    Science.gov (United States)

    Li, Peter; Dada, Joseph O; Jameson, Daniel; Spasic, Irena; Swainston, Neil; Carroll, Kathleen; Dunn, Warwick; Khan, Farid; Malys, Naglis; Messiha, Hanan L; Simeonidis, Evangelos; Weichart, Dieter; Winder, Catherine; Wishart, Jill; Broomhead, David S; Goble, Carole A; Gaskell, Simon J; Kell, Douglas B; Westerhoff, Hans V; Mendes, Pedro; Paton, Norman W

    2010-11-29

    The behaviour of biological systems can be deduced from their mathematical models. However, multiple sources of data in diverse forms are required in the construction of a model in order to define its components and their biochemical reactions, and corresponding parameters. Automating the assembly and use of systems biology models is dependent upon data integration processes involving the interoperation of data and analytical resources. Taverna workflows have been developed for the automated assembly of quantitative parameterised metabolic networks in the Systems Biology Markup Language (SBML). A SBML model is built in a systematic fashion by the workflows which starts with the construction of a qualitative network using data from a MIRIAM-compliant genome-scale model of yeast metabolism. This is followed by parameterisation of the SBML model with experimental data from two repositories, the SABIO-RK enzyme kinetics database and a database of quantitative experimental results. The models are then calibrated and simulated in workflows that call out to COPASIWS, the web service interface to the COPASI software application for analysing biochemical networks. These systems biology workflows were evaluated for their ability to construct a parameterised model of yeast glycolysis. Distributed information about metabolic reactions that have been described to MIRIAM standards enables the automated assembly of quantitative systems biology models of metabolic networks based on user-defined criteria. Such data integration processes can be implemented as Taverna workflows to provide a rapid overview of the components and their relationships within a biochemical system.
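
    The parameterisation step described above can be pictured, very schematically, as merging a qualitative reaction list with kinetics records keyed by enzyme; the sketch below uses plain Python dictionaries with invented enzymes and values rather than Taverna, libSBML or the SABIO-RK service.

```python
# Qualitative network: reactions without kinetic parameters yet.
qualitative_model = [
    {"reaction": "HXK", "substrates": ["glucose", "ATP"], "products": ["G6P", "ADP"]},
    {"reaction": "PGI", "substrates": ["G6P"],            "products": ["F6P"]},
]

# Hypothetical kinetics records (enzyme -> kcat [1/s], Km [mM]), standing in for a kinetics database.
kinetics_db = {"HXK": {"kcat": 180.0, "Km": 0.10},
               "PGI": {"kcat": 450.0, "Km": 0.30}}

def parameterise(model, db):
    """Attach kinetic parameters where available; flag gaps for curation otherwise."""
    for rxn in model:
        rxn["parameters"] = db.get(rxn["reaction"], "MISSING - needs curation")
    return model

for rxn in parameterise(qualitative_model, kinetics_db):
    print(rxn["reaction"], rxn["parameters"])
```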

  10. Coherent one-way quantum key distribution

    Science.gov (United States)

    Stucki, Damien; Fasel, Sylvain; Gisin, Nicolas; Thoma, Yann; Zbinden, Hugo

    2007-05-01

    Quantum Key Distribution (QKD) consists in the exchange of a secret key between two distant points [1]. Even if quantum key distribution systems exist and commercial systems are reaching the market [2], there are still improvements to be made: simplify the construction of the system; increase the secret key rate. To this end, we present a new protocol for QKD tailored to work with weak coherent pulses and at high bit rates [3]. The advantages of this system are that the setup is experimentally simple and it is tolerant to reduced interference visibility and to photon number splitting attacks, thus resulting in a high efficiency in terms of distilled secret bits per qubit. After having successfully tested the feasibility of the system [3], we are currently developing a fully integrated and automated prototype within the SECOQC project [4]. We present the latest results using the prototype. We also discuss the issue of photon detection, which still remains the bottleneck for QKD.

  11. Logistics support economy and efficiency through consolidation and automation

    Science.gov (United States)

    Savage, G. R.; Fontana, C. J.; Custer, J. D.

    1985-01-01

    An integrated logistics support system, which would provide routine access to space and be cost-competitive as an operational space transportation system, was planned and implemented to support the NSTS program launch-on-time goal of 95 percent. A decision was made to centralize the Shuttle logistics functions in a modern facility that would provide office and training space and an efficient warehouse area. In this warehouse, the emphasis is on automation of the storage and retrieval function, while utilizing state-of-the-art warehousing and inventory management technology. This consolidation, together with the automation capabilities being provided, will allow for more effective utilization of personnel and improved responsiveness. In addition, this facility will be the prime support for the fully integrated logistics support of the operations era NSTS and reduce the program's management, procurement, transportation, and supply costs in the operations era.

  12. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    Science.gov (United States)

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI-embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports 2-coloured cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration with other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a

  13. Automation Hooks Architecture Trade Study for Flexible Test Orchestration

    Science.gov (United States)

    Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.

    2010-01-01

    We describe the conclusions of a technology and communities survey supported by concurrent and follow-on proof-of-concept prototyping to evaluate feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble and tear down and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on integration of three recognized technologies that are currently gaining acceptance within the test industry and when combined provide a simple, open and scalable test orchestration architecture that addresses the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented Restful Web Services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
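
    A tiny client sketch of the resource-oriented pattern is shown below; the endpoint URL and the XML element names are invented placeholders, and in the architecture described above the station address would first be obtained through zeroconf (mDNS) discovery before the RESTful request and ATML-style XML parsing.

```python
import xml.etree.ElementTree as ET
import requests

# Hypothetical test-station resource; a real setup would discover this address via zeroconf.
STATION_URL = "http://test-station.local:8080/measurements/voltage"

response = requests.get(STATION_URL, timeout=5)
response.raise_for_status()

# ATML payloads are XML; pull a (hypothetical) scalar reading out of the document.
root = ET.fromstring(response.text)
value = root.findtext(".//Value")
unit = root.findtext(".//Unit")
print(f"measured: {value} {unit}")
```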

  14. Study concerning the power plant control and safety equipment by integrated distributed systems

    International Nuclear Information System (INIS)

    Optea, I.; Oprea, M.; Stanescu, P.

    1995-01-01

    The paper deals with the trends existing in the field of nuclear control and safety equipment and systems, proposing a high-efficiency integrated system. In order to enhance the safety of the plant and reliability of the structure system and components, we present a concept based on the latest computer technology with an open, distributed system, connected by a local area network with high redundancy. A modern conception for the control and safety system is to integrate all the information related to the reactor protection, active engineered safeguard and auxiliary systems parameters, offering a fast flow of information between all the agencies concerned so that situations can be quickly assessed. The integrated distributed control is based on a high performance operating system for realtime applications, flexible enough for transparent networking and modular for demanding configurations. The general design considerations for nuclear reactors instrumentation reliability and testing methods for real-time functions under dynamic regime are presented. Taking into account the fast progress in information technology, we consider the replacement of the old instrumentation of Cernavoda-1 NPP by a modern integrated system as an economical and efficient solution for the next units. (Author) 20 Refs

  15. Factors to Consider When Implementing Automated Software Testing

    Science.gov (United States)

    2016-11-10

    development and integration is a continuous process throughout the acquisition life cycle. Automated Software Testing can improve testing capabilities... requires a lab, conference room, or both, and whether it should be located in-house or at an external facility. 2. Ensure space is adequate to support team

  16. Efficient Reactive Power Compensation Algorithm for Distribution Network

    Directory of Open Access Journals (Sweden)

    J. Jerome

    2017-12-01

    Full Text Available The use of automation and energy-efficient equipment with electronic control would greatly improve industrial production. These new devices are more sensitive to supply voltage deviations, and characteristics of the power system that were previously ignored are now very important. Hence the benefits of distribution automation have been widely acknowledged in recent years. This paper proposes an efficient load flow solution technique, extended to find the optimum location for reactive power compensation and for network reconfiguration, for the planning and day-to-day operation of distribution networks. This is required as a part of the distribution automation system (DAS) for taking various control and operation decisions. The method exploits the radial nature of the network and uses a forward and backward propagation technique to calculate branch currents and node voltages. The proposed method has been tested on several practical distribution networks of various voltage levels, including networks with high R/X ratios.
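
    The forward and backward propagation idea can be illustrated with a compact backward/forward sweep on a small made-up radial feeder, as in the sketch below; branch impedances and loads are invented, and the paper's extensions to compensation placement and reconfiguration are not included.

```python
import numpy as np

V_BASE = 1.0 + 0j                                            # slack/source voltage (p.u.)
Z = np.array([0.02 + 0.04j, 0.03 + 0.05j, 0.02 + 0.03j])     # branch impedances (p.u.)
S = np.array([0.10 + 0.05j, 0.15 + 0.07j, 0.08 + 0.03j])     # loads at nodes 1..3 (p.u.)

V = np.full(4, V_BASE, dtype=complex)                        # node voltages; node 0 is the source

for _ in range(20):                                          # sweep iterations until convergence
    # Backward sweep: load currents, then branch currents accumulated toward the source.
    I_load = np.conj(S / V[1:])
    I_branch = np.array([I_load[i:].sum() for i in range(3)])
    # Forward sweep: update node voltages from the source outward.
    for i in range(3):
        V[i + 1] = V[i] - Z[i] * I_branch[i]

print(np.abs(V))                                             # voltage magnitude profile
```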

  17. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  18. Electronic system for the automation of current measurements produced by ionization chambers

    International Nuclear Information System (INIS)

    Brancaccio, Franco; Dias, Mauro da Silva

    2002-01-01

    Ionization chambers in current-mode operation are usually used in Nuclear Metrology for the determination of radionuclide activity. For this purpose, measurements of very low ionization currents, in the range of 10⁻⁸ to 10⁻¹⁴ A, are required. Usually, electrometers perform the current integration method under the command of signals from an automation system, in order to reduce the measurement uncertainties. In the present work, an automation system, developed for current integration measurements at the Laboratorio de Metrologia Nuclear (LMN) of Instituto de Pesquisas Energeticas e Nucleares (IPEN), is described. This automation system is composed of software (graphic interface and control) and an electronic module connected to a microcomputer by means of a commercial data acquisition card CAD12/32 (LYNX Tecnologia Eletronica Ltda.). Measurements using an electrometer Keithley 616 (Keithley Instruments, Inc.) and an ionization chamber IG12/A20 (20th Century Electronics Ltd.) were performed in order to check the system and validate the project. (author)
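
    A toy calculation in the spirit of the current-integration method is given below: the charge accumulated on an integrating capacitor over a timed window gives the mean ionization current, which a calibration factor converts to activity; the capacitance, voltages and calibration factor are all assumed values, not those of the IG12/A20 setup.

```python
C_INT = 1.0e-9                  # integrating capacitance (F), assumed
V_START, V_STOP = 0.10, 2.10    # integrator voltage at start/stop of the window (V), assumed
T_WINDOW = 100.0                # integration time (s)

charge = C_INT * (V_STOP - V_START)     # Q = C * dV
current = charge / T_WINDOW             # mean ionization current (A)

K_CAL = 2.5e-12                 # hypothetical chamber calibration factor (A per MBq)
activity = current / K_CAL

print(f"I = {current:.3e} A  ->  A = {activity:.2f} MBq")
```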

  19. Automated spectral and timing analysis of AGNs

    Science.gov (United States)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user to automate XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light curves, and their power spectra, and it proves useful for assessing the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal the emission features in the 2-8 keV range.
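
    The energy-resolved power-spectrum step can be sketched as follows on a synthetic event list: filter events by energy band, bin them into a light curve, and take the periodogram; this mirrors the described analysis in outline only and does not use the authors' script or XMM-Newton data.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic event list: photon arrival times (s) and energies (keV).
times = np.sort(rng.uniform(0, 10_000, 50_000))
energies = rng.uniform(0.3, 10.0, times.size)

def psd_for_band(lo, hi, dt=10.0):
    """Bin events in [lo, hi) keV into a light curve and return its periodogram."""
    sel = (energies >= lo) & (energies < hi)
    bins = np.arange(0, times.max() + dt, dt)
    counts, _ = np.histogram(times[sel], bins=bins)
    rate = counts / dt
    power = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
    freqs = np.fft.rfftfreq(rate.size, d=dt)
    return freqs[1:], power[1:]              # drop the zero-frequency term

for band in [(0.3, 2.0), (2.0, 10.0)]:
    f, p = psd_for_band(*band)
    print(f"band {band} keV: {f.size} frequencies, peak power {p.max():.1f}")
```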

  20. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain-specific component libraries. APT ha...