WorldWideScience

Sample records for integrates distributed automated

  1. Smart integration of distribution automation applications

    NARCIS (Netherlands)

    Groot, de R.J.W.; Morren, J.; Slootweg, J.G.

    2012-01-01

    Future electricity demand will significantly increase, while flexibility in supply will decrease, due to an increase in the use of renewable energy sources. The most effective way to prepare distribution grids for this increase in loading and decrease in supply-flexibility is to apply balancing,

  2. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports that a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as an improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  3. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co.'s efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series detailing this effort.

  4. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  5. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different types has been proposed, and the concept of distributed knowledge acquisition, aimed at the computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.
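
    The paper does not publish its algorithm in this abstract, but the core idea it names, inducing a binary decision tree from database rows so that each root-to-leaf path becomes a knowledge-base rule, can be sketched generically. The following is an illustrative ID3-style sketch; all names and the toy data are invented, not taken from the paper.

```python
# Hypothetical sketch: inducing a binary decision tree from database rows.
# Each leaf corresponds to a rule "IF <tests on path> THEN <label>".
from collections import Counter
import math

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

class Node:
    def __init__(self, attr=None, value=None, label=None):
        self.attr, self.value, self.label = attr, value, label
        self.yes, self.no = None, None  # binary branches: attr == value / attr != value

def build_tree(rows, labels, attrs):
    if len(set(labels)) == 1:                      # pure partition -> leaf
        return Node(label=labels[0])
    best = None
    for a in attrs:                                # pick (attribute, value) test
        for v in {r[a] for r in rows}:             # with largest information gain
            yes = [l for r, l in zip(rows, labels) if r[a] == v]
            no = [l for r, l in zip(rows, labels) if r[a] != v]
            if not yes or not no:
                continue
            gain = entropy(labels) - (len(yes) * entropy(yes)
                                      + len(no) * entropy(no)) / len(labels)
            if best is None or gain > best[0]:
                best = (gain, a, v)
    if best is None:                               # no useful split left
        return Node(label=Counter(labels).most_common(1)[0][0])
    _, a, v = best
    node = Node(attr=a, value=v)
    node.yes = build_tree([r for r in rows if r[a] == v],
                          [l for r, l in zip(rows, labels) if r[a] == v], attrs - {a})
    node.no = build_tree([r for r in rows if r[a] != v],
                         [l for r, l in zip(rows, labels) if r[a] != v], attrs)
    return node

def classify(node, row):
    while node.label is None:
        node = node.yes if row[node.attr] == node.value else node.no
    return node.label
```

    The distributed aspect described in the abstract would then amount to running such induction against several data sources and merging the resulting rule sets into one knowledge base.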

  6. On the Automated Synthesis of Enterprise Integration Patterns to Adapt Choreography-based Distributed Systems

    Directory of Open Access Journals (Sweden)

    Marco Autili

    2015-12-01

    The Future Internet is becoming a reality, providing a large-scale computing environment in which a virtually infinite number of available services can be composed to fit users' needs. Modern service-oriented applications will more and more often be built by reusing and assembling distributed services. A key enabler for this vision is the ability to automatically compose and dynamically coordinate software services. Service choreographies are an emergent Service Engineering (SE) approach for composing and coordinating services in a distributed way. When mismatching third-party services are to be composed, obtaining the distributed coordination and adaptation logic required to suitably realize a choreography is a non-trivial and error-prone task, so automatic support is needed. In this direction, this paper leverages previous work on the automatic synthesis of choreography-based systems and describes our preliminary steps towards exploiting Enterprise Integration Patterns to deal with a form of choreography adaptation.

  7. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  8. Systems integration (automation system)

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K; Komori, T; Fukuma, Y; Oikawa, M [Nippon Steel Corp., Tokyo (Japan)]

    1991-09-26

    This paper introduces the automation systems-integration (SI) business started by the company in July 1988, and describes its SI concepts. With CIM (computer-integrated manufacturing) and AMENITY (living environment) as its mainstays, the business covers single-responsibility delivery ranging from consultation on structuring optimal systems for the processing and assembly industries and for intelligent buildings, to system design, installation and after-sales services. Placing the greatest importance on an SI approach that takes the user's position, the business starts from planning and consultation under close coordination. On the conceptual basis of structuring optimal systems using the company's extensive know-how and tools, and of adapting to multi-vendor environments, open networks, and centralized and distributed systems, the business is promoted with accumulated technologies capable of realizing artificial intelligence and neural networks, and is supported by highly valuable past business results. 10 figs., 1 tab.

  9. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems, which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  10. Automated Planning and Scheduling for Planetary Rover Distributed Operations

    Science.gov (United States)

    Backes, Paul G.; Rabideau, Gregg; Tso, Kam S.; Chien, Steve

    1999-01-01

    Automated planning and scheduling, including automated path planning, has been integrated with an Internet-based distributed operations system for planetary rover operations. The resulting prototype system enables faster generation of valid rover command sequences by a distributed planetary rover operations team. The Web Interface for Telescience (WITS) provides Internet-based distributed collaboration, the Automated Scheduling and Planning Environment (ASPEN) provides automated planning and scheduling, and an automated path planner provides path planning. The system was demonstrated on the Rocky 7 research rover at JPL.
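
    The abstract does not detail the path planner, but the core of any such component is a search over traversable terrain. A minimal illustrative sketch (the grid, coordinates, and function name are invented, not JPL's implementation) using breadth-first search over an occupancy grid:

```python
# Hypothetical sketch of an automated path planner: shortest obstacle-free
# route on an occupancy grid via breadth-first search.
from collections import deque

def plan_path(grid, start, goal):
    """Return a shortest 4-connected path from start to goal, or None.
    grid[r][c] == 1 marks an obstacle; cells are (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                 # also serves as the visited set
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:               # reconstruct path by walking back
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                          # goal unreachable
```

    A real rover planner would add terrain costs and kinematic constraints, but the search-and-reconstruct structure is the same.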

  11. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first presents the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology in the development of automation components; we have attempted to adhere to open standards and technology when developing the automation components at the various layers. Finally, it highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)

  12. Partial Automated Alignment and Integration System

    Science.gov (United States)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  13. MDSplus automated build and distribution system

    Energy Technology Data Exchange (ETDEWEB)

    Fredian, T., E-mail: twf@psfc.mit.edu [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States); Stillerman, J. [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States); Manduchi, G. [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy)

    2014-05-15

    Support of the MDSplus data handling system has been enhanced by the addition of an automated build system which performs nightly builds of MDSplus for many computer platforms, producing software packages which can now be downloaded using a web browser or via package repositories suitable for automatic updating. The build system was implemented using an extensible continuous integration server product called Hudson, which schedules software builds on a collection of VMware-based virtual machines. New releases are created based on updates to the MDSplus CVS code repository, and versioning is managed using CVS tags and branches. Currently stable, beta and alpha releases of MDSplus are maintained for eleven different platforms including Windows, MacOSX, RedHat Enterprise Linux, Fedora, Ubuntu and Solaris. For some of these platforms, MDSplus packaging has been broken into functional modules so users can pick and choose which MDSplus features they want to install. An added feature on the latest Linux-based platforms is the use of package dependencies: when installing MDSplus from the package repositories, any additional packages required by MDSplus will be installed automatically, greatly simplifying the installation of MDSplus. This paper describes the MDSplus automated build and distribution system.
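
    The build matrix described above (platforms crossed with release channels, one nightly artifact per combination) can be sketched abstractly. This is an illustrative model only; the platform list, channel names, and naming scheme are placeholders, not the actual MDSplus/Hudson configuration.

```python
# Illustrative sketch of a nightly build matrix: one scheduled job per
# (platform, release channel) pair, each yielding a versioned package name.
PLATFORMS = ["windows", "macosx", "rhel", "fedora", "ubuntu", "solaris"]
CHANNELS = ["stable", "beta", "alpha"]

def nightly_build_matrix(version):
    """Return the package artifacts one nightly run would schedule."""
    return [f"mdsplus-{channel}-{version}.{platform}"
            for platform in PLATFORMS
            for channel in CHANNELS]
```

    A CI server like Hudson/Jenkins realizes exactly this cross product, dispatching each cell to a matching virtual machine and publishing the result to the corresponding package repository.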

  14. MDSplus automated build and distribution system

    International Nuclear Information System (INIS)

    Fredian, T.; Stillerman, J.; Manduchi, G.

    2014-01-01

    Support of the MDSplus data handling system has been enhanced by the addition of an automated build system which performs nightly builds of MDSplus for many computer platforms, producing software packages which can now be downloaded using a web browser or via package repositories suitable for automatic updating. The build system was implemented using an extensible continuous integration server product called Hudson, which schedules software builds on a collection of VMware-based virtual machines. New releases are created based on updates to the MDSplus CVS code repository, and versioning is managed using CVS tags and branches. Currently stable, beta and alpha releases of MDSplus are maintained for eleven different platforms including Windows, MacOSX, RedHat Enterprise Linux, Fedora, Ubuntu and Solaris. For some of these platforms, MDSplus packaging has been broken into functional modules so users can pick and choose which MDSplus features they want to install. An added feature on the latest Linux-based platforms is the use of package dependencies: when installing MDSplus from the package repositories, any additional packages required by MDSplus will be installed automatically, greatly simplifying the installation of MDSplus. This paper describes the MDSplus automated build and distribution system.

  15. Distribution Integration | Grid Modernization | NREL

    Science.gov (United States)

    The goal of NREL's distribution integration research is to tackle the challenges facing the widespread integration of distributed energy resources.

  16. How to Evaluate Integrated Library Automation Systems.

    Science.gov (United States)

    Powell, James R.; Slach, June E.

    1985-01-01

    This paper describes methodology used in compiling a list of candidate integrated library automation systems at a corporate technical library. Priorities for automation, identification of candidate systems, the filtering process, information for suppliers, software and hardware considerations, on-site evaluations, and final system selection are…

  17. Automated Distributed Simulation in Ptolemy II

    DEFF Research Database (Denmark)

    Lázaro Cuadrado, Daniel; Ravn, Anders Peter; Koch, Peter

    2007-01-01

    the ensuing communication and synchronization problems. Very often the designer has to explicitly specify extra information concerning distribution for the framework to make an effort to exploit parallelism. This paper presents Automated Distributed Simulation (ADS), which allows the designer to forget about...

  18. Data Distribution Service for Industrial Automation

    OpenAIRE

    Yang, Jinsong

    2012-01-01

    In industrial automation systems, there is usually a large volume of data which needs to be delivered to the right places at the right time. In addition, the large number of nodes in these systems are usually distributed, which adds complexity because more point-to-point Ethernet connections are needed in the network. Hence, it is necessary to apply data-centric design and reduce the connection complexity. Data Distribution Service for Real-Time Systems (DDS) is a data-centric middl...
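
    The data-centric pattern DDS provides can be illustrated with a minimal topic-based publish/subscribe sketch. This is a toy model only: real DDS adds QoS policies, automatic discovery, and a standardized wire protocol, none of which are modeled here, and the class and topic names are invented.

```python
# Minimal topic-based publish/subscribe bus illustrating data-centric design:
# writers and readers are coupled only through topic names, not addresses.
from collections import defaultdict

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        # Readers declare interest in data by topic, not by producer address.
        self._subs[topic].append(callback)

    def publish(self, topic, sample):
        # Writers publish samples without knowing who (or how many) will read.
        for cb in self._subs[topic]:
            cb(sample)
```

    With such a bus, a temperature sensor publishes once to a topic such as "plant/temperature", and a controller and a logger both receive the sample with no point-to-point wiring between the three nodes.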

  19. Automation, consolidation, and integration in autoimmune diagnostics.

    Science.gov (United States)

    Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola

    2015-08-01

    Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.

  20. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1997-12-31

    The report comprises a summary of the results of the first four years of the research programme EDISON on distribution automation in Finnish utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding is from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer associated automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of fifteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1997. (orig.) 43 refs.

  1. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1996-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  2. EDISON - research programme on electric distribution automation 1993-1997. Final report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1998-08-01

    This report comprises a summary of the results of the five year research programme EDISON on distribution automation in Finnish utilities. The research programme (1993 - 1997) was conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding has been from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme was to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer associated automation functions. In addition, the techniques for demand side management were developed and integrated into the automation scheme. The final aim was to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of nineteen projects are given in this report

  3. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems]

    1997-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  4. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit, SMI++, combining two approaches, finite state machines and rule-based programming, allows the various sub-systems to be described as decentralized deciding entities, reacting in real time to changes in the system, thus providing for the automation of standard procedures and for automatic recovery from error conditions in a hierarchical fashion. In this paper we describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
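
    The hierarchical finite-state-machine-plus-rules idea behind SMI++ can be sketched in a few lines. The class, object names, and rule below are illustrative inventions, not SMI++'s actual state-manager language: each object holds a state, parents observe their children, and a rule fires automatically when a child's state changes.

```python
# Sketch of hierarchical FSM control: state changes propagate upward and
# trigger rules, automating reactions such as error escalation or recovery.
class SMIObject:
    def __init__(self, name):
        self.name, self.state, self.parent = name, "UNKNOWN", None
        self.children, self.rules = [], []   # rules: (condition, action) pairs

    def add_child(self, child):
        child.parent = self
        self.children.append(child)

    def set_state(self, state):
        self.state = state
        if self.parent:                      # notify up the hierarchy
            self.parent.on_child_change()

    def on_child_change(self):
        for condition, action in self.rules:
            if condition(self.children):     # rule fires on matching child states
                action(self)

# Illustrative rule: a sub-system declares itself in ERROR as soon as any
# child is; a recovery action could be attached the same way.
subsystem = SMIObject("DAQ")
device = SMIObject("ReadoutBoard")
subsystem.add_child(device)
subsystem.rules.append((
    lambda kids: any(k.state == "ERROR" for k in kids),
    lambda obj: obj.set_state("ERROR"),
))
```

    Because `set_state` re-notifies the parent, a fault at the bottom of a deep hierarchy escalates level by level without any central coordinator, which is the decentralized behavior the abstract describes.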

  5. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
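
    The blackboard architecture named above can be sketched minimally: several recognition engines post candidate readings for a field, and the blackboard resolves them by accumulated confidence. The engine labels, field name, and scores below are invented for illustration, not taken from the paper.

```python
# Illustrative blackboard: knowledge sources contribute candidate readings
# incrementally; the best-supported value wins.
from collections import defaultdict

class Blackboard:
    def __init__(self):
        self.candidates = defaultdict(list)  # field -> [(value, confidence)]

    def post(self, field, value, confidence):
        self.candidates[field].append((value, confidence))

    def resolve(self, field):
        # Sum confidence over identical readings, keep the best-supported one.
        scores = defaultdict(float)
        for value, conf in self.candidates[field]:
            scores[value] += conf
        return max(scores, key=scores.get)

bb = Blackboard()
bb.post("courtesy_amount", "125.00", 0.6)   # recognition engine A
bb.post("courtesy_amount", "125.00", 0.7)   # engine B agrees
bb.post("courtesy_amount", "126.00", 0.8)   # engine C dissents
```

    Here two moderately confident engines that agree outvote one highly confident dissenter, which is the "incremental and opportunistic" contribution pattern the abstract attributes to the architecture.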

  6. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions which improve the reliability of the system. From this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool provides a suitable solution by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool was the first step in a comprehensive review of storage-area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board, with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Human actions are thus restricted to reporting and following up problems where and when needed. In this work we show SAAB's working principles and features, and we present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
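
    The inference idea, deciding from a history of test outcomes whether to blacklist a storage area, can be sketched as a sliding-window failure-rate check. The window size and threshold below are invented for illustration; SAAB's actual algorithm and parameters are not given in the abstract.

```python
# Hypothetical sketch of history-based blacklisting: look at the most recent
# monitoring-test outcomes and blacklist when failures dominate the window.
def blacklist_decision(history, window=10, max_failure_rate=0.5):
    """history: list of booleans, True = test passed, newest last."""
    recent = history[-window:]
    if not recent:
        return False                      # no data -> take no automatic action
    failure_rate = recent.count(False) / len(recent)
    return failure_rate > max_failure_rate
```

    The same routine run per storage area gives both views the abstract mentions: evaluated over all sites it is a global monitor, and applied to a single site it drives the automatic exclusion.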

  7. Choosing the Right Integrator for Your Building Automation Project.

    Science.gov (United States)

    Podgorski, Will

    2002-01-01

    Examines the prevailing definitions and responsibilities of product, network, and system integrators for building automation systems; offers a novel approach to system integration; and sets realistic expectations for the owner in terms of benefits, outcomes, and overall values. (EV)

  8. Automated radiotherapy treatment plan integrity verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang Deshan; Moore, Kevin L. [Department of Radiation Oncology, School of Medicine, Washington University in Saint Louis, St. Louis, Missouri 63110 (United States)

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
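
    The structure of such a rule-driven check, plan data evaluated against predefined logical rules, with results rendered as an HTML report, can be sketched generically. The real tool runs inside Pinnacle via its scripting elements and PERL; the field names and rules below are invented placeholders, not the clinical rule set.

```python
# Hypothetical sketch of automated plan-integrity QC: run every rule against
# run-time plan data and render a minimal HTML summary for the physicist.
RULES = [
    ("dose grid covers target", lambda p: p["grid_covers_target"]),
    ("fraction dose below limit", lambda p: p["dose_per_fraction_gy"] <= 3.0),
    ("couch removed from CT", lambda p: p["couch_removed"]),
]

def verify_plan(plan):
    """Evaluate every rule; return (all_passed, html_report)."""
    results = [(name, rule(plan)) for name, rule in RULES]
    body = "".join(
        f"<tr><td>{name}</td><td>{'PASS' if ok else 'FAIL'}</td></tr>"
        for name, ok in results)
    return all(ok for _, ok in results), f"<table>{body}</table>"
```

    Automating the logical comparisons this way leaves the human review focused on the judgments that genuinely need a physicist, which is the division of labor the paper reports.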

  9. Automated radiotherapy treatment plan integrity verification

    International Nuclear Information System (INIS)

    Yang Deshan; Moore, Kevin L.

    2012-01-01

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.

  10. Automation and Integration in Semiconductor Manufacturing

    OpenAIRE

    Liao, Da-Yin

    2010-01-01

    Semiconductor automation originated in the prevention and avoidance of fraud in daily fab operations. As semiconductor technology and business continuously advance and grow, manufacturing systems must aggressively evolve to meet the changing technical and business requirements of this industry. Semiconductor manufacturing has long suffered from islands of automation. The problems associated with these systems are limited

  11. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program]

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  12. Automated computation of one-loop integrals in massless theories

    International Nuclear Information System (INIS)

    Hameren, A. van; Vollinga, J.; Weinzierl, S.

    2005-01-01

    We consider one-loop tensor and scalar integrals, which occur in a massless quantum field theory, and we report on the implementation into a numerical program of an algorithm for the automated computation of these one-loop integrals. The number of external legs of the loop integrals is not restricted. All calculations are done within dimensional regularization. (orig.)
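For orientation, the class of integrals treated in the record above can be written in a standard form. The notation below is generic (not taken from the paper): a massless one-loop N-point tensor integral of rank r in dimensional regularization is

```latex
I_N^{\mu_1 \cdots \mu_r} \;=\; \int \frac{d^D k}{i\pi^{D/2}}\,
  \frac{k^{\mu_1} \cdots k^{\mu_r}}{\prod_{j=1}^{N} (k+q_j)^2},
\qquad D = 4 - 2\varepsilon,
```

where the $q_j$ are sums of external momenta, all propagators are massless, and the scalar case is $r = 0$; ultraviolet and infrared divergences appear as poles in $\varepsilon$.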

  13. Chattanooga Electric Power Board Case Study Distribution Automation

    Energy Technology Data Exchange (ETDEWEB)

    Glass, Jim [Chattanooga Electric Power Board (EPB), TN (United States); Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Starke, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ollis, Ben [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

In 2009, the U.S. Department of Energy under the American Recovery and Reinvestment Act (ARRA) awarded a grant to the Chattanooga, Tennessee, Electric Power Board (EPB) as part of the Smart Grid Investment Grant Program. The grant had the objective “to accelerate the transformation of the nation’s electric grid by deploying smart grid technologies.” This funding award enabled EPB to expedite the original smart grid implementation schedule from an estimated 10-12 years to 2.5 years. With this funding, EPB invested heavily in distribution automation technologies, installing over 1,200 automated circuit switches and sensors on 171 circuits. For utilities considering a commitment to distribution automation, there are underlying questions such as: “What is the value?” and “What are the costs?” This case study attempts to answer these questions. The primary benefit of distribution automation is increased reliability, that is, reduced power outage duration and frequency. Power outages directly impact customer economics by interfering with business functions. In the past, this economic driver has been difficult to evaluate effectively. However, as this case study demonstrates, tools and analysis techniques are now available. In this case study, the customer costs associated with power outages before and after the implementation of distribution automation are compared. Two example evaluations are performed to demonstrate the benefits: 1) a savings baseline for customers under normal operations and 2) customer savings for a single severe weather event. Cost calculations for customer power outages are performed using the US Department of Energy (DOE) Interruption Cost Estimate (ICE) calculator. This tool uses standard metrics associated with outages and the customers to calculate cost impact. The analysis shows that EPB customers have seen significant reliability improvements from the implementation of distribution automation. Under

  14. Automated Energy Distribution and Reliability System Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.; Perry, S.

    2007-10-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  15. Automated Energy Distribution and Reliability System (AEDR): Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-07-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  16. Proceedings of the distribution automation seminar. CD-ROM ed.

    International Nuclear Information System (INIS)

    2003-01-01

Electric utilities are being driven to improve the utilization of their distribution system assets while reducing life cycle costs. This seminar provided an opportunity for electric utilities to share their experience and knowledge about the constantly evolving technologies that apply to distribution automation. Customers and their representatives are pressing regulatory commissions to place increased priority on reliability and to push the conventional use of distribution automation into rural areas. Various options are under consideration by managers to incorporate a variety of distributed generation resources. Several papers highlighted technical aspects as they relate to applications to meet the changing needs of utilities. The latest products and technologies in the field were on display. The seminar sessions included: business cases; utility experience and applications; utility experience and projects; and technology and equipment. Eight presentations were indexed separately for inclusion in this database.

  17. Wireless communication technologies in distribution automation

    Energy Technology Data Exchange (ETDEWEB)

    Takala, J. [VTT Energy, Espoo (Finland)

    1996-12-31

The project examines four different wireless communication technologies: GSM short message service, NMT data calls, packet radio network, and Autonet (Actionet) status message service. The targets for communication include: energy measurement, especially in the de-regulated electricity market; secondary sub-station control; and fault indicators. The research concentrates on the usability of the different communication technologies for different purposes. Data about response times, error rates, retry times, communication delays, costs etc. will be collected for each communication technology and comparative results will be obtained. Some field experiments and demonstrations will be made in energy measurement and distribution network remote control. The project is divided into four tasks. Each task is described briefly.

  18. Wireless communication technologies in distribution automation

    Energy Technology Data Exchange (ETDEWEB)

    Takala, J [VTT Energy, Espoo (Finland)

    1997-12-31

The project examines four different wireless communication technologies: GSM short message service, NMT data calls, packet radio network, and Autonet (Actionet) status message service. The targets for communication include: energy measurement, especially in the de-regulated electricity market; secondary sub-station control; and fault indicators. The research concentrates on the usability of the different communication technologies for different purposes. Data about response times, error rates, retry times, communication delays, costs etc. will be collected for each communication technology and comparative results will be obtained. Some field experiments and demonstrations will be made in energy measurement and distribution network remote control. The project is divided into four tasks. Each task is described briefly.

  19. Distribution definition of path integrals

    International Nuclear Information System (INIS)

    Kerler, W.

    1979-01-01

Starting from quantum mechanics, it turns out that a rather general definition of quantum functional integrals can be given which is based on distribution theory. It applies also to curved space and provides clear rules for non-linear transformations. The refinements necessary in the usual definitions of path integrals are pointed out. Since the quantum nature requires special care with time sequences, it is not the classical phase space which occurs in the phase-space form of the path integral. Feynman's configuration-space form only applies to a highly specialized situation, and therefore is not a very advantageous starting point for general investigations. It is shown that the commonly used substitutions of variables do not properly account for quantum effects. The relation to the traditional ordering problem is clarified. The distribution formulation has made it possible to treat constrained systems directly at the quantum level, to complete the path integral formulation of the equivalence theorem, and to define functional integrals also for space translations after the transition to fields. (orig.)
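As background for the distinction drawn in this abstract between the phase-space and configuration-space forms, the standard (formal) phase-space path integral for a transition amplitude is

```latex
\langle q_f, T \mid q_i, 0 \rangle
  \;=\; \int \mathcal{D}q \, \mathcal{D}p \;
    \exp\!\left( i \int_0^T \! dt \, \big[\, p\,\dot q - H(p,q) \,\big] \right),
```

and Feynman's configuration-space form $\int \mathcal{D}q \, e^{iS[q]}$ follows only when the $p$ integration can be carried out in closed form, i.e. for Hamiltonians quadratic in $p$; this is the highly specialized situation the abstract alludes to.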

  20. Wireless communication technologies in distribution automation

    Energy Technology Data Exchange (ETDEWEB)

    Takala, J [VTT Energy, Espoo (Finland)

    1998-08-01

    The project started in mid 1995 and will be finished in 1997. The project examines four different wireless communication technologies: GSM short message service, NMT data calls, packet radio network and Autonet (Actionet) status message service. The targets for communication include: Energy measurement, especially in the de-regulated electricity market, secondary sub-station control and fault indicators. The research has been focused on the usability of different communication technologies for different purposes. Data about response times, reliability, error rates, retry times, communication delays, costs etc. has been collected about each communication technology and comparative results were analysed. Some field experiments and demonstrations will be made in energy measurement and distribution network remote control. The project is divided into four tasks. Each task is described briefly

  1. Integrated Transmission and Distribution Control

    Energy Technology Data Exchange (ETDEWEB)

    Kalsi, Karanjit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fuller, Jason C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lian, Jianming [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Wei [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Marinovici, Laurentiu D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fisher, Andrew R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chassin, Forrest S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hauer, Matthew L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-01-01

Distributed generation, demand response, distributed storage, smart appliances, electric vehicles and renewable energy resources are expected to play a key part in the transformation of the American power system. Control, coordination and compensation of these smart grid assets are inherently interlinked. Advanced control strategies to warrant large-scale penetration of distributed smart grid assets do not currently exist. While many of the smart grid technologies proposed involve assets being deployed at the distribution level, most of the significant benefits accrue at the transmission level. The development of advanced smart grid simulation tools, such as GridLAB-D, has led to a dramatic improvement in the models of smart grid assets available for design and evaluation of smart grid technology. However, one of the main challenges to quantifying the benefits of smart grid assets at the transmission level is the lack of tools and framework for integrating transmission and distribution technologies into a single simulation environment. Furthermore, given the size and complexity of the distribution system, it is crucial to be able to represent the behavior of distributed smart grid assets using reduced-order controllable models and to analyze their impacts on the bulk power system in terms of stability and reliability.

  2. Distribution Integrity Management Plan (DIMP)

    Energy Technology Data Exchange (ETDEWEB)

    Gonzales, Jerome F. [Los Alamos National Laboratory

    2012-05-07

This document is the distribution integrity management plan (Plan) for the Los Alamos National Laboratory (LANL) Natural Gas Distribution System. This Plan meets the requirements of 49 CFR Part 192, Subpart P, Distribution Integrity Management Programs (DIMP) for the LANL Natural Gas Distribution System. This Plan was developed by reviewing records and interviewing LANL personnel. The records consist of the design, construction, operation and maintenance records for the LANL Natural Gas Distribution System. The records system for the LANL Natural Gas Distribution System is limited, so the majority of information is based on the judgment of LANL employees: the maintenance crew, the Corrosion Specialist and the Utilities and Infrastructure (UI) Civil Team Leader. The records used in this report are: Pipeline and Hazardous Materials Safety Administration (PHMSA) 7100.1-1, Report of Main and Service Line Inspection; Natural Gas Leak Survey; Gas Leak Response Report; Gas Leak and Repair Report; and Pipe-to-Soil Recordings. The specific elements of knowledge of the infrastructure used to evaluate each threat and prioritize risks are listed in Sections 6 and 7, Threat Evaluation and Risk Prioritization, respectively. This Plan addresses additional information needed and a method for gaining that data over time through normal activities. The processes used for the initial assessment of Threat Evaluation and Risk Prioritization are the methods found in the Simple, Handy Risk-based Integrity Management Plan (SHRIMP™) software package developed by the American Public Gas Association (APGA) Security and Integrity Foundation (SIF). SHRIMP™ uses an index model developed by the consultants and advisors of the SIF. Threat assessment is performed using questions developed by the Gas Piping Technology Committee (GPTC), as modified and added to by the SHRIMP™ advisors. This Plan is required to be reviewed every 5 years to be continually refined and improved.

  3. Integrated plant automation using programmable logic controllers

    International Nuclear Information System (INIS)

    Qureshi, S.A.

    2002-01-01

In the world of automation, the Programmable Logic Controller (PLC) has become the standard device for control. It now not only replaces the earlier relay logic controls but has also taken over many additional control functions. Initially the PLC was used to replace relay logic, but its ever-increasing range of functions means that it is found in many more, and more complex, applications. As the structure of the PLC is based on the same principles as those employed in computer architecture, it is capable of performing not only relay switching tasks, but also other applications such as counting, calculating, comparing and the processing of analogue signals. Due to the simplicity of entering and modifying the programmed instructions to suit the requirements of the process under control, the PLC is truly a versatile and flexible device that can be employed easily and efficiently in repetitive control tasks that vary in nature and complexity. A photograph of the Siemens S5-95U is included. To illustrate the advantage of using a PLC over a traditional relay logic system, consider a control system with 20 input/output points. This assembly could comprise 60-80 relays, some counters/timers and a great deal of wiring. It would be cumbersome, with a power consumption of 30-40 VA. A considerable time would be required to design, test and commission the assembly, and once it is in full working order any desired modification, even of a minor nature, could require major hardware changes. (author)
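The scan-cycle behaviour a PLC implements (read inputs, solve the logic, write outputs, repeat) can be sketched in a few lines. The function and signal names below are illustrative, not taken from the paper; the logic block shows the classic start/stop seal-in circuit that a PLC typically inherits from relay logic:

```python
def plc_scan(read_inputs, logic, write_outputs, cycles=1):
    """Run a simplified PLC scan cycle: sample inputs, solve the
    user logic against the retained state, then update outputs."""
    state = {}
    for _ in range(cycles):
        inputs = read_inputs()          # snapshot of the input image
        state = logic(inputs, state)    # solve the control program
        write_outputs(state)            # update the output image
    return state

def motor_starter(inputs, state):
    """Seal-in (latching) motor starter: the motor runs if start is
    pressed or it is already running, and stop is not pressed."""
    running = (inputs["start"] or state.get("motor", False)) and not inputs["stop"]
    return {"motor": running}
```

Each scan re-evaluates the whole program against a fresh snapshot of the inputs, which is why behaviour can be modified by editing the program rather than rewiring relays, the advantage the abstract emphasizes.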

  4. Research of the application of the new communication technologies for distribution automation

    Science.gov (United States)

    Zhong, Guoxin; Wang, Hao

    2018-03-01

Communication networks are a key factor in distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies used in distribution automation and analyses their defects. It then gives a detailed analysis of several new communication technologies for distribution automation, both wired and wireless, and offers suggestions for their application.

  5. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
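The end-to-end flow the abstract describes (request accepted, job processed, quality analysis, delivery, archival) amounts to a small state machine. The sketch below is an illustration of that lifecycle under assumed stage names, not code from the AIRSAR system itself:

```python
from enum import Enum

class Stage(Enum):
    SUBMITTED = "submitted"
    PROCESSING = "processing"
    QUALITY_ANALYSIS = "quality_analysis"
    DELIVERED = "delivered"
    ARCHIVED = "archived"

# Allowed transitions in the end-to-end pipeline; a QA failure
# re-queues the job for reprocessing.
TRANSITIONS = {
    Stage.SUBMITTED: {Stage.PROCESSING},
    Stage.PROCESSING: {Stage.QUALITY_ANALYSIS},
    Stage.QUALITY_ANALYSIS: {Stage.DELIVERED, Stage.PROCESSING},
    Stage.DELIVERED: {Stage.ARCHIVED},
    Stage.ARCHIVED: set(),
}

class ProcessingRequest:
    """One data-processing request tracked through the pipeline."""

    def __init__(self, request_id: str):
        self.request_id = request_id
        self.stage = Stage.SUBMITTED
        self.history = [Stage.SUBMITTED]

    def advance(self, next_stage: Stage) -> None:
        """Move to the next stage, rejecting illegal transitions."""
        if next_stage not in TRANSITIONS[self.stage]:
            raise ValueError(f"illegal transition {self.stage} -> {next_stage}")
        self.stage = next_stage
        self.history.append(next_stage)
```

Enforcing the transition table is one way such a system reduces operator error: a request cannot be delivered before quality analysis or archived before delivery.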

  6. Integrated circuit design using design automation

    International Nuclear Information System (INIS)

    Gwyn, C.W.

    1976-09-01

Although the use of computer aids to develop integrated circuits is relatively new at Sandia, the program has been very successful. The results have verified the utility of the in-house CAD design capability. Custom ICs have been developed in much shorter times than available through semiconductor device manufacturers. In addition, security problems were minimized and a saving was realized in circuit cost. The custom CMOS ICs were designed at less than half the cost of designing with conventional techniques. In addition to the computer aided design, the prototype fabrication and testing capability provided by the semiconductor development laboratory and microelectronics computer network allows the circuits to be fabricated and evaluated before the designs are transferred to the commercial semiconductor manufacturers for production. The Sandia design and prototype fabrication facilities provide the capability of complete custom integrated circuit development entirely within the ERDA laboratories.

  7. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... through efforts like USCIS' Business Transformation initiative. The IOE will be implemented by USCIS and... information collection. (2) Title of the Form/Collection: Business Transformation-- Automated Integrated...

  8. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaborating with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integration of their collection of provided BSs and their corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research for supporting CNs. It contributes to the generation of formal machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  9. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  10. Digital coal mine integrated automation system based on Controlnet

    Energy Technology Data Exchange (ETDEWEB)

    Jin-yun Chen; Shen Zhang; Wei-ran Zuo [China University of Mining and Technology, Xuzhou (China). School of Chemical Engineering and Technology

    2007-06-15

    A three-layer model for digital communication in a mine is proposed. Two basic platforms are discussed: a uniform transmission network and a uniform data warehouse. An actual, ControlNet based, transmission network platform suitable for the Jining No.3 coal mine in China is presented. This network is an information superhighway intended to integrate all existing and new automation subsystems. Its standard interface can be used with future subsystems. The network, data structure and management decision-making all employ this uniform hardware and software. This effectively avoids the problems of system and information islands seen in traditional mine-automation systems. The construction of the network provides a stable foundation for digital communication in the Jining No.3 coal mine. 9 refs., 5 figs.

  11. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

Automation processes enter more and more into all areas of life and production. People with disabilities, especially, can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to integrate into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  12. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid the limitations and improve the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) and a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
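The 10-fold cross-validation behind the 98.2% figure partitions the samples into ten folds and holds each out in turn as the test set. A minimal index-level sketch (illustrative, not the authors' code) is:

```python
def k_fold_indices(n_samples, k=10):
    """Partition sample indices into k folds and return a list of
    (train_indices, test_indices) pairs, one per fold."""
    indices = list(range(n_samples))
    # Distribute any remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(indices[start:start + size])
        start += size
    splits = []
    for i in range(k):
        test = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i
                 for idx in fold]
        splits.append((train, test))
    return splits
```

Averaging the model's accuracy over the ten held-out folds gives the reported cross-validated performance; in practice the indices would usually be shuffled first.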

  13. Design and Development of an Integrated Workstation Automation Hub

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Andrew; Ghatikar, Girish; Sartor, Dale; Lanzisera, Steven

    2015-03-30

Miscellaneous Electronic Loads (MELs) account for one third of all electricity consumption in U.S. commercial buildings and drive significant energy use in India. Many MEL-specific plug-load devices are concentrated at workstations in offices. The use of intelligence and integrated controls and communications at the workstation (an Office Automation Hub) offers the opportunity to improve both energy efficiency and occupant comfort, along with services for Smart Grid operations. Software and hardware solutions are available from a wide array of vendors for the different components, but an integrated system with interoperable communications is yet to be developed and deployed. In this study, we propose system- and component-level specifications for the Office Automation Hub, their functions, and a prioritized list for the design of a proof-of-concept system. Leveraging the strength of both the U.S. and India technology sectors, this specification serves as a guide for researchers and industry in both countries to support the development, testing, and evaluation of a prototype product. Further evaluation of such integrated technologies for performance and cost is necessary to identify the potential to reduce energy consumption in MELs and to improve occupant comfort.

  14. Westinghouse integrated cementation facility. Smart process automation minimizing secondary waste

    International Nuclear Information System (INIS)

    Fehrmann, H.; Jacobs, T.; Aign, J.

    2015-01-01

    The Westinghouse Cementation Facility described in this paper is an example for a typical standardized turnkey project in the area of waste management. The facility is able to handle NPP waste such as evaporator concentrates, spent resins and filter cartridges. The facility scope covers all equipment required for a fully integrated system including all required auxiliary equipment for hydraulic, pneumatic and electric control system. The control system is based on actual PLC technology and the process is highly automated. The equipment is designed to be remotely operated, under radiation exposure conditions. 4 cementation facilities have been built for new CPR-1000 nuclear power stations in China

  15. Integration of Real-Time Data Into Building Automation Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mark J. Stunder; Perry Sebastian; Brenda A. Chube; Michael D. Koontz

    2003-04-16

The project goal was to investigate the possibility of using predictive real-time information from the Internet as an input to building management system algorithms. The objectives were to: identify the types of information most valuable to commercial and residential building owners, managers, and system designers; comprehensively investigate and document currently available electronic real-time information suitable for use in building management systems; verify the reliability of the information and recommend accreditation methods for data and providers; assess methodologies to automatically retrieve and utilize the information; characterize equipment required to implement automated integration; demonstrate the feasibility and benefits of using the information in building management systems; and identify evolutionary control strategies.
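As a toy illustration of the idea, a predictive input such as tomorrow's forecast high could feed a pre-cooling rule in a building management algorithm. The function name, thresholds, and offsets below are hypothetical, not from the project:

```python
def precool_setpoint(forecast_high_c: float,
                     occupied_setpoint_c: float = 24.0,
                     threshold_c: float = 30.0,
                     precool_offset_c: float = 2.0) -> float:
    """Return the morning cooling setpoint in degrees C.

    If the real-time weather feed predicts a daily high at or above
    the threshold, lower the setpoint ahead of occupancy to pre-cool
    the building and shift load away from the afternoon peak.
    """
    if forecast_high_c >= threshold_c:
        return occupied_setpoint_c - precool_offset_c
    return occupied_setpoint_c
```

A real deployment would also need the reliability checks and provider accreditation that the project objectives call out, since a bad forecast feed would directly drive equipment.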

  16. Office automation: The administrative window into the integrated DBMS

    Science.gov (United States)

    Brock, G. H.

    1985-01-01

In parallel to the evolution of Management Information Systems from simple data files to complex data bases, the stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three to five year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end-users could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the Manager's Management Information System.

  17. Joint force protection advanced security system (JFPASS) "the future of force protection: integrate and automate"

    Science.gov (United States)

    Lama, Carlos E.; Fagan, Joe E.

    2009-09-01

    The United States Department of Defense (DoD) defines 'force protection' as "preventive measures taken to mitigate hostile actions against DoD personnel (to include family members), resources, facilities, and critical information." Advanced technologies enable significant improvements in automating and distributing situation awareness, optimizing operator time, and improving sustainability, which enhance protection and lower costs. The JFPASS Joint Capability Technology Demonstration (JCTD) demonstrates a force protection environment that combines physical security and Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE) defense through the application of integrated command and control and data fusion. The JFPASS JCTD provides a layered approach to force protection by integrating traditional sensors used in physical security, such as video cameras, battlefield surveillance radars, unmanned and unattended ground sensors. The optimization of human participation and automation of processes is achieved by employment of unmanned ground vehicles, along with remotely operated lethal and less-than-lethal weapon systems. These capabilities are integrated via a tailorable, user-defined common operational picture display through a data fusion engine operating in the background. The combined systems automate the screening of alarms, manage the information displays, and provide assessment and response measures. The data fusion engine links disparate sensors and systems, and applies tailored logic to focus the assessment of events. It enables timely responses by providing the user with automated and semi-automated decision support tools. The JFPASS JCTD uses standard communication/data exchange protocols, which allow the system to incorporate future sensor technologies or communication networks, while maintaining the ability to communicate with legacy or existing systems.
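One way to picture the automated screening of alarms described above is a corroboration rule: escalate a zone only when independent sensor types report within a short time window. The sketch below is a hypothetical illustration of that fusion logic, not the JFPASS engine:

```python
from collections import defaultdict

def screen_alarms(events, window_s=30.0, min_sensors=2):
    """Return the set of zones whose alarms should be escalated.

    events: iterable of (timestamp_s, sensor_type, zone) tuples, e.g.
    from cameras, surveillance radars, and unattended ground sensors.
    A zone escalates when at least `min_sensors` distinct sensor types
    report within a `window_s`-second window; lone hits are screened out.
    """
    by_zone = defaultdict(list)
    for t, sensor, zone in events:
        by_zone[zone].append((t, sensor))
    escalated = set()
    for zone, hits in by_zone.items():
        hits.sort()
        for t0, _ in hits:
            kinds = {s for t, s in hits if t0 <= t <= t0 + window_s}
            if len(kinds) >= min_sensors:
                escalated.add(zone)
                break
    return escalated
```

Requiring corroboration from disparate sensors is one simple way a fusion engine can screen alarms and focus operator attention, at the cost of delaying response to single-sensor detections.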

  18. IEC 61850: integrating substation automation into the power plant control system; IEC 61850: Integration der Schaltanlagenautomatisierung in die Kraftwerksleittechnik

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany)

    2008-07-01

    The new communication standard IEC 61850 was developed in the substation automation domain and released in 2004 as a worldwide standard. It is meanwhile established in many substation automation markets. The paper discusses the implementation of IEC 61850, integrating process control and substation automation into one consistent system in a power plant. (orig.)

  19. Integrating standard operating procedures with spacecraft automation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation has the potential to assist crew members and spacecraft operators in managing spacecraft systems during extended space missions. Automation can...

  20. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. On account of the cost of defense, it is impossible to ensure the security of every device in the DAS. Given this background, a novel quantitative vulnerability assessment model of cyber security for DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed from two levels: terminal device level and control center server level. Then, the attack process is modeled based on game theory and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application process of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS. The results demonstrate the reasonability and effectiveness of the proposed methodology.
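
    The vulnerability adjacency matrix idea can be illustrated with a small sketch (hypothetical, not the paper's actual model): an entry adj[i][j] = 1 means that exploiting vulnerability i lets the attacker attempt vulnerability j, so attack processes correspond to paths through the matrix.

```python
# Illustrative sketch only: a vulnerability adjacency matrix where
# adj[i][j] = 1 means exploiting vulnerability i enables an attempt on j.
def attack_paths(adj, start, target):
    """Enumerate simple attack paths from one vulnerability to another."""
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == target:
            paths.append(path)
            continue
        for nxt, reachable in enumerate(adj[node]):
            if reachable and nxt not in path:
                stack.append((nxt, path + [nxt]))
    return paths

# 0 = terminal-device flaw, 1 = protocol weakness, 2 = control-center server
adj = [
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]
print(attack_paths(adj, 0, 2))  # → [[0, 1, 2]]
```

    A defender can then rank vulnerabilities by how many attack paths toward high-consequence targets pass through them.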

  1. Integration issues of distributed generation in distribution grids

    NARCIS (Netherlands)

    Coster, E.J.; Myrzik, J.M.A.; Kruimer, B.; Kling, W.L.

    2011-01-01

    In today’s distribution grids the number of distributed generation (DG) units is increasing rapidly. Combined heat and power (CHP) plants and wind turbines are most often installed. Integration of these DG units into the distribution grid leads to planning as well as operational challenges. Based on

  2. Enterprise Integration of Management and Automation in a Refinery

    Science.gov (United States)

    Wang, Chengen

    Traditionally, problems in a petroleum refinery were modeled and solved separately, discipline by discipline. The segregated implementations of the various disciplinary technologies created considerable barriers to the pursuit of global optimal performance. It is now recognized that enterprise-wide integration of managerial and automation systems is of fundamental importance for refineries to respond promptly to global market requirements. In this paper, the technical implementations are categorized by discipline into managerial and automation systems. Typical managerial and automation implementations in a refinery are then described to give insight into the heterogeneous data sources these systems manipulate. Finally, an integration approach based on data reconciliation techniques is proposed to link up the heterogeneous data sources.

  3. Multi-purpose logical device with integrated circuit for the automation of mine water disposal

    Energy Technology Data Exchange (ETDEWEB)

    Pop, E.; Pasculescu, M.

    1980-06-01

    After an analysis of waste water disposal as an object of automation, the authors present a BASIC-language programme written to simulate the automated control system on a digital computer. Then a multi-purpose logical device with integrated circuits for the automation of mine water disposal is presented. (In Romanian)

  4. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David [Univ. of California, Berkeley, CA (United States); Culler, David [Univ. of California, Berkeley, CA (United States); Wright, Paul [Univ. of California, Berkeley, CA (United States); Lu, Yan [Siemens Corporate Research Inc., Princeton, NJ (United States); Piette, Mary [Univ. of California, Berkeley, CA (United States)

    2013-03-31

    The goal of the 2.5-year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (openADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load

  5. Becoming Earth Independent: Human-Automation-Robotics Integration Challenges for Future Space Exploration

    Science.gov (United States)

    Marquez, Jessica J.

    2016-01-01

    Future exploration missions will require NASA to integrate more automation and robotics in order to accomplish mission objectives. This presentation will describe on the future challenges facing the human operator (astronaut, ground controllers) as we increase the amount of automation and robotics in spaceflight operations. It will describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. This presentation will outline future human-automation-robotic integration challenges.

  6. Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG

    Science.gov (United States)

    Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu

    2016-12-01

    Path searching based on distribution network topology has proven effective in protection setting software, and a path searching method that accounts for DG sources is equally applicable to the automatic generation and division of planned islands after a fault. This paper applies a path searching algorithm to the automatic division of planned islands after faults: starting from the fault isolation switch and ending at each power source, and, according to the line load traversed by the search path and the important load aggregated along the optimized path, it forms an optimized division scheme of planned islands in which each DG serves as a power source balanced against the local important load. Finally, the COBASE software and the distribution network automation software in use are employed to illustrate the effectiveness of the proposed automatic restoration scheme.
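
    A minimal sketch of the island-division step, under assumed data structures (the paper's actual algorithm and load model are more elaborate): search the feeder topology from the fault isolation switch toward a DG, and accept the island only if the DG capacity covers the load picked up along the way.

```python
from collections import deque

def plan_island(graph, load, start, dg, dg_capacity):
    """BFS from the isolation switch over the feeder topology; return the
    island's node set if the DG can carry its aggregate load, else None."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    total = sum(load[n] for n in seen)
    return seen if dg in seen and total <= dg_capacity else None

# Toy feeder: isolation switch S1 feeds two load nodes, then DG1 (kW loads).
graph = {"S1": ["N1"], "N1": ["N2"], "N2": ["DG1"]}
load = {"S1": 0, "N1": 40, "N2": 30, "DG1": 0}
print(plan_island(graph, load, "S1", "DG1", 100))  # island carries 70 kW
```

    Repeating the search for each DG and comparing aggregated important load yields candidate island schemes to rank.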

  7. Integrating Standard Operating Procedures with Spacecraft Automation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation can be used to greatly reduce the demands on crew member and flight controllers time and attention. Automation can monitor critical resources,...

  8. Development of methods for DSM and distribution automation planning

    International Nuclear Information System (INIS)

    Kaerkkaeinen, S.; Kekkonen, V.; Rissanen, P.

    1998-01-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting, and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network, and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities that are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services, partly based on DSM, are becoming available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  9. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S; Kekkonen, V [VTT Energy, Espoo (Finland); Rissanen, P [Tietosavo Oy (Finland)

    1998-08-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting, and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network, and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities that are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services, partly based on DSM, are becoming available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  10. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  11. On the engineering design for systematic integration of agent-orientation in industrial automation.

    Science.gov (United States)

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing the autonomy and flexibility of complex operations while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines the advantages of function block technology, service orientation, and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  12. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  13. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  14. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  15. Integrating CLIPS applications into heterogeneous distributed systems

    Science.gov (United States)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
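
    The wrapper-agent pattern described above can be sketched in a few lines of Python (the `Agent` class and its methods are invented for illustration; they are not SOCIAL's actual API): an agent hides a local application behind a non-intrusive, message-based interface.

```python
# Hedged sketch of the "wrapper agent" pattern: the wrapped application is
# any callable; peers interact only through queued messages.
class Agent:
    def __init__(self, name, app):
        self.name, self.app, self.inbox = name, app, []

    def send(self, other, request):
        """Non-intrusive message passing: queue a request on another agent."""
        other.inbox.append((self.name, request))

    def step(self):
        """Handle one queued request by delegating to the wrapped app."""
        sender, request = self.inbox.pop(0)
        return sender, self.app(request)

# Wrap a trivial "expert system" (a function) and exchange one message.
clips_like = Agent("diagnoser", lambda fact: f"diagnosis for {fact}")
monitor = Agent("monitor", lambda fact: fact)
monitor.send(clips_like, "valve-3 stuck")
print(clips_like.step())  # → ('monitor', 'diagnosis for valve-3 stuck')
```

    Because peers see only messages, the wrapped application can be a CLIPS expert system, a database, or a conventional program without the others changing.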

  16. Design automation for integrated nonlinear logic circuits (Conference Presentation)

    Science.gov (United States)

    Van Vaerenbergh, Thomas; Pelc, Jason; Santori, Charles; Bose, Ranojoy; Kielpinski, Dave; Beausoleil, Raymond G.

    2016-05-01

    A key enabler of the IT revolution of the late 20th century was the development of electronic design automation (EDA) tools allowing engineers to manage the complexity of electronic circuits with transistor counts now reaching into the billions. Recently, we have been developing large-scale nonlinear photonic integrated logic circuits for next-generation all-optical information processing. At this time a sufficiently powerful EDA-style software tool chain to design this type of complex circuit does not yet exist. Here we describe a hierarchical approach to automating the design and validation of photonic integrated circuits, which can scale to several orders of magnitude higher complexity than the state of the art. Most photonic integrated circuits developed today consist of a small number of components and only limited hierarchy. For example, a simple photonic transceiver may contain on the order of 10 building-block components, consisting of grating couplers for photonic I/O, modulators, and signal splitters/combiners. Because this is relatively easy to lay out by hand (or by a simple script), existing photonic design tools have relatively little automation in comparison to electronics tools. But demonstrating all-optical logic will require significantly more complex photonic circuits containing up to 1,000 components, hence becoming infeasible to design manually. Our design framework is based on Python-based software from Luceda Photonics, which provides an environment to describe components, simulate their behavior, and export design files (GDS) to foundries for fabrication. At a fundamental level, a photonic component is described as a parametric cell (PCell), similarly to electronics design. PCells are described by geometric characteristics of their layout. A critical part of the design framework is the implementation of PCells as Python objects. PCell objects can then use inheritance to simplify design, and hierarchical designs can be made by creating composite
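
    The PCell hierarchy described above can be sketched in plain Python (class and parameter names here are invented for illustration and are not the actual Luceda API): base cells carry default parameters, subclasses specialize them, and composite cells build hierarchy from child instances.

```python
# Illustrative-only PCell sketch: defaults plus overrides, with hierarchy
# expressed as child PCell instances.
class PCell:
    def __init__(self, **params):
        self.params = {**self.defaults(), **params}

    def defaults(self):
        return {}

class Waveguide(PCell):
    def defaults(self):
        return {"width_um": 0.45, "length_um": 10.0}

class Splitter(PCell):
    """Composite cell: children() returns its sub-cells."""
    def defaults(self):
        return {"arms": 2}

    def children(self):
        return [Waveguide(length_um=5.0) for _ in range(self.params["arms"])]

s = Splitter()
print([c.params["length_um"] for c in s.children()])  # → [5.0, 5.0]
```

    A layout generator can then recurse through `children()` to emit geometry, which is what makes thousand-component designs tractable.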

  17. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. 
The implications of e-automated processes can extend

  18. An Integrated Simulation Module for Cyber-Physical Automation Systems

    Directory of Open Access Journals (Sweden)

    Francesco Ferracuti

    2016-05-01

    Full Text Available The integration of Wireless Sensors Networks (WSNs into Cyber Physical Systems (CPSs is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA, a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called “GILOO” (Graphical Integration of Labview and cOOja. It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA, etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new “Advanced Sky GUI” have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. The GILOO module has been

  19. An Integrated Simulation Module for Cyber-Physical Automation Systems.

    Science.gov (United States)

    Ferracuti, Francesco; Freddi, Alessandro; Monteriù, Andrea; Prist, Mariorosario

    2016-05-05

    The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performances, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called "GILOO" (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new "Advanced Sky GUI" have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performances of the proposed GILOO software module, several experimental tests have been made, and interesting preliminary results are reported. 
The GILOO module has been applied to a smart home

  20. CMS Distributed Computing Integration in the LHC sustained operations era

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Bockelman, B; Fisk, I

    2011-01-01

    After many years of preparation the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is this same need for stability and smooth operations that requires the introduction of features that were considered non-strategic in the previous phases. Examples are: adequate authorization to control and prioritize access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks in the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular, we describe the introduction of new middleware features during the last 18 months, as well as the requirements placed on Grid and Cloud software developers for the future.

  1. Towards data integration automation for the French rare disease registry.

    Science.gov (United States)

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors to the existing systems to retrieve data automatically. Given the data heterogeneity and the large number of source systems, automation of connector creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration process. The generated mappings are formalized as exploitable mapping expressions. Following this methodology, a process has been tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach used has been enhanced and more valid mappings have been detected. Nonetheless, further improvements could be made to deal with the semantic issue and to process other data types.

  2. Correlation of the UV-induced mutational spectra and the DNA damage distribution of the human HPRT gene: Automating the analysis

    International Nuclear Information System (INIS)

    Kotturi, G.; Erfle, H.; Koop, B.F.; Boer, J.G. de; Glickman, B.W.

    1994-01-01

    Automated DNA sequencers can be readily adapted for various types of sequence-based nucleic acid analysis; more recently, the distribution of UV photoproducts in the E. coli lacI gene was determined using techniques developed for automated fluorescence-based analysis. We have been working to improve this automated approach to damage-distribution analysis. Our current method is more rigorous: new software integrates the area under each individual peak, rather than measuring the height of the curve, and an internal standard is now employed. The analysis can also be partially automated. Detection limits for both major types of UV photoproducts (cyclobutane dimers and pyrimidine (6-4) pyrimidone photoproducts) are reported. The UV-induced damage distribution in the hprt gene is compared to the mutational spectra in human and rodent cells.
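
    The area-versus-height point can be illustrated with a toy trace (numbers are invented): each peak is quantified by the trapezoidal area under its portion of the signal and normalized against the internal-standard peak.

```python
# Illustrative sketch: quantify peaks by area, not by peak maximum.
def peak_area(trace, lo, hi):
    """Trapezoidal area of one peak over sample indices [lo, hi]."""
    return sum((trace[i] + trace[i + 1]) / 2 for i in range(lo, hi))

trace = [0, 2, 6, 2, 0, 1, 3, 1, 0]     # two peaks in a fluorescence lane
standard = peak_area(trace, 0, 4)        # internal-standard peak
sample = peak_area(trace, 4, 8)          # damage-site peak
print(round(sample / standard, 3))       # normalized damage signal, 0.5
```

    Normalizing by the internal standard cancels lane-to-lane loading and detection variation, which peak height alone cannot do.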

  3. Autonomous Integrated Receive System (AIRS) requirements definition. Volume 4: Functional specification for the prototype Automated Integrated Receive System (AIRS)

    Science.gov (United States)

    Chie, C. M.

    1984-01-01

    The functional requirements for the performance, design, and testing for the prototype Automated Integrated Receive System (AIRS) to be demonstrated for the TDRSS S-Band Single Access Return Link are presented.

  4. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis, Caroline L. Hanson, Major, USAF. Report AFIT/GCA/LSQ/89S-5, Ohio.

  5. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
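
    As a toy illustration of the selection step (data invented; a real assembler would solve a mixed integer program with many content and formatting constraints), the objective is to pick a fixed-length set of items from the bank that maximizes total information:

```python
from itertools import combinations

def assemble(bank, test_length):
    """Exhaustively pick the test_length items with maximum total
    information; a stand-in for the MIP solve used at realistic scale."""
    best = max(combinations(range(len(bank)), test_length),
               key=lambda ids: sum(bank[i] for i in ids))
    return sorted(best)

info = [0.8, 1.2, 0.5, 1.0, 0.9]   # per-item information at a target ability
print(assemble(info, 3))  # → [1, 3, 4]
```

    Automated test-form generation then adds ordering and layout decision variables to the same optimization model.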

  6. Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.

    Science.gov (United States)

    Allan, Ferne C.; Shields, Joyce M.

    1986-01-01

    A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment available in-house, in preparation for an expected integrated library system. Automated processes include an authors' file of items authored by employees, journal routing (including routing slips), statistics, journal…

  7. Moving NASA Beyond Low Earth Orbit: Future Human-Automation-Robotic Integration Challenges

    Science.gov (United States)

    Marquez, Jessica

    2016-01-01

    This presentation will provide an overview of current human spaceflight operations. It will also describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. Additionally, there are many implications regarding advanced automation and robotics, and this presentation will outline future human-automation-robotic integration challenges.

  8. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporating these vehicles into the material handling system of a computer-integrated manufacturing environment would improve robustness, efficiency, flexibility, and manoeuvrability.

  9. 76 FR 63941 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-10-14

    ... Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New Information... Internet by federal agencies through efforts like USCIS' Business Transformation initiative. The USCIS ELIS... the USCIS Business Transformation initiative and wizard technology. The supporting statement can be...

  10. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed-signal (digital/analog/RF...

  11. Research and Application of Construction of Operation Integration for Smart Power Distribution and Consumption Based on “Integration of Marketing with Distribution”

    Directory of Open Access Journals (Sweden)

    Zhenbao Feng

    2014-05-01

    Full Text Available The “information integrated platform of the marketing and distribution integration system” researched and developed in this article is an advanced application platform for concurrently designing and operating marketing and power-distribution automation, built on the integration and analysis of existing data in the data platform of the Jiaozuo Power Supply Corporation. It uses data mining and data-bus technology for uniform analysis of comprehensive marketing and distribution data, and conducts real-time monitoring of power-utilization information for marketing and of early-warning maintenance business for power distribution according to the electric business model. This realizes the integration of marketing and distribution business, achieves the target of integrated operation of marketing and distribution, improves the operational level of the business, reduces maintenance costs of the distribution grid, increases electricity sales of the distribution grid, and provides a reliable practical basis for the operation and maintenance of Jiaozuo power marketing and distribution.

  12. Integrating photovoltaics into utility distribution systems

    International Nuclear Information System (INIS)

    Zaininger, H.W.; Barnes, P.R.

    1995-01-01

    Electric utility distribution system impacts associated with the integration of distributed photovoltaic (PV) energy sources vary from site to site and utility to utility. The objective of this paper is to examine several utility- and site-specific conditions which may affect the economic viability of distributed PV applications to utility systems. An assessment methodology compatible with the technical and economic assessment techniques used by utility engineers and planners is employed to determine PV benefits for seven different utility systems. The seven case studies are performed using utility system characteristics and assumptions obtained from appropriate utility personnel. The resulting site-specific distributed PV benefits increase the nonsite-specific generation system benefits available to central-station PV plants by as much as 46% for one utility located in the Southwest

  13. An Accelerated Testing Approach for Automated Vehicles with Background Traffic Described by Joint Distributions

    OpenAIRE

    Huang, Zhiyuan; Lam, Henry; Zhao, Ding

    2017-01-01

    This paper proposes a new framework based on joint statistical models for evaluating risks of automated vehicles in a naturalistic driving environment. The previous studies on the Accelerated Evaluation for automated vehicles are extended from multi-independent-variate models to joint statistics. The proposed toolkit includes exploration of the rare event (e.g. crash) sets and construction of accelerated distributions for Gaussian Mixture models using Importance Sampling techniques. Furthermo...

  14. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovative Research (SBIR) program is explained.

  15. An automated and integrated framework for dust storm detection based on ogc web processing services

    Science.gov (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, dust storm detecting and tracking component, and service chain orchestration engine. The EO data processing component is implemented based on OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models, which are SBDART model (for computing aerosol optical depth (AOT) of dust particles), WRF model (for simulating meteorological parameters) and HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on Business Process Execution Language for Web Service (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. 
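The service-chain orchestration described in this record (EO data retrieval, detection/tracking models, and an engine that runs them in sequence) can be sketched abstractly as a pipeline of steps sharing a context. The step names and data below are placeholders, not the actual WPS/BPEL interfaces:

```python
# Minimal service-chain sketch: each "service" is a function that takes the
# shared context dict and returns updates; the engine runs them in order.
def retrieve_eo_data(ctx):
    # stand-in for an OPeNDAP retrieval step (granule names are invented)
    return {"granules": ["MODIS_20120426", "MODIS_20120427"]}

def detect_dust(ctx):
    # stand-in for AOT-based detection over the retrieved granules
    return {"detections": [g for g in ctx["granules"]]}

def track_transport(ctx):
    # stand-in for a trajectory step over the detections
    return {"tracks": len(ctx["detections"])}

def run_chain(steps):
    ctx = {}
    for step in steps:
        ctx.update(step(ctx))
    return ctx

result = run_chain([retrieve_eo_data, detect_dust, track_transport])
```

The value of the pattern is that each model (SBDART, WRF, HYSPLIT in the record) can be wrapped as one step and re-ordered or replaced without touching the others.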

  16. System Integration of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Nyeng, Preben

    It is therefore investigated in this project how ancillary services can be provided by alternatives to central power stations, and to what extent these can be integrated in the system by means of market-based methods. Particular emphasis is put on automatic solutions, which are particularly relevant for small units, including the ICT solutions that can facilitate the integration. Specifically, the international standard "IEC 61850-7-420 Communications systems for Distributed Energy Resources" is considered as a possible brick in the solution. This standard has undergone continuous development, and this project has actively contributed to its further development and improvements. Different types of integration methods are investigated in the project. Some are based on local measurement and control, e.g. by measuring the grid frequency, whereas others are based on direct remote control or market…

  17. The Automated Aircraft Rework System (AARS): A system integration approach

    Science.gov (United States)

    Benoit, Michael J.

    1994-01-01

    The Mercer Engineering Research Center (MERC), under contract to the United States Air Force (USAF) since 1989, has been actively involved in providing the Warner Robins Air Logistics Center (WR-ALC) with a robotic workcell designed to perform rework automated defastening and hole location/transfer operations on F-15 wings. This paper describes the activities required to develop and implement this workcell, known as the Automated Aircraft Rework System (AARS). AARS is scheduled to be completely installed and in operation at WR-ALC by September 1994.

  18. Ring interconnection for distributed memory automation and computing system

    Energy Technology Data Exchange (ETDEWEB)

    Vinogradov, V I [Inst. for Nuclear Research of the Russian Academy of Sciences, Moscow (Russian Federation)

    1996-12-31

    Problems in the development of measurement, acquisition and control systems based on distributed memory and a ring interface are discussed. It has been found that a RAM-LINK-type protocol can be used for ringlet links in a non-symmetrical distributed-memory multiprocessor system. 5 refs.

  19. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    Science.gov (United States)

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  20. Research of the self-healing technologies in the optical communication network of distribution automation

    Science.gov (United States)

    Wang, Hao; Zhong, Guoxin

    2018-03-01

    Optical communication networks are the mainstream communication technique for distribution automation, and self-healing technologies can significantly improve their reliability. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access layer, the backbone layer and the core layer of optical communication networks for distribution automation. Based on a comparative analysis, the paper gives application suggestions for these self-healing technologies.
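The basic self-healing idea in a ring topology, which underlies several of the technologies this record surveys, is that traffic between two nodes normally takes one arc of the ring and wraps the other way when a link on that arc fails. A toy illustration follows (this models the routing idea only, not any specific SDH/MSTP protection protocol):

```python
# Toy ring self-healing: traffic between two nodes normally takes the
# shorter arc; if a link on that arc fails, it wraps the other way.
def ring_path(n_nodes, src, dst, failed_link=None):
    def arc(step):
        path, node = [src], src
        while node != dst:
            node = (node + step) % n_nodes
            path.append(node)
        return path
    # try the shorter arc first, then the longer one
    for path in sorted((arc(1), arc(-1)), key=len):
        links = {frozenset(p) for p in zip(path, path[1:])}
        if failed_link is None or frozenset(failed_link) not in links:
            return path
    return None  # both arcs broken: ring cannot self-heal
```

For a six-node ring, `ring_path(6, 0, 2)` takes the short arc, while failing link (1, 2) forces the wrap-around path.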

  1. Distributed microprocessor automation network for synthesizing radiotracers used in positron emission tomography

    International Nuclear Information System (INIS)

    Russell, J.A.G.; Alexoff, D.L.; Wolf, A.P.

    1984-01-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. 20 refs. (DT)

  2. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high-throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework, that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that pose different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational efforts in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
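One of the optimization techniques this record names is simulated annealing. A generic sketch of the annealing loop is below; the quadratic "sensor error" objective and the flow-rate interpretation are illustrative stand-ins, not the record's actual calibration model:

```python
import math
import random

def anneal(cost, start, neighbor, t0=1.0, cooling=0.95, steps=200, seed=0):
    """Generic simulated-annealing loop: accept downhill moves always,
    uphill moves with probability exp(-delta/T), and cool T each step."""
    rng = random.Random(seed)
    x, best = start, start
    t = t0
    for _ in range(steps):
        cand = neighbor(x, rng)
        delta = cost(cand) - cost(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < cost(best):
                best = x  # track the best state ever visited
        t *= cooling
    return best

# Toy objective: find the flow rate minimizing a quadratic "sensor error".
cost = lambda q: (q - 3.2) ** 2
best = anneal(cost, start=0.0,
              neighbor=lambda q, rng: q + rng.uniform(-0.5, 0.5))
```

The early high-temperature phase lets the search escape local minima; the late low-temperature phase behaves almost greedily.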

  3. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers and RFID middleware systems, information on the identity and movement of tagged objects can be used to trigger business transactions. These features change the way business applications deal with the physical world, from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logics into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. To address the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
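The two-block buffering idea mentioned in this record can be sketched as a pair of event blocks, where queries for delayed effects only ever scan the current and the previous block. The class below is an invented illustration of that idea, not the paper's actual mechanism:

```python
class TwoBlockBuffer:
    """Sketch of two-block buffering: recent events live in a 'current'
    block; when it fills, it becomes the 'previous' block, so lookups
    for delayed effects scan at most two bounded blocks."""
    def __init__(self, block_size=4):
        self.block_size = block_size
        self.current, self.previous = [], []

    def append(self, event):
        if len(self.current) >= self.block_size:
            self.previous = self.current   # rotate; the older block is dropped
            self.current = []
        self.current.append(event)

    def recent(self, tag_id):
        # query over the two retained blocks only
        return [e for e in self.previous + self.current if e[0] == tag_id]

buf = TwoBlockBuffer(block_size=2)
for ev in [("tag1", "read"), ("tag2", "read"), ("tag1", "moved")]:
    buf.append(ev)
```

Because queries never scan the full event history, lookup cost stays bounded regardless of how many events have flowed through the edge system.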

  4. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Full Text Available Following the rapid growth of ubiquitous computing, many jobs that were previously manual have now been automated. This automation has increased the amount of time available for leisure; diverse services are now being developed for this leisure time. In addition, the development of small and portable devices like smartphones, diverse Internet services can be used regardless of time and place. Studies regarding diverse virtualization are currently in progress. These studies aim to determine ways to efficiently store and process the big data generated by the multitude of devices and services in use. One topic of such studies is desktop storage virtualization, which integrates distributed desktop resources and provides these resources to users to integrate into distributed legacy desktops via virtualization. In the case of desktop storage virtualization, high availability of virtualization is necessary and important for providing reliability to users. Studies regarding hierarchical structures and resource integration are currently in progress. These studies aim to create efficient data distribution and storage for distributed desktops based on resource integration environments. However, studies regarding efficient responses to server faults occurring in desktop-based resource integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS for high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments. It also activates alternative servers when a fault occurs within a system.

  5. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
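The finite-state checking with retries that this record credits for its high efficiency can be sketched as a job that advances through a linear chain of states and retries a state on transient failure. The state names and failure model below are invented for illustration, not STAR's actual workflow:

```python
# Minimal finite-state job sketch: states advance linearly; a transient
# failure sends the job back to the same state for a bounded retry.
STATES = ["staged", "submitted", "running", "archived"]

def run_job(flaky_steps, max_retries=3):
    """flaky_steps maps a state name to how many times that state fails
    before succeeding (a stand-in for fallible infrastructure)."""
    attempts = dict(flaky_steps)
    state_idx, retries = 0, 0
    while STATES[state_idx] != "archived":
        state = STATES[state_idx]
        if attempts.get(state, 0) > 0:       # transient failure: retry
            attempts[state] -= 1
            retries += 1
            if retries > max_retries:
                return "failed", retries
            continue
        state_idx += 1                       # success: advance one state
    return "archived", retries

final, retries = run_job({"running": 2})
```

A job whose "running" state fails twice still reaches the terminal state after two retries, which is the resilience property the record describes.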

  6. Automated Finite State Workflow for Distributed Data Production

    International Nuclear Information System (INIS)

    Hajdu, L; Didenko, L; Lauret, J; Betts, W; Amol, J; Jang, H J; Noh, S Y

    2016-01-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ∼400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure. (paper)

  7. Power system voltage stability and agent based distribution automation in smart grid

    Science.gov (United States)

    Nguyen, Cuong Phuc

    2011-12-01

    Our interconnected electric power system is presently facing many challenges that it was not originally designed and engineered to handle. The increased inter-area power transfers, aging infrastructure, and old technologies, have caused many problems including voltage instability, widespread blackouts, slow control response, among others. These problems have created an urgent need to transform the present electric power system to a highly stable, reliable, efficient, and self-healing electric power system of the future, which has been termed "smart grid". This dissertation begins with an investigation of voltage stability in bulk transmission networks. A new continuation power flow tool for studying the impacts of generator merit order based dispatch on inter-area transfer capability and static voltage stability is presented. The load demands are represented by lumped load models on the transmission system. While this representation is acceptable in traditional power system analysis, it may not be valid in the future smart grid where the distribution system will be integrated with intelligent and quick control capabilities to mitigate voltage problems before they propagate into the entire system. Therefore, before analyzing the operation of the whole smart grid, it is important to understand the distribution system first. The second part of this dissertation presents a new platform for studying and testing emerging technologies in advanced Distribution Automation (DA) within smart grids. Due to the key benefits over the traditional centralized approach, namely flexible deployment, scalability, and avoidance of single-point-of-failure, a new distributed approach is employed to design and develop all elements of the platform. A multi-agent system (MAS), which has the three key characteristics of autonomy, local view, and decentralization, is selected to implement the advanced DA functions. The intelligent agents utilize a communication network for cooperation and

  8. Control and management of distribution system with integrated DERs via IEC 61850 based communication

    Directory of Open Access Journals (Sweden)

    Ikbal Ali

    2017-06-01

    Full Text Available Distributed Energy Resources (DERs) are being increasingly integrated into distribution systems, resulting in complex power-flow scenarios. In such cases, effective control, management and protection of distribution systems becomes highly challenging. Standardized and interoperable communication in distribution systems has the potential to address these challenges and achieve higher energy efficiency and reliability. Edition 2 of the IEC 61850 standards for utility automation standardizes the exchange of data among different substations, DERs, control centers, PMUs and PDCs. This paper demonstrates the modelling of the information and services needed for control, management and protection of distribution systems with integrated DERs. It uses IP tunnels and/or mapping over the IP layer for transferring IEC 61850 messages, such as sampled values (SVs) and GOOSE (Generic Object Oriented Substation Event) messages, over the distribution-system Wide Area Network (WAN). Finally, the performance of the proposed communication configurations for different applications is analyzed by calculating End-to-End (ETE) delay, throughput and jitter.
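The three performance metrics this record evaluates (ETE delay, jitter, throughput) can be computed from per-message send/receive timestamps. The sketch below uses common textbook definitions (mean one-way delay, mean delay variation between consecutive messages, bytes over elapsed time); the record itself does not specify its exact formulas, and the sample values are invented:

```python
# Sketch of the performance metrics named in the record, computed from
# per-message (t_sent, t_received) timestamps in seconds.
def performance(samples, bytes_per_msg=100):
    delays = [rx - tx for tx, rx in samples]
    ete = sum(delays) / len(delays)                 # mean end-to-end delay
    jitter = sum(abs(a - b) for a, b in zip(delays, delays[1:])) \
        / (len(delays) - 1)                         # mean delay variation
    span = samples[-1][1] - samples[0][0]           # total elapsed time
    throughput = len(samples) * bytes_per_msg / span  # bytes per second
    return ete, jitter, throughput

ete, jitter, thr = performance([(0.0, 0.010), (0.1, 0.112), (0.2, 0.209)])
```

For the three sample messages, the per-message delays are 10, 12 and 9 ms, giving a mean ETE delay just above 10 ms and a jitter of 2.5 ms.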

  9. Energy Production System Management - Renewable energy power supply integration with Building Automation System

    International Nuclear Information System (INIS)

    Figueiredo, Joao; Martins, Joao

    2010-01-01

    Intelligent buildings, historically and technologically, refers to the integration of four distinctive systems: Building Automation Systems (BAS), Telecommunication Systems, Office Automation Systems and Computer Building Management Systems. The increasingly sophisticated BAS has become the 'heart and soul' of modern intelligent buildings. Integrating energy supply and demand elements - often known as Demand-Side Management (DSM) - has become an important energy efficiency policy concept. Nowadays, European countries have diversified their power supplies, reducing the dependence on OPEC, and developing a broader mix of energy sources maximizing the use of renewable energy domestic sources. In this way it makes sense to include a fifth system into the intelligent building group: Energy Production System Management (EPSM). This paper presents a Building Automation System where the Demand-Side Management is fully integrated with the building's Energy Production System, which incorporates a complete set of renewable energy production and storage systems.

  10. Automated leak localization performance without detailed demand distribution data

    NARCIS (Netherlands)

    Moors, Janneke; Scholten, L.; van der Hoek, J.P.; den Besten, J.

    2018-01-01

    Automatic leak localization has been suggested to reduce the time and personnel effort needed to localize (small) leaks. Yet, the available methods require a detailed demand distribution model for successful calibration and good leak localization performance. The main aim of this work was

  11. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  12. Relay Protection and Automation Systems Based on Programmable Logic Integrated Circuits

    International Nuclear Information System (INIS)

    Lashin, A. V.; Kozyrev, A. V.

    2015-01-01

    One of the most promising approaches to developing the hardware of relay protection and automation devices is considered. The advantages of choosing programmable logic integrated circuits to obtain adaptive technological algorithms in power system protection and control systems are pointed out. The technical difficulties that currently stand in the way of such relay protection and automation systems are indicated, and a new technology for solving these problems is presented. Particular attention is devoted to the possibility of reconfiguring the logic of these devices using programmable logic integrated circuits

  13. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  14. An integrated system for buildings’ energy-efficient automation: Application in the tertiary sector

    International Nuclear Information System (INIS)

    Marinakis, Vangelis; Doukas, Haris; Karakosta, Charikleia; Psarras, John

    2013-01-01

    Highlights: ► We developed interactive software for building automation systems. ► Monitoring of energy consumption in real time. ► Optimization of energy consumption by implementing appropriate control scenarios. ► Pilot appraisal of remote control of active systems in a tertiary-sector building. ► Significant decrease in energy and operating cost of the A/C system. -- Abstract: Although integrated building automation systems have become increasingly popular, a system that includes remote-control technology for real-time monitoring of energy consumption by end-users, as well as optimization functions, is still required. To respond to this common interest, the main aim of the paper is to present an integrated system for buildings’ energy-efficient automation. The proposed system is based on a prototype software tool for the simulation and optimization of energy consumption in the building sector, enhancing the interactivity of building automation systems. The system can incorporate energy-efficient automation functions for heating, cooling and/or lighting based on recent guidance and decisions of the National Law, the EN 15232 energy-efficiency requirements and the ISO 50001 Energy Management Standard, among others. The presented system was applied to a supermarket building in Greece and focused on the remote control of active systems.

  15. Superconducting power distribution structure for integrated circuits

    International Nuclear Information System (INIS)

    Ruby, R.C.

    1991-01-01

    This patent describes a superconducting power distribution structure for an integrated circuit. It comprises a first superconducting capacitor plate; a second superconducting capacitor plate provided with electrical isolation means within the second capacitor plate; dielectric means separating the first capacitor plate from the second capacitor plate; first via means coupled at a first end to the first capacitor plate and extending through the dielectric and the electrical isolation means of the second capacitor plate; first contact means coupled to a second end of the first via means; and second contact means coupled to the second capacitor plate such that the first contact means and the second contact means are accessible from the same side of the second capacitor plate

  16. The optimal number, type and location of devices in automation of electrical distribution networks

    Directory of Open Access Journals (Sweden)

    Popović Željko N.

    2015-01-01

    This paper presents a mixed integer linear programming based model for determining the optimal number, type and location of remotely controlled and supervised devices in distribution networks in the presence of distributed generators. The proposed model considers a number of different devices simultaneously (remotely controlled circuit breakers/reclosers, sectionalizing switches, and remotely supervised and local fault passage indicators) along with the following: expected outage cost to consumers and producers due to momentary and long-term interruptions; automated device expenses (capital investment, installation, and annual operation and maintenance costs); and the number and expenses of crews involved in the isolation and restoration process. Furthermore, other possible benefits of each automated device are also taken into account (e.g., benefits due to decreasing the cost of switching operations in normal conditions). The obtained numerical results emphasize the importance of considering different types of automation devices simultaneously. They also show that the proposed approach has the potential to improve the process of determining the best automation strategy in real-life distribution networks.
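The optimization described in this abstract can be illustrated on a toy instance. The sketch below is an assumption-laden simplification: a single radial feeder, one candidate remote-switch position between each pair of adjacent sections, invented fault rates, loads and costs, and exhaustive search standing in for the mixed integer linear program.

```python
from itertools import combinations

# Hypothetical data for a 4-section radial feeder. A remote switch can be
# placed at position s (between section s-1 and section s); all rates,
# loads and costs are invented for illustration.
fault_rate = [0.2, 0.3, 0.1, 0.4]        # faults/year per section
load_kw    = [500, 300, 800, 200]        # load per section
OUTAGE_COST_PER_KWH = 2.0                # assumed cost of energy not served
REPAIR_HOURS = 4.0                       # manual isolation and repair
SWITCH_HOURS = 0.5                       # restoration via a remote switch
DEVICE_COST = 900.0                      # annualized cost per switch

def expected_cost(placement):
    """Annual device cost plus expected outage cost: a healthy section j is
    restored quickly after a fault in section i only if some remote switch
    lies between them; the faulted section itself always waits for repair."""
    total = DEVICE_COST * len(placement)
    for i, rate in enumerate(fault_rate):
        for j, load in enumerate(load_kw):
            lo, hi = sorted((i, j))
            isolated = i != j and any(lo < s <= hi for s in placement)
            hours = SWITCH_HOURS if isolated else REPAIR_HOURS
            total += rate * load * hours * OUTAGE_COST_PER_KWH
    return total

# On this toy instance, exhaustive search over all switch placements stands
# in for the MILP used in the paper.
best = min((p for k in range(len(load_kw))
            for p in combinations(range(1, len(load_kw)), k)),
           key=expected_cost)
print(best, round(expected_cost(best), 1))
```

The real model adds device types, crew costs and momentary-interruption terms; the point here is only the trade-off between device expenses and expected outage cost.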

  17. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Seppaelae, A.; Kekkonen, V.; Koreneff, G. [VTT Energy, Espoo (Finland)

    1996-12-31

    In the de-regulated electricity market, the power trading companies have to face new problems. The biggest challenges are caused by the uncertainty in the load magnitudes. In order to minimize the risks in power purchase and also in retail sales, the power traders should have as reliable and accurate estimates for hourly demands of their customers as possible. New tools have been developed for the distribution load estimation and for the management of energy balances of the trading companies. These tools are based on the flexible combination of the information available from several sources, like direct customer measurements, network measurements, load models and statistical data. These functions also serve as an information source for higher level activities of the electricity selling companies. These activities and the associated functions have been studied in the prototype system called DEM, which is now being developed for the operation of Finnish utilities in the newly de-regulated power market
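The "flexible combination of the information available from several sources" can be sketched as a simple inverse-variance fusion of hourly demand estimates. All numbers, source names and the weighting rule below are illustrative assumptions, not the DEM system's actual method.

```python
# Invented hourly estimates (kW) for one customer group, each with an
# assumed standard deviation: a statistical load model, a network
# measurement allocated to the group, and a sample of metered customers.
sources = {
    "load_model":    (420.0, 60.0),
    "network_meas":  (465.0, 25.0),
    "sample_meters": (450.0, 35.0),
}

def fuse(estimates):
    """Inverse-variance weighting: more accurate sources get more weight."""
    weights = {k: 1.0 / (sd * sd) for k, (_, sd) in estimates.items()}
    total_w = sum(weights.values())
    value = sum(w * estimates[k][0] for k, w in weights.items()) / total_w
    return value, (1.0 / total_w) ** 0.5

est, sd = fuse(sources)
print(f"fused hourly demand: {est:.1f} kW (sd {sd:.1f} kW)")
```

The fused estimate always lies between the source estimates, and its standard deviation is smaller than that of the best single source.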

  18. Development of methods for DSM and distribution automation planning

    International Nuclear Information System (INIS)

    Lehtonen, M.; Seppaelae, A.; Kekkonen, V.; Koreneff, G.

    1996-01-01

    In the de-regulated electricity market, the power trading companies have to face new problems. The biggest challenges are caused by the uncertainty in the load magnitudes. In order to minimize the risks in power purchase and also in retail sales, the power traders should have as reliable and accurate estimates for hourly demands of their customers as possible. New tools have been developed for the distribution load estimation and for the management of energy balances of the trading companies. These tools are based on the flexible combination of the information available from several sources, like direct customer measurements, network measurements, load models and statistical data. These functions also serve as an information source for higher level activities of the electricity selling companies. These activities and the associated functions have been studied in the prototype system called DEM, which is now being developed for the operation of Finnish utilities in the newly de-regulated power market

  19. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Seppaelae, A; Kekkonen, V; Koreneff, G [VTT Energy, Espoo (Finland)

    1997-12-31

    In the de-regulated electricity market, the power trading companies have to face new problems. The biggest challenges are caused by the uncertainty in the load magnitudes. In order to minimize the risks in power purchase and also in retail sales, the power traders should have as reliable and accurate estimates for hourly demands of their customers as possible. New tools have been developed for the distribution load estimation and for the management of energy balances of the trading companies. These tools are based on the flexible combination of the information available from several sources, like direct customer measurements, network measurements, load models and statistical data. These functions also serve as an information source for higher level activities of the electricity selling companies. These activities and the associated functions have been studied in the prototype system called DEM, which is now being developed for the operation of Finnish utilities in the newly de-regulated power market

  20. Enhanced distribution automation system based on new communication technology at ENEL

    International Nuclear Information System (INIS)

    Comellini, E.; Gargiuli, R.; Gelli, G.; Tonon, R.; Mirandola, P.

    1991-01-01

    On the basis of extensive research aimed at assessing the feasibility of a cost-effective two-way telecommunications system, and of the experience gained during the eighties in the remote control of the primary distribution network, where new digital techniques were introduced, and in metering apparatus, where about 7,000 HV and MV customers were equipped with Ferraris meters associated with electronic devices for the application of multirate tariffs, ENEL (Italian National Electricity Board) has designed a new distribution automation system aimed at remote control of the MV distribution network and at MV and LV customer meter service automation. This report describes the key choices that determined the architecture of the new system and the most important features of its main components, in view of improving energy usage efficiency, providing better service to customers, and increasing simplicity and transparency in customer relationships.

  1. Enhanced distribution automation system based on new communications technology at ENEL

    Energy Technology Data Exchange (ETDEWEB)

    Comellini, E; Gargiuli, R; Gelli, G; Tonon, R; Mirandola, P

    1992-12-31

    On the basis of extensive research aimed at assessing the feasibility of a cost-effective two-way telecommunications system, and of the experience gained during the eighties in the remote control of the primary distribution network, where new digital techniques were introduced, and in metering apparatus, where about 7,000 HV and MV customers were equipped with Ferraris meters associated with electronic devices for the application of multirate tariffs, ENEL (Italian National Electricity Board) has designed a new distribution automation system aimed at remote control of the MV distribution network and at MV and LV customer meter service automation. This report describes the key choices that determined the architecture of the new system and the most important features of its main components, in view of improving energy usage efficiency, providing better service to customers, and increasing simplicity and transparency in customer relationships.

  2. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    Science.gov (United States)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  3. The Semantic Automated Discovery and Integration (SADI Web service Design-Pattern, API and Reference Implementation

    Directory of Open Access Journals (Sweden)

    Wilkinson Mark D

    2011-10-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system.
Finally, we show that, using SADI, data dynamically generated from Web services

  4. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Science.gov (United States)

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  5. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    Science.gov (United States)

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in
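The core SADI convention, a service that consumes an instance of an input OWL class and returns that same instance decorated with new properties, can be mimicked in plain Python. This is a loose analogy with mocked-up data, not the real SADI API or an RDF implementation.

```python
# Loose sketch of the SADI pattern: the service's contract is "take a thing
# of the input class, return the same thing annotated with new properties".
# OWL instances are mocked here as dicts; the lookup table is invented.
def gene_name_service(instance):
    """Input class: thing with a 'gene_id'; output: same thing, annotated."""
    lookup = {"BRCA2": "breast cancer 2, DNA repair associated"}  # toy data
    annotated = dict(instance)
    annotated["gene_name"] = lookup.get(instance["gene_id"], "unknown")
    return annotated

record = {"uri": "http://example.org/gene/BRCA2", "gene_id": "BRCA2"}
print(gene_name_service(record))
```

Because the output is the decorated input (same URI, new properties), a client can chain such services into workflows automatically, which is the behaviour the abstract describes.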

  6. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

    Access to information is necessary, but not sufficient, in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretation and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.
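The idea of granules generated from the inherent structure of a resource and dynamically combined can be sketched with paragraph-level granules and an inverted index. The document texts and identifiers below are invented for illustration.

```python
# Split documents into paragraph granules using their inherent structure
# (blank lines), then build an inverted index so user-defined terms can
# dynamically recombine granules across resources. Texts are made up.
docs = {
    "treaty_art3": "Scientific observations shall be exchanged.\n\n"
                   "Personnel may be exchanged between expeditions.",
    "recommendation_12": "Observations of seabirds shall be reported.",
}

index = {}
for doc_id, text in docs.items():
    for i, para in enumerate(text.split("\n\n")):
        granule_id = f"{doc_id}#p{i}"
        for token in para.lower().replace(".", "").split():
            index.setdefault(token, set()).add(granule_id)

# dynamically combine: granules mentioning both terms
hits = index["observations"] & index["exchanged"]
print(sorted(hits))
```

Each granule keeps its provenance (`doc#paragraph`), so combinations reveal relationships between resources without altering the preserved originals.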

  7. An Integrated Systems Approach: A Description of an Automated Circulation Management System.

    Science.gov (United States)

    Seifert, Jan E.; And Others

    These bidding specifications describe requirements for a turn-key automated circulation system for the University of Oklahoma Libraries. An integrated systems approach is planned, and requirements are presented for various subsystems: acquisitions, fund accounting, reserve room, and bibliographic and serials control. Also outlined are hardware…

  8. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  9. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  10. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
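As a flavor of the data-reduction step listed in the abstract, here is principal component analysis reduced to its core: power iteration for the leading eigenvector of the covariance matrix, on a classic ten-sample toy dataset. This is an illustration of the method, not Automics code.

```python
# Toy PCA: find the leading principal component of mean-centered samples by
# power iteration on the covariance matrix. The data are a standard
# two-variable teaching dataset, not NMR bins.
data = [[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
        [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9]]

n, d = len(data), len(data[0])
means = [sum(row[j] for row in data) / n for j in range(d)]
x = [[row[j] - means[j] for j in range(d)] for row in data]

# sample covariance matrix
cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
        for b in range(d)] for a in range(d)]

# power iteration converges to the dominant eigenvector (the first PC)
v = [1.0] * d
for _ in range(100):
    w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
    norm = sum(c * c for c in w) ** 0.5
    v = [c / norm for c in w]

print([round(c, 3) for c in v])
```

Projecting each centered sample onto `v` would give the one-dimensional scores a metabonomics analysis then clusters or classifies.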

  11. Energy Systems Integration: Demonstrating Distributed Resource Communications

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    Overview fact sheet about the Electric Power Research Institute (EPRI) and Schneider Electric Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project at the Energy Systems Integration Facility. INTEGRATE is part of the U.S. Department of Energy's Grid Modernization Initiative.

  12. Overview of NREL Distribution Grid Integration Cost Projects

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mather, Barry A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Denholm, Paul L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-12

    This presentation was given at the 2017 NREL Workshop 'Benchmarking Distribution Grid Integration Costs Under High Distributed PV Penetrations.' It provides a brief overview of recent and ongoing NREL work on distribution system grid integration costs, as well as challenges and needs from the community.

  13. Thermal Distribution System | Energy Systems Integration Facility | NREL

    Science.gov (United States)

    The Energy Systems Integration Facility's thermal distribution bus distributes chilled and heated water to the facility's laboratories. The 60-ton chiller cools water with continuous thermal control, operating at loads as low as 10% of its full load level.

  14. Advanced fighter technology integration (AFTI)/F-16 Automated Maneuvering Attack System final flight test results

    Science.gov (United States)

    Dowden, Donald J.; Bessette, Denis E.

    1987-01-01

    The AFTI F-16 Automated Maneuvering Attack System has undergone developmental and demonstration flight testing over a total of 347.3 flying hours in 237 sorties. The emphasis of this phase of the flight test program was on the development of automated guidance and control systems for air-to-air and air-to-ground weapons delivery, using a digital flight control system, dual avionics multiplex buses, an advanced FLIR sensor with laser ranger, integrated flight/fire-control software, advanced cockpit display and controls, and modified core Multinational Stage Improvement Program avionics.

  15. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format, after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can

  16. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since states of failure occurrence are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations, and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
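The Markovian half of such an approach rests on standard repairable-system theory. For a single machine with assumed failure rate λ and repair rate μ, steady-state availability has the closed form A = μ/(λ+μ); the sketch below cross-checks it by iterating the two-state chain. The rates are invented, and this is far simpler than the paper's multi-AGV model.

```python
# Assumed rates for one repairable vehicle (events per hour); illustrative
# numbers, not taken from the paper.
lam, mu = 0.01, 0.5    # failure rate, repair rate

def steady_state_availability(lam, mu):
    """Closed-form steady state of the two-state (up/down) Markov chain."""
    return mu / (lam + mu)

# Numerical cross-check: iterate the discretized chain until it settles.
dt, p_up = 0.01, 1.0
for _ in range(200_000):
    p_up = p_up * (1 - lam * dt) + (1 - p_up) * mu * dt

print(round(steady_state_availability(lam, mu), 4), round(p_up, 4))
```

With these rates both routes give an availability of about 0.98; the paper's hybrid replaces the transient part of such computations with a trained neural network.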

  17. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
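A minimal sketch of the triage idea follows: between visits, readings forwarded through a phone-based bridge are summarized so the care team can prioritize patients. The record format, thresholds and patient names are all assumptions, not the published implementation.

```python
# Hypothetical triage over forwarded CGM readings (mg/dL per patient).
readings = {
    "patient_a": [110, 125, 140, 135, 118],
    "patient_b": [220, 250, 265, 240, 255],
}

def triage(samples, high=180, frac=0.5):
    """Flag a patient when more than `frac` of readings exceed `high`."""
    above = sum(1 for s in samples if s > high)
    return "review" if above / len(samples) > frac else "routine"

summary = {p: triage(s) for p, s in readings.items()}
print(summary)
```

In the paper's workflow such analytics would run inside the EHR on passively synced device data, rather than as a stand-alone script.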

  18. Automated Integration of Dedicated Hardwired IP Cores in Heterogeneous MPSoCs Designed with ESPAM

    Directory of Open Access Journals (Sweden)

    Ed Deprettere

    2008-06-01

    This paper presents a methodology and techniques for the automated integration of dedicated hardwired (HW) IP cores into heterogeneous multiprocessor systems. We propose an IP core integration approach based on the generation of an HW module that consists of a wrapper around a predefined IP core. This approach has been implemented in a tool called ESPAM for automated multiprocessor system design, programming, and implementation. In order to preserve the high performance of the integrated IP cores, the structure of the IP core wrapper is devised in a way that adequately represents and efficiently implements the main characteristics of the formal model of computation, namely Kahn process networks, which we use as the underlying programming model in ESPAM. We present details about the structure of the HW module, the supported types of IP cores, and the minimum interfaces these IP cores have to provide in order to allow automated integration in heterogeneous multiprocessor systems generated by ESPAM. The ESPAM design flow, the multiprocessor platforms we consider, and the underlying programming (KPN) model are introduced as well. Furthermore, we demonstrate the efficiency of our approach by applying our methodology and the ESPAM tool to automatically generate, implement, and program heterogeneous multiprocessor systems that integrate dedicated IP cores and execute real-life applications.
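The Kahn-process-network semantics that the wrapper implements in hardware (processes communicating only through FIFOs with blocking reads) can be modeled in software. The sketch below is a software analogue using threads and queues, not ESPAM output.

```python
# Two KPN-style processes: a producer writes tokens into a FIFO, a consumer
# performs a blocking read on it, matching the deterministic dataflow the
# IP-core wrapper has to preserve in hardware.
import queue
import threading

fifo = queue.Queue()

def producer():
    for i in range(5):
        fifo.put(i * i)          # write token
    fifo.put(None)               # end-of-stream marker

results = []
def consumer():
    while True:
        token = fifo.get()       # blocking read, as in a KPN
        if token is None:
            break
        results.append(token + 1)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
print(results)
```

Because each process reads blockingly from its FIFOs and never polls, the output is the same regardless of thread scheduling, which is the KPN determinism property the wrapper relies on.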

  19. Automated Peak Picking and Peak Integration in Macromolecular NMR Spectra Using AUTOPSY

    Science.gov (United States)

    Koradi, Reto; Billeter, Martin; Engeli, Max; Güntert, Peter; Wüthrich, Kurt

    1998-12-01

    A new approach for automated peak picking of multidimensional protein NMR spectra with strong overlap is introduced, which makes use of the program AUTOPSY (automated peak picking for NMR spectroscopy). The main elements of this program are a novel function for local noise level calculation, the use of symmetry considerations, and the use of lineshapes extracted from well-separated peaks for resolving groups of strongly overlapping peaks. The algorithm generates peak lists with precise chemical shift and integral intensities, and a reliability measure for the recognition of each peak. The results of automated peak picking of NOESY spectra with AUTOPSY were tested in combination with the combined automated NOESY cross peak assignment and structure calculation routine NOAH implemented in the program DYANA. The quality of the resulting structures was found to be comparable with those from corresponding data obtained with manual peak picking.
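The basic ingredients (a noise-level estimate plus local-maximum detection and integration) can be shown on a synthetic 1D spectrum. This toy omits AUTOPSY's local noise maps, symmetry checks and lineshape deconvolution; the peak positions, heights and threshold factor are invented.

```python
import math
import statistics

def gauss(x, pos, height, width):
    return height * math.exp(-((x - pos) / width) ** 2)

# Synthetic spectrum: two peaks plus a small deterministic "noise" ripple.
xs = [i * 0.01 for i in range(1000)]
spectrum = [gauss(x, 3.0, 10.0, 0.05) + gauss(x, 7.2, 6.0, 0.05) +
            0.02 * math.sin(999.0 * x) for x in xs]

# Estimate the noise level from a peak-free region (AUTOPSY does this locally).
noise = 5.0 * statistics.pstdev(spectrum[:100])

peaks = []
for i in range(1, len(xs) - 1):
    y = spectrum[i]
    if y > noise and y >= spectrum[i - 1] and y > spectrum[i + 1]:
        lo, hi = max(0, i - 10), min(len(xs), i + 11)   # +/- 10-point window
        area = sum(spectrum[lo:hi]) * 0.01              # integral intensity
        peaks.append((round(xs[i], 2), round(area, 3)))

print(peaks)
```

The output pairs each detected position with an integrated intensity, the two quantities a downstream assignment routine such as NOAH consumes.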

  20. Load Segmentation for Convergence of Distribution Automation and Advanced Metering Infrastructure Systems

    Science.gov (United States)

    Pamulaparthy, Balakrishna; KS, Swarup; Kommu, Rajagopal

    2014-12-01

    Distribution automation (DA) applications are today limited to the feeder level, with no visibility beyond the substation feeder and no reach down to the low-voltage distribution network. This has become a major obstacle to realizing many automated functions and enhancing existing DA capabilities. Advanced metering infrastructure (AMI) systems are being widely deployed by utilities across the world, creating system-wide communications access to every monitoring and service point and collecting data from smart meters and sensors at short intervals, in response to utility needs. The convergence of DA and AMI systems provides unique opportunities and capabilities for distribution grid modernization, with the DA system acting as a controller and the AMI system providing feedback to it; for this, DA applications have to understand and use the AMI data selectively and effectively. In this paper, we propose a load segmentation method that helps the DA system accurately understand and use AMI data for various automation applications, with a suitable case study on power restoration.
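A load segmentation of the kind described above could, for instance, cluster AMI-derived daily load profiles into customer classes. The sketch below uses plain k-means with deterministic farthest-point seeding; this is an illustrative choice, not the segmentation method proposed in the paper.

```python
import numpy as np

def segment_loads(profiles, k, iters=50):
    """Cluster daily load profiles (rows = meters, cols = hourly kWh).
    Farthest-point initialization (deterministic), then standard k-means."""
    centers = [profiles[0]]
    for _ in range(1, k):
        # Next seed: the profile farthest from all chosen centers.
        d = np.min([np.linalg.norm(profiles - c, axis=1) for c in centers], axis=0)
        centers.append(profiles[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(profiles[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([profiles[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

The resulting segment centroids (typical profiles) are what a DA controller could consult when deciding, say, restoration priority or expected pickup load per segment.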

  1. Automated element identification for EDS spectra evaluation using quantification and integrated spectra simulation approaches

    International Nuclear Information System (INIS)

    Eggert, F

    2010-01-01

    This work describes the first truly automated solution for the qualitative evaluation of EDS spectra in X-ray microanalysis. It uses a combination of integrated standardless quantitative evaluation, computation of analytical errors into a final uncertainty, and parts of recently developed simulation approaches. Multiple spectrum-reconstruction assessments and peak searches of the residual spectrum are powerful enough to answer the qualitative analytical question automatically for totally unknown specimens. The integrated quantitative assessment improves the confidence of the qualitative analysis. The qualitative element analysis thereby becomes part of an integrated quantitative spectrum evaluation, in which the quantitative results are used to iteratively refine element decisions, spectrum deconvolution, and simulation steps.
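The residual-peak-search loop can be illustrated with a toy 1D model: reconstruct the spectrum as a sum of Gaussian lines for the elements accepted so far, then repeatedly search the residual for the strongest remaining peak. The Gaussian lineshape, the noise estimate from an assumed line-free region, and the thresholds are all simplifying assumptions, not the actual EDS evaluation.

```python
import numpy as np

def gaussian(e, center, sigma, amp):
    return amp * np.exp(-0.5 * ((e - center) / sigma) ** 2)

def identify_lines(energy, spectrum, candidates, sigma=0.07, snr=3.0, max_iter=50):
    """Iteratively accept candidate lines: subtract the reconstructed lines,
    then search the residual spectrum for the strongest remaining peak.
    candidates: {element_name: line_energy_keV} (hypothetical table)."""
    accepted = {}
    residual = spectrum.astype(float).copy()
    noise = np.std(residual[:20])  # crude noise estimate, assumes a line-free start
    for _ in range(max_iter):
        i = int(np.argmax(residual))
        if residual[i] < snr * noise:
            break  # nothing left above the noise floor
        # Assign the residual peak to the nearest candidate line energy.
        name, center = min(candidates.items(), key=lambda kv: abs(kv[1] - energy[i]))
        if abs(center - energy[i]) > 3 * sigma:
            break  # strongest residual matches no known line
        amp = residual[i]
        accepted[name] = accepted.get(name, 0.0) + amp
        residual -= gaussian(energy, center, sigma, amp)
    return accepted
```

In a real system the subtraction step would use physically modeled lineshapes and line families per element rather than a single Gaussian, but the accept-reconstruct-research loop is the same shape.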

  2. Functional integration of automated system databases by means of artificial intelligence

    Science.gov (United States)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

    The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of using such databases in systems with a fuzzy implementation of functions were analyzed. Requirements for the normalization of such databases were defined. The question of data equivalence under uncertainty, and of collisions arising when the databases are functionally integrated, is considered, and a model to reveal their possible occurrence is devised. The paper also presents a method for evaluating the normalization of the integrated database.

  3. Integrating distributed Bayesian inference and reinforcement learning for sensor management

    NARCIS (Netherlands)

    Grappiolo, C.; Whiteson, S.; Pavlin, G.; Bakker, B.

    2009-01-01

    This paper introduces a sensor management approach that integrates distributed Bayesian inference (DBI) and reinforcement learning (RL). DBI is implemented using distributed perception networks (DPNs), a multiagent approach to performing efficient inference, while RL is used to automatically

  4. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes is discussed. The limitations of using neural networks for control purposes are pointed out, and a different technique, evolutionary computation, is discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods are presented. A framework for an integrated system, using both neural networks and evolutionary computation, is proposed to identify the process and then control the product quality in a dynamic, multivariable system in real time.

  5. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. The requirements on software quality are therefore very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  6. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.

  7. A user view of office automation or the integrated workstation

    Science.gov (United States)

    Schmerling, E. R.

    1984-01-01

    Central databases are useful only if they are kept up to date and easily accessible in an interactive (query) mode, rather than in monthly reports that may be out of date and must be searched by hand. The concepts of automatic data capture, database management and query languages require good communications and readily available workstations to be useful. The minimal necessary workstation is a personal computer, which can be an important office tool if connected to other office machines and properly integrated into an office system. It has a great deal of flexibility and can often be tailored to suit the tastes, work habits and requirements of the user. Unlike dumb terminals, there is less tendency to saturate a central computer, since its free-standing capabilities are available after downloading a selection of data. The PC also permits the sharing of many other facilities, like larger computing power, sophisticated graphics programs, laser printers and communications. It can provide rapid access to common databases able to provide more up-to-date information than printed reports. Portable computers can access the same familiar office facilities from anywhere in the world where a telephone connection can be made.

  8. Development of an integrated automated retinal surgical laser system.

    Science.gov (United States)

    Barrett, S F; Wright, C H; Oberg, E D; Rockwell, B A; Cain, C; Rylander, H G; Welch, A J

    1996-01-01

    Researchers at the University of Texas and the USAF Academy have worked toward the development of a retinal robotic laser system. The overall goal of this ongoing project is to precisely place laser lesions and control their depth for the treatment of various retinal diseases such as diabetic retinopathy and retinal tears. Separate low-speed prototype subsystems have been developed to control lesion depth using lesion reflectance feedback parameters and to control lesion placement using retinal vessels as tracking landmarks. Both subsystems have been successfully demonstrated in vivo on pigmented rabbits using an argon continuous-wave laser. Preliminary testing on rhesus primate subjects has been accomplished with the CW argon laser and also the ultrashort-pulse laser. Recent efforts have concentrated on combining the two subsystems into a single prototype capable of simultaneously controlling both lesion depth and placement. We have designated this combined system CALOSOS, for Computer Aided Laser Optics System for Ophthalmic Surgery. Several interesting areas of study have developed in integrating the two subsystems: 1) "doughnut"-shaped lesions that occur under certain combinations of laser power, spot size, and irradiation time, complicating measurements of central lesion reflectance; 2) the optimal retinal field of view (FOV) to achieve both tracking and lesion parameter control; and 3) the development of a hybrid analog/digital tracker using confocal reflectometry to achieve retinal tracking speeds of up to 100 deg/s. This presentation will discuss these design issues of this clinically significant prototype system. Details of the hybrid prototype system are provided in "Hybrid Eye Tracking for Computer-Aided Retinal Surgery" at this conference. The paper will close with the remaining technical hurdles to clear prior to testing the full-up clinical prototype system.

  9. Automated integration of lidar into the LANDFIRE product suite

    Science.gov (United States)

    Peterson, Birgit; Nelson, Kurtis; Seielstad, Carl; Stoker, Jason M.; Jolly, W. Matt; Parsons, Russell

    2015-01-01

    Accurate information about three-dimensional canopy structure and wildland fuel across the landscape is necessary for fire behaviour modelling system predictions. Remotely sensed data are invaluable for assessing these canopy characteristics over large areas; lidar data, in particular, are uniquely suited for quantifying three-dimensional canopy structure. Although lidar data are increasingly available, they have rarely been applied to wildland fuels mapping efforts, mostly due to two issues. First, the Landscape Fire and Resource Planning Tools (LANDFIRE) program, which has become the default source of large-scale fire behaviour modelling inputs for the US, does not currently incorporate lidar data into the vegetation and fuel mapping process because spatially continuous lidar data are not available at the national scale. Second, while lidar data are available for many land management units across the US, these data are underutilized for fire behaviour applications. This is partly due to a lack of local personnel trained to process and analyse lidar data. This investigation addresses these issues by developing the Creating Hybrid Structure from LANDFIRE/lidar Combinations (CHISLIC) tool. CHISLIC allows individuals to automatically generate a suite of vegetation structure and wildland fuel parameters from lidar data and infuse them into existing LANDFIRE data sets. CHISLIC will become available for wider distribution to the public through a partnership with the U.S. Forest Service’s Wildland Fire Assessment System (WFAS) and may be incorporated into the Wildland Fire Decision Support System (WFDSS) with additional design and testing. WFAS and WFDSS are the primary systems used to support tactical and strategic wildland fire management decisions.

  10. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

    In order to maintain a competitive edge in a very active banking market the implementation of a web-based solution to standardize, optimize and manage the flow of sales / pre-sales and generating new leads is requested by a company. This article presents the realization of a development framework for software interoperability in the banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for ...

  11. Distributed GIS for automated natural hazard zonation mapping Internet-SMS warning towards sustainable society

    Directory of Open Access Journals (Sweden)

    Devanjan Bhattacharya

    2014-12-01

    Today, open systems are needed for real-time analysis of and warnings about geo-hazards; these can be built on an open-source Geographical Information System (GIS) platform such as GeoNode, which is being contributed to by developers around the world. Developing on an open-source platform is a vital component of better disaster information management as far as spatial data infrastructures are concerned, and becomes essential when huge databases, particularly satellite images and maps of locations, are to be created and consulted regularly for city planning at different scales. There is a great need for spatially referenced data creation, analysis, and management. GeoNode, being an open-source platform, facilitates the creation, sharing, and collaborative use of geospatial data. The objective is the development of an automated natural hazard zonation system with Internet-short message service (SMS) warning, utilizing geomatics for sustainable societies. This research puts forward the concept of an internet-resident geospatial geohazard warning system that can communicate alerts via SMS. There has been a need for an automated integrated system that categorizes hazards and issues warnings reaching users directly; at present, no web-enabled warning system exists that can disseminate warnings after hazard evaluation in one go and in real time. The objective of this research work has been to formalize the notion of an integrated, independent, generalized, and automated geo-hazard warning system making use of geo-spatial data on a widely used platform. In this paper, a model of such an automated geo-spatial hazard warning system is elaborated. The functionality is modular in architecture, comprising GIS graphical user interface (GUI), input, understanding, rainfall prediction, expert, output, and warning modules.

  12. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of the process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS-485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
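As a hedged sketch of the Modbus framing such a PAS relies on, the helper below builds a "Read Holding Registers" (function 0x03) request in the Modbus-TCP (MBAP) framing; on a 4-wire RS-485 backbone the RTU variant would drop the MBAP header and append a CRC-16 instead. The transaction ID, unit ID, and register addresses are arbitrary example values.

```python
import struct

def modbus_read_holding(tx_id, unit, start_addr, count):
    """Build a Modbus-TCP 'Read Holding Registers' (0x03) request frame.
    PDU:  function code (1 byte), start address (2), register count (2).
    MBAP: transaction id (2), protocol id 0x0000 (2), length (2), unit id (1)."""
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    mbap = struct.pack(">HHHB", tx_id, 0x0000, len(pdu) + 1, unit)
    return mbap + pdu
```

For example, `modbus_read_holding(1, 17, 0x006B, 3)` asks unit 17 for three registers starting at 0x006B; the length field counts the unit-ID byte plus the PDU, which is why it is `len(pdu) + 1`.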

  13. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of the process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS-485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  14. Network-Based Real-time Integrated Fire Detection and Alarm (FDA) System with Building Automation

    Science.gov (United States)

    Anwar, F.; Boby, R. I.; Rashid, M. M.; Alam, M. M.; Shaikh, Z.

    2017-11-01

    Fire alarm systems have become an increasingly important lifesaving technology in many aspects, such as applications to detect, monitor and control any fire hazard. Large sums of money are spent annually to install and maintain fire alarm systems in buildings to protect property and lives from the unexpected spread of fire. Several methods have already been developed, and they are improving on a daily basis, to reduce cost as well as increase quality. An integrated Fire Detection and Alarm (FDA) system with building automation was studied, to reduce cost and improve reliability by preventing false alarms. This work proposes an improved framework for an FDA system to ensure a robust intelligent network of FDA control panels in real time. A shortest-path algorithm was chosen for a series of buildings connected by a fiber-optic network. The framework shares information and communicates with each fire alarm panel connected in a peer-to-peer configuration, and declares the network state using network address declarations from any building connected to the network. The fiber-optic connection was proposed to reduce signal noise, thus enabling large area coverage, real-time communication and long-term safety. Based on this proposed method, an experimental setup was designed and a prototype system was developed to validate the performance in practice. A distributed network system was also proposed, connecting an optional remote monitoring terminal panel, to validate the proposed network's performance and to ensure fire survivability where information is transmitted sequentially. The proposed FDA system differs from traditional fire alarm and detection systems in topology, as it manages a group of buildings in an optimal and efficient manner.
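The shortest-path routing among fiber-linked panels can be sketched with a standard Dijkstra implementation; the graph encoding and link costs below are illustrative assumptions, not the paper's actual topology.

```python
import heapq

def dijkstra(graph, src):
    """Single-source shortest paths.
    graph: {node: [(neighbor, cost), ...]} -- e.g. FDA panels linked by fiber,
    with cost standing in for link length or latency (illustrative)."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

With such a distance table, each panel can pick the cheapest route for forwarding alarm state to its peers or to a remote monitoring terminal.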

  15. Market Integration Dynamics and Asymptotic Price Convergence in Distribution

    NARCIS (Netherlands)

    A. García-Hiernaux (Alfredo); D.E. Guerrero (David); M.J. McAleer (Michael)

    2013-01-01

    In this paper we analyse the market integration process of the relative price distribution, develop a model to analyze market integration, and present a formal test of increasing market integration. We distinguish between the economic concepts of price convergence in mean and in

  16. Toward automated interpretation of integrated information: Managing "big data" for NDE

    Science.gov (United States)

    Gregory, Elizabeth; Lesthaeghe, Tyler; Holland, Stephen

    2015-03-01

    Large scale automation of NDE processes is rapidly maturing, thanks to recent improvements in robotics and the rapid growth of computer power over the last twenty years. It is fairly straightforward to automate NDE data collection itself, but the process of NDE remains largely manual. We will discuss three threads of technological needs that must be addressed before we are able to perform automated NDE. Spatial context, the first thread, means that each NDE measurement taken is accompanied by metadata that locates the measurement with respect to the 3D physical geometry of the specimen. In this way, the geometry of the specimen acts as a database key. Data context, the second thread, means that we record why the data was taken and how it was measured in addition to the NDE data itself. We will present our software tool that helps users interact with data in context, Databrowse. Condition estimation, the third thread, is maintaining the best possible knowledge of the condition (serviceability, degradation, etc.) of an object or part. In the NDE context, we can prospectively use Bayes' Theorem to integrate the data from each new NDE measurement with prior knowledge. These tools, combined with robotic measurements and automated defect analysis, will provide the information needed to make high-level life predictions and focus NDE measurements where they are needed most.
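The prospective Bayes' Theorem step for condition estimation can be written down directly; the condition states and probabilities in the usage example are invented for illustration.

```python
def bayes_update(prior, likelihoods):
    """Posterior over condition states given one NDE measurement.
    prior:       {state: P(state)} -- current belief about the part's condition.
    likelihoods: {state: P(measurement | state)} -- from the NDE technique's
                 probability-of-detection characterization (assumed known)."""
    unnorm = {s: prior[s] * likelihoods[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}
```

Applied sequentially, each new measurement's posterior becomes the prior for the next, which is exactly the "integrate each new NDE measurement with prior knowledge" loop described above.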

  17. Automated estimation of choroidal thickness distribution and volume based on OCT images of posterior visual section.

    Science.gov (United States)

    Vupparaboina, Kiran Kumar; Nizampatnam, Srinath; Chhablani, Jay; Richhariya, Ashutosh; Jana, Soumya

    2015-12-01

    A variety of vision ailments are indicated by anomalies in the choroid layer of the posterior visual section. Consequently, choroidal thickness and volume measurements, usually performed by experts based on optical coherence tomography (OCT) images, have assumed diagnostic significance. Now, to save precious expert time, it has become imperative to develop automated methods. To this end, one requires choroid outer boundary (COB) detection as a crucial step, where difficulty arises as the COB divides the choroidal granularity and the scleral uniformity only notionally, without marked brightness variation. In this backdrop, we measure the structural dissimilarity between choroid and sclera by structural similarity (SSIM) index, and hence estimate the COB by thresholding. Subsequently, smooth COB estimates, mimicking manual delineation, are obtained using tensor voting. On five datasets, each consisting of 97 adult OCT B-scans, automated and manual segmentation results agree visually. We also demonstrate close statistical match (greater than 99.6% correlation) between choroidal thickness distributions obtained algorithmically and manually. Further, quantitative superiority of our method is established over existing results by respective factors of 27.67% and 76.04% in two quotient measures defined relative to observer repeatability. Finally, automated choroidal volume estimation, being attempted for the first time, also yields results in close agreement with that of manual methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
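The SSIM index used above to measure choroid-sclera dissimilarity has a standard closed form (Wang et al.); a minimal single-window version is sketched below with the usual constants. The actual method applies it in local windows across the B-scan and thresholds the resulting map, followed by tensor voting, which is not reproduced here.

```python
import numpy as np

def ssim(x, y, data_range=255.0, k1=0.01, k2=0.03):
    """SSIM between two equally sized patches (single global window).
    Stabilizing constants c1, c2 follow the standard k1/k2 convention."""
    c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return (((2 * mx * my + c1) * (2 * cov + c2))
            / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))
```

A patch compared with itself scores 1; structurally unrelated patches score near 0, which is what makes a low SSIM a usable marker of the boundary between choroidal granularity and scleral uniformity.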

  18. [Quality of buffy-coat-derived platelet concentrates prepared using the automated system Terumo Automated Centrifuge and Separator Integration (TACSI)].

    Science.gov (United States)

    Zebrowska, Agnieszka; Lipska, Alina; Rogowska, Anna; Bujno, Magdalena; Nedzi, Marta; Radziwon, Piotr

    2011-03-01

    Platelet recovery, viability and function are strongly dependent on the method of preparation of the platelet concentrate (PC). Glucose consumption, a decrease in pH, and the release of alpha granules during storage impair the clinical effectiveness of platelet concentrates. The aim was to compare the quality of buffy-coat-derived platelet concentrates prepared using the automated system Terumo Automated Centrifuge and Separator Integration (TACSI) and stored over 7 days. PCs were prepared from buffy coats using either a manual method (group I) or the automated TACSI system (group II). Fifteen PCs, each prepared from 5 buffy coats, were stored over 7 days at 22-24 degrees C and tested. Samples were taken from the PC containers on days 1 and 7. The following laboratory tests were performed: number of platelets, platelet-derived microparticles, CD62P expression, platelet adhesion, pH, glucose, and lactate dehydrogenase activity. We observed higher expression of CD62P in PCs prepared using the manual method compared to the PCs produced automatically. Platelet recovery was significantly higher in PCs prepared using the automated system than with the manual method. Compared to manual methods, the automated system for processing buffy coats is more efficient and enables the production of platelet concentrates of higher quality.

  19. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world, because of its flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and, as a result, bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation of the influence of full-converter-based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as the short-circuit capacity of the external grid, the WT integration location, and the connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...
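A much-simplified per-unit view of why DG integration changes fault currents: the external grid feeds the fault through the source and line impedances, while a DG unit at the fault bus adds its own contribution. Modeling a full-converter WT as a voltage source behind an impedance is a deliberate oversimplification (converters typically limit their injection to near rated current), and all values are illustrative.

```python
def fault_current(v_pu, z_grid, z_line, z_dg=None):
    """Symmetrical three-phase fault current (per unit) at a feeder point.
    The grid feeds the fault through z_grid + z_line; an optional DG source
    connected at the fault bus adds v_pu / z_dg. Impedances are complex
    per-unit values. Note: a real full-converter WT would current-limit its
    contribution; this naive source-behind-impedance model does not."""
    i_grid = v_pu / (z_grid + z_line)
    i_dg = v_pu / z_dg if z_dg is not None else 0.0
    return i_grid + i_dg
```

Even this toy model shows the protection-coordination issue: the relay at the substation sees only `i_grid`, while the total current at the fault is larger, so settings tuned without DG can misgrade.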

  20. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old-fashioned' traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  1. Prospective validation of a near real-time EHR-integrated automated SOFA score calculator.

    Science.gov (United States)

    Aakre, Christopher; Franco, Pablo Moreno; Ferreyra, Micaela; Kitson, Jaben; Li, Man; Herasevich, Vitaly

    2017-07-01

    We created an algorithm for automated Sequential Organ Failure Assessment (SOFA) score calculation within the Electronic Health Record (EHR) to facilitate detection of sepsis based on the Third International Consensus Definitions for Sepsis and Septic Shock (SEPSIS-3) clinical definition. We evaluated the accuracy of near real-time and daily automated SOFA score calculation compared with manual score calculation. Automated SOFA scoring computer programs were developed using available EHR data sources and integrated into a critical-care-focused patient care dashboard at Mayo Clinic in Rochester, Minnesota. We prospectively compared the accuracy of automated versus manual calculation for a sample of patients admitted to the medical intensive care unit at Mayo Clinic Hospitals in Rochester, Minnesota and Jacksonville, Florida. Agreement was calculated with Cohen's kappa statistic. The reason for each discrepancy was tabulated during manual review. Random spot-check comparisons were performed 134 times on 27 unique patients, and daily SOFA score comparisons were performed for 215 patients over a total of 1206 patient days. Agreement between automatically and manually scored SOFA components for both random spot checks (696 pairs, κ=0.89) and daily calculation (5972 pairs, κ=0.89) was high. The most common discrepancies were in the respiratory component (inaccurate fraction of inspired oxygen retrieval; 200/1206) and creatinine (normal creatinine in patients with no urine output on dialysis; 128/1094). 147 patients were at risk of developing sepsis after intensive care unit admission; 10 later developed sepsis confirmed by chart review. All were identified before the onset of sepsis with the ΔSOFA≥2 point criterion, and 46 patients were false positives. Near real-time automated SOFA scoring was found to have strong agreement with manual score calculation and may be useful for the detection of sepsis utilizing the new SEPSIS-3 definition. Copyright © 2017 Elsevier B.V. All rights reserved.
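The agreement statistic reported above, Cohen's kappa, can be computed as follows; the paired ratings in the usage example are invented, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two paired categorical ratings, e.g. automated vs
    manual SOFA component scores. kappa = (p_o - p_e) / (1 - p_e), where p_o
    is observed agreement and p_e is the agreement expected by chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Kappa corrects raw percent agreement for chance, which matters for SOFA components because many patients score 0 on most components, so raw agreement alone would look inflated.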

  2. Using an integrated automated system to optimize retention and increase frequency of blood donations.

    Science.gov (United States)

    Whitney, J Garrett; Hall, Robert F

    2010-07-01

    This study examines the impact of an integrated, automated phone system to reinforce retention and increase frequency of donations among blood donors. Refined using data collected over the past 7 years, the system uses computerized phone messaging to contact blood donors with individualized, multilevel notifications. Donors are contacted at planned intervals to acknowledge and recognize their donations, informed where their blood was sent, asked to participate in a survey, and reminded when they are eligible to donate again. The report statistically evaluates the impact of the various components of the system on donor retention and blood donations and quantifies the fiscal advantages to blood centers. By using the information and support systems provided by the automated services, and then involving phlebotomists and recruiters in reinforcing donor retention, both retention and donations will increase. © 2010 American Association of Blood Banks.

  3. A Study on Integrated Control Network for Multiple Automation Services-1st year report

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, D.H.; Park, B.S.; Kim, M.S.; Lim, Y.H.; Ahn, S.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report describes the development of the Integrated and Intelligent Gateway (IIG), which is under development. The network operating technique in this report can identify the causes of communication faults and can avoid communication network faults in advance. Utility companies invest substantial money and time in supplying stable power. Since this is deeply related to the reliability of automation systems, it is natural to employ a fault-tolerant communication network for automation systems. Use of the network system developed in this report is not limited to DAS; it can be extended to many kinds of customer data services. Thus this report suggests a direction for communication network development. This 1st year report is composed of the following contents: 1) The introduction and problems of DAS. 2) The configuration and functions of IIG. 3) The protocols. (author). 27 refs., 73 figs., 6 tabs.

  4. Grid integrated distributed PV (GridPV).

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Matthew J.; Coogan, Kyle

    2013-08-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  5. Distributed and multi-core computation of 2-loop integrals

    International Nuclear Information System (INIS)

    De Doncker, E; Yuasa, F

    2014-01-01

    For an automatic computation of Feynman loop integrals in the physical region we rely on an extrapolation technique where the integrals of the sequence are obtained with iterated/repeated adaptive methods from the QUADPACK 1D quadrature package. The integration rule evaluations in the outer level, corresponding to independent inner integral approximations, are assigned to threads dynamically via the OpenMP runtime in the parallel implementation. Furthermore, multi-level (nested) parallelism enables an efficient utilization of hyperthreading or larger numbers of cores. For a class of loop integrals in the unphysical region, which do not suffer from singularities in the interior of the integration domain, we find that the distributed adaptive integration methods in the multivariate PARINT package are highly efficient and accurate. We apply these techniques without resorting to integral transformations and report on the capabilities of the algorithms and the parallel performance for a test set including various types of two-loop integrals
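The iterated/repeated approach described above can be illustrated with a small sketch: a 2D integral computed as an outer rule over independent inner 1D integrations, with the outer evaluations dispatched to a pool of workers. The adaptive Simpson routine here merely stands in for the QUADPACK integrators, and in CPython processes rather than threads would be needed for real parallel speedup; everything below is illustrative, not the PARINT code.

```python
# Iterated ("repeated") quadrature sketch: outer composite Simpson rule,
# one independent inner adaptive integration per outer node, evaluated
# concurrently -- the independence the text above exploits for parallelism.
from concurrent.futures import ThreadPoolExecutor

def adaptive_simpson(f, a, b, tol=1e-10):
    """1D adaptive Simpson quadrature with Richardson correction."""
    def simpson(fa, fm, fb, a, b):
        return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

    def recurse(a, b, fa, fm, fb, whole, tol):
        m = (a + b) / 2.0
        lm, rm = (a + m) / 2.0, (m + b) / 2.0
        flm, frm = f(lm), f(rm)
        left = simpson(fa, flm, fm, a, m)
        right = simpson(fm, frm, fb, m, b)
        if abs(left + right - whole) <= 15.0 * tol:
            return left + right + (left + right - whole) / 15.0
        return (recurse(a, m, fa, flm, fm, left, tol / 2.0) +
                recurse(m, b, fm, frm, fb, right, tol / 2.0))

    fa, fm, fb = f(a), f((a + b) / 2.0), f(b)
    return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, a, b), tol)

def integrate_2d(f, ax, bx, ay, by, n_outer=64, workers=4):
    """Outer: composite Simpson over n_outer panels (n_outer even).
    Inner: one independent adaptive integration per outer node."""
    h = (bx - ax) / n_outer
    xs = [ax + i * h for i in range(n_outer + 1)]
    inner = lambda x: adaptive_simpson(lambda y: f(x, y), ay, by)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        vals = list(pool.map(inner, xs))
    return (h / 3.0) * (vals[0] + vals[-1]
                        + 4.0 * sum(vals[1:-1:2])
                        + 2.0 * sum(vals[2:-1:2]))
```

For instance, `integrate_2d(lambda x, y: x * y, 0, 1, 0, 1)` recovers the exact value 0.25 because Simpson's rule is exact for low-degree polynomials.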

  6. Application of high performance asynchronous socket communication in power distribution automation

    Science.gov (United States)

    Wang, Ziyu

    2017-05-01

    With the development of information technology and Internet technology, and the growing demand for electricity, the stability and reliable operation of the power system have been the goal of power grid workers. With the advent of the era of big data, power data will gradually become an important means of guaranteeing the safe and reliable operation of the power grid. In the electric power industry, the question of how to efficiently and robustly receive the data transmitted by acquisition devices, so that the power distribution automation system can execute scientific decisions quickly, is therefore a central concern. In this paper, some existing problems in power system communication are analysed and, with the help of network technology, a solution based on asynchronous socket technology is proposed for network communication that requires high concurrency and high throughput. The paper also looks forward to the development of power distribution automation in the era of big data and artificial intelligence.
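A minimal sketch of the asynchronous-socket pattern discussed above, using Python's asyncio: one lightweight coroutine per connected acquisition device, so many concurrent connections can be served without a thread apiece. The newline-delimited JSON framing, field names, and ACK protocol are assumptions for illustration, not the paper's design.

```python
# Asynchronous socket server sketch: concurrent ingestion of measurement
# frames from many simulated acquisition devices. Framing (newline-delimited
# JSON) and field names are illustrative assumptions.
import asyncio
import json

RECEIVED = []

async def handle_device(reader, writer):
    # One coroutine per connected device; reads line-framed JSON until EOF.
    async for line in reader:
        if not line.strip():
            continue
        RECEIVED.append(json.loads(line))
        writer.write(b"ACK\n")
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main(n_devices=50):
    server = await asyncio.start_server(handle_device, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    async def device(i):
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        frame = {"device": i, "voltage_kv": 10.2}
        writer.write((json.dumps(frame) + "\n").encode())
        await writer.drain()
        assert await reader.readline() == b"ACK\n"
        writer.close()
        await writer.wait_closed()

    # All devices connect and transmit concurrently.
    await asyncio.gather(*(device(i) for i in range(n_devices)))
    server.close()
    await server.wait_closed()
    return len(RECEIVED)
```

Running `asyncio.run(main(50))` exercises 50 concurrent connections on a single thread; the same structure scales to far larger connection counts than a thread-per-connection design.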

  7. Implementation strategies for load center automation on the space station module/power management and distribution testbed

    Science.gov (United States)

    Watson, Karen

    1990-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) testbed was developed to study tertiary power management in the modules of large spacecraft. The main goal was to study automation techniques, not necessarily to develop flight-ready systems. Because of the confidence gained in many of the automation strategies investigated, it is appropriate to study implementation strategies in more detail in order to find better trade-offs for nearer-to-flight-ready systems. These trade-offs particularly concern the weight, volume, power consumption, and performance of the automation system. These systems, in their present implementation, are described.

  8. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution network with inverter based distributed generation. The similarity of equivalent model for inverter based distributed generation during normal and fault conditions of distribution network and the differences between power flow and short circuit calculation are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution network with inverte...

  9. Mathematical methods linear algebra normed spaces distributions integration

    CERN Document Server

    Korevaar, Jacob

    1968-01-01

    Mathematical Methods, Volume I: Linear Algebra, Normed Spaces, Distributions, Integration focuses on advanced mathematical tools used in applications and the basic concepts of algebra, normed spaces, integration, and distributions. The publication first offers information on algebraic theory of vector spaces and introduction to functional analysis. Discussions focus on linear transformations and functionals, rectangular matrices, systems of linear equations, eigenvalue problems, use of eigenvectors and generalized eigenvectors in the representation of linear operators, metric and normed vector

  10. Integration of 100% Micro-Distributed Energy Resources in the Low Voltage Distribution Network

    DEFF Research Database (Denmark)

    You, Shi; Segerberg, Helena

    2014-01-01

    The existing electricity infrastructure may to a great extent limit a high penetration of the micro-sized Distributed Energy Resources (DERs), due to physical bottlenecks, e.g. the thermal capacities of cables and transformers and the voltage limitations. In this study, the integration impacts of heat pumps (HPs) and plug-in electric vehicles (PEVs) at 100% penetration level on a representative urban residential low voltage (LV) distribution network of Denmark are investigated by performing a steady-state load flow analysis through an integrated simulation setup. Three DERs integration ... oriented integration strategies, having 100% integration of DER in the provided LV network is feasible.

  11. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2013-05-01

    Full Text Available In order to maintain a competitive edge in a very active banking market, a company may request the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for security and confidentiality of stored data and on presenting the identified techniques and procedures to implement these requirements.
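Two of the stored-data security techniques such a solution typically mandates can be sketched with the Python standard library: parameterized SQL statements, so user input is treated as data rather than spliced into query text, and salted, iterated password hashing for stored credentials. The table and column names below are hypothetical, not taken from the article.

```python
# Sketch of parameterized queries (against SQL injection) and salted
# PBKDF2 password hashing. Schema and names are illustrative only.
import sqlite3
import hashlib
import hmac
import os

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)")

def store_user(name, password):
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # '?' placeholders keep user input out of the SQL text entirely.
    conn.execute("INSERT INTO users VALUES (?, ?, ?)", (name, salt, pw_hash))

def verify_user(name, password):
    row = conn.execute("SELECT salt, pw_hash FROM users WHERE name = ?",
                       (name,)).fetchone()
    if row is None:
        return False
    salt, expected = row
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, expected)
```

Because the user name is bound as a parameter, a classic injection payload such as `alice' OR '1'='1` is simply looked up as a literal (nonexistent) name and authentication fails.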

  12. Drivers' communicative interactions: on-road observations and modelling for integration in future automation systems.

    Science.gov (United States)

    Portouli, Evangelia; Nathanael, Dimitris; Marmaras, Nicolas

    2014-01-01

    Social interactions with other road users are an essential component of the driving activity and may prove critical in view of future automation systems; still up to now they have received only limited attention in the scientific literature. In this paper, it is argued that drivers base their anticipations about the traffic scene to a large extent on observations of social behaviour of other 'animate human-vehicles'. It is further argued that in cases of uncertainty, drivers seek to establish a mutual situational awareness through deliberate communicative interactions. A linguistic model is proposed for modelling these communicative interactions. Empirical evidence from on-road observations and analysis of concurrent running commentary by 25 experienced drivers support the proposed model. It is suggested that the integration of a social interactions layer based on illocutionary acts in future driving support and automation systems will improve their performance towards matching human driver's expectations. Practitioner Summary: Interactions between drivers on the road may play a significant role in traffic coordination. On-road observations and running commentaries are presented as empirical evidence to support a model of such interactions; incorporation of drivers' interactions in future driving support and automation systems may improve their performance towards matching driver's expectations.

  13. Network integration of distributed power generation

    Science.gov (United States)

    Dondi, Peter; Bayoumi, Deia; Haederli, Christoph; Julian, Danny; Suter, Marco

    The world-wide move to deregulation of the electricity and other energy markets, concerns about the environment, and advances in renewable and high-efficiency technologies have led to major emphasis being placed on the use of small power generation units in a variety of forms. The paper reviews the position of distributed generation (DG, as these small units are called in contrast with central power plants) with respect to the installation and interconnection of such units with the classical grid infrastructure. In particular, the status of technical standards both in Europe and the USA, possible ways to improve the interconnection situation, and also the need for decisions that provide a satisfactory position for the network operator (who remains responsible for the grid, its operation, maintenance and investment plans) are addressed.

  14. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  15. Integrated Computing, Communication, and Distributed Control of Deregulated Electric Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bajura, Richard; Feliachi, Ali

    2008-09-24

    Restructuring of the electricity market has affected all aspects of the power industry from generation to transmission, distribution, and consumption. Transmission circuits, in particular, are stressed, often exceeding their stability limits, because of the difficulty of building new transmission lines due to environmental concerns and financial risk. Deregulation has resulted in the need for tighter control strategies to maintain reliability even in the event of considerable structural changes, such as loss of a large generating unit or a transmission line, and changes in loading conditions due to the continuously varying power consumption. Our research efforts under the DOE EPSCoR Grant focused on Integrated Computing, Communication and Distributed Control of Deregulated Electric Power Systems. This research is applicable to operating and controlling modern electric energy systems. The controls developed by APERC provide for a more efficient, economical, reliable, and secure operation of these systems. Under this program, we developed distributed control algorithms suitable for large-scale geographically dispersed power systems and also economic tools to evaluate their effectiveness and impact on power markets. Progress was made in the development of distributed intelligent control agents for reliable and automated operation of integrated electric power systems. The methodologies employed combine information technology, control and communication, agent technology, and power systems engineering. In the event of scheduled load changes or unforeseen disturbances, the power system is expected to minimize the effects and costs of disturbances and to maintain critical infrastructure operational.

  16. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria

    2016-01-01

    AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as the flexible use of opportunistic Cloud and HPC computing resources, the integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, the unified declaration of storage protocols required for PanDA Pilot site movers, and others.

  17. Distributed optical fiber sensors for integrated monitoring of railway infrastructures

    Science.gov (United States)

    Minardo, Aldo; Coscetta, Agnese; Porcaro, Giuseppe; Giannetta, Daniele; Bernini, Romeo; Zeni, Luigi

    2014-05-01

    We propose the application of a distributed optical fiber sensor based on stimulated Brillouin scattering, as an integrated system for safety monitoring of railway infrastructures. The strain distribution was measured dynamically along a 60 meters length of rail track, as well as along a 3-m stone arch bridge. The results indicate that distributed sensing technology is able to provide useful information in railway traffic and safety monitoring.

  18. Understanding reliance on automation: effects of error type, error distribution, age and experience

    Science.gov (United States)

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over-relying on it during non-alarm states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  19. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, is established.

  20. MIDAS: Automated Approach to Design Microwave Integrated Inductors and Transformers on Silicon

    Directory of Open Access Journals (Sweden)

    L. Aluigi

    2013-09-01

    Full Text Available The design of modern radiofrequency integrated circuits on silicon operating at microwave and millimeter-waves requires the integration of several spiral inductors and transformers that are not commonly available in the process design kits of the technologies. In this work we present an auxiliary CAD tool for Microwave Inductor (and transformer) Design Automation on Silicon (MIDAS) that exploits commercial simulators and allows the implementation of an automatic design flow, including three-dimensional layout editing and electromagnetic simulations. In detail, MIDAS allows the designer to derive a preliminary sizing of the inductor (transformer) on the basis of the design entries (specifications). It draws the inductor (transformer) layers for the specific process design kit, including vias and underpasses, with or without a patterned ground shield, and launches the electromagnetic simulations, achieving effective design automation with respect to the traditional design flow for RFICs. With the present software suite the complete design time is reduced significantly (typically 1 hour on a PC based on an Intel® Pentium® Dual 1.80 GHz CPU with 2 GB of RAM). Afterwards both the device equivalent circuit and the layout are ready to be imported into the Cadence environment.
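A preliminary sizing step of the kind described above can be illustrated with the well-known "modified Wheeler" inductance estimate for a square planar spiral (Mohan et al., 1999). The geometry values in the example are made up, and the formula is only a first-order estimate that EM simulation would then refine; this is not the MIDAS code itself.

```python
# Modified Wheeler inductance estimate for a square planar spiral
# (Mohan et al., 1999). Illustrative sizing step only; an EM simulator
# would refine the result, as in the flow described above.
from math import pi

MU0 = 4 * pi * 1e-7  # vacuum permeability, H/m

def square_spiral_inductance(n_turns, d_out, d_in):
    """Modified Wheeler estimate for a square spiral; dimensions in metres."""
    k1, k2 = 2.34, 2.75                      # coefficients for square layout
    d_avg = (d_out + d_in) / 2.0
    rho = (d_out - d_in) / (d_out + d_in)    # fill ratio
    return k1 * MU0 * n_turns**2 * d_avg / (1.0 + k2 * rho)
```

For a hypothetical 5-turn spiral with 200 µm outer and 100 µm inner diameter, the estimate comes out in the mid-single-digit nanohenry range, a typical value for RFIC spirals.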

  1. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that make it possible to apply Software Engineering techniques oriented to achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  2. An automated system for the correlation measurement of γ-quanta energy distribution

    International Nuclear Information System (INIS)

    Ofengenden, R.G.; Berezin, F.N.; Patlan', Yu.V.; Shalejko, A.M.; Shidlyk, A.M.; Shchur, A.M.

    1983-01-01

    Hardware and software are described in brief for an automated system, to measure the energy and time distributions of gamma-quanta, which ensures accumulation and preliminary processing of experimental data while realizing various physical techniques for investigation. The system is based on the SM-4 computer and electronic-physical equipment produced in the CAMAC standard. In the SM-4 computer the RAFOS operating system is employed, which has some advantages in solving the tasks of multidimensional data acquisition and analysis when fast response and real-time operation are required. Certain software components were developed and included in the system: an operating-system version with a larger set of drivers, adapted to the equipment configuration used; a library of macro definitions and a service object library; a subsystem for tuning and testing; and a subsystem for data acquisition and initial processing.

  3. Performance of integrated systems of automated roller shade systems and daylight responsive dimming systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung-Chul; Choi, An-Seop; Jeong, Jae-Weon [Department of Architectural Engineering, Sejong University, Kunja-Dong, Kwangjin-Gu, Seoul (Korea, Republic of); Lee, Eleanor S. [Building Technologies Department, Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2011-03-15

    Daylight responsive dimming systems have been used in few buildings to date because their reliability requires improvement. The key factor underlying poor performance is the variability of the ratio of the photosensor signal to daylight workplane illuminance with sun position, sky condition, and fenestration condition. This paper therefore describes integrated systems combining automated roller shade systems and daylight responsive dimming systems with an improved closed-loop proportional control algorithm, and the relative performance of the integrated systems and the single systems. The concept of the improved closed-loop proportional control algorithm for the integrated systems is to predict the varying correlation of photosensor signal to daylight workplane illuminance according to roller shade height and sky conditions, in order to improve system accuracy. In this study, the performance of the integrated systems with two improved closed-loop proportional control algorithms was compared with that of the current (modified) closed-loop proportional control algorithm. In the results, the average maintenance percentage and the average discrepancies of the target illuminance, as well as the average time under 90% of target illuminance, improved significantly for the integrated systems in comparison with the current closed-loop proportional control algorithm for daylight responsive dimming systems as a single system. (author)
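The idea behind closed-loop proportional control can be sketched generically (this is not the paper's improved algorithm): the photosensor signal is assumed linear in the electric-light dimming fraction, the electric gain is calibrated once at night, and each control step estimates the daylight contribution and adjusts the dimming fraction to meet the setpoint. All names and numbers below are illustrative.

```python
# Generic closed-loop proportional dimming sketch (NOT the paper's
# improved algorithm). Assumes the photosensor signal is linear in the
# electric-light fraction f, with the electric gain calibrated at night.

def calibrate_night(signal_full_electric):
    """Photosensor signal at f = 1 with no daylight gives the electric
    gain seen by the ceiling photosensor."""
    return signal_full_electric

def next_dim_fraction(signal, f_current, gain_electric, signal_target):
    """One control step: estimate the daylight part of the current signal,
    then pick f so that electric light plus daylight meets the setpoint."""
    daylight = max(0.0, signal - gain_electric * f_current)
    f_new = (signal_target - daylight) / gain_electric
    return min(1.0, max(0.0, f_new))
```

With an electric gain of 400 photosensor units and the same value as setpoint, a daylight contribution of 150 units settles the loop at a dimming fraction of 0.625 in one step; the paper's improvement replaces the fixed gain with a correlation predicted from shade height and sky condition.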

  4. Distribution grid reconfiguration reduces power losses and helps integrate renewables

    International Nuclear Information System (INIS)

    Lueken, Colleen; Carvalho, Pedro M.S.; Apt, Jay

    2012-01-01

    A reconfigurable network can change its topology by opening and closing switches on power lines. We use real wind, solar, load, and cost data and a model of a reconfigurable distribution grid to show that reconfiguration allows a grid operator to reduce operational losses as well as to accept more intermittent renewable generation than a static configuration can. Net present value analysis of automated switch technology shows that the return on investment is negative for this test network when considering only loss reduction, but that the investment is attractive under certain conditions when reconfiguration is used to minimize curtailment. - Highlights: ► Reconfiguration may reduce losses in grids with solar or wind distributed generation. ► Reconfigurable networks can accept more solar or wind DG than static ones. ► Using reconfiguration for loss reduction would not create a positive ROI. ► Using reconfiguration to reduce curtailment usually would create a positive ROI.
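The loss-reduction mechanism can be illustrated with a toy model: loads can be switched between two feeders, and the configuration minimizing total I²R losses is chosen by brute force. Real reconfiguration additionally enforces radiality and voltage constraints, as the study's network model does; all figures here are made up.

```python
# Toy reconfiguration-for-loss-reduction sketch: assign each load to one
# of two feeders and pick the switch configuration with minimum I^2 R loss.
from itertools import product

def feeder_losses(assignment, loads, r1, r2):
    """assignment[i] in {0, 1}: which feeder serves load i (amps).
    Total I^2 R losses for feeder resistances r1, r2 (ohms)."""
    i1 = sum(a for a, f in zip(loads, assignment) if f == 0)
    i2 = sum(a for a, f in zip(loads, assignment) if f == 1)
    return i1 ** 2 * r1 + i2 ** 2 * r2

def best_configuration(loads, r1, r2):
    # Brute force over all switch states (fine for a toy network; real
    # feeders also need radiality and voltage checks).
    return min(product((0, 1), repeat=len(loads)),
               key=lambda a: feeder_losses(a, loads, r1, r2))
```

Balancing loads of 30, 20 and 10 A across two 0.5 Ω feeders halves losses relative to serving everything from one feeder (1800 W down to 900 W), which is the effect reconfiguration exploits as generation and load patterns shift.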

  5. Rules for integrals over products of distributions from coordinate independence of path integrals

    International Nuclear Information System (INIS)

    Kleinert, H.; Chervyakov, A.

    2001-01-01

    In perturbative calculations of quantum-mechanical path integrals in curvilinear coordinates, one encounters Feynman diagrams involving multiple temporal integrals over products of distributions which are mathematically undefined. In addition, there are terms proportional to powers of Dirac δ-functions at the origin coming from the measure of path integration. We derive simple rules for dealing with such singular terms from the natural requirement of coordinate independence of the path integrals. (orig.)

  6. Context-awareness in task automation services by distributed event processing

    OpenAIRE

    Coronado Barrios, Miguel; Bruns, Ralf; Dunkel, Jürgen; Stipković, Sebastian

    2014-01-01

    Everybody has to coordinate several tasks everyday, usually in a manual manner. Recently, the concept of Task Automation Services has been introduced to automate and personalize the task coordination problem. Several user centered platforms and applications have arisen in the last years, that let their users configure their very own automations based on third party services. In this paper, we propose a new system architecture for Task Automation Services in a heterogeneous mobile, smart devic...

  7. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues
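The normalized radial-density measurement described above can be sketched for an idealized circular nucleus: each detected bright feature's distance from the perimeter toward the center is normalized to [0, 1] and binned. The synthetic points stand in for detected NuMA features; the real pipeline first segments DAPI-stained nuclei from 3D confocal stacks, and a full density measure would also normalize each bin by its shell area.

```python
# Sketch of the perimeter-to-center radial binning of bright features,
# for an idealized circular nucleus. Illustrative only: real nuclei are
# irregular 3D shapes and densities are normalized by shell area.
from math import hypot

def radial_density(features, center, radius, n_bins=5):
    """features: (x, y) points inside a circular nucleus.
    Returns per-bin feature counts over normalized perimeter-to-center
    distance (bin 0 = perimeter, last bin = center)."""
    counts = [0] * n_bins
    for x, y in features:
        r = hypot(x - center[0], y - center[1])
        depth = 1.0 - min(r / radius, 1.0)   # 0 at perimeter, 1 at center
        counts[min(int(depth * n_bins), n_bins - 1)] += 1
    return counts
```

Comparing such profiles across conditions is what lets the analysis detect, for example, NuMA redistribution during acinar morphogenesis while malignant cells show a flat, unchanged profile.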

  8. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  9. Integrals over products of distributions and coordinate independence of zero-temperature path integrals

    International Nuclear Information System (INIS)

    Kleinert, H.; Chervyakov, A.

    2003-01-01

    In perturbative calculations of quantum-statistical zero-temperature path integrals in curvilinear coordinates one encounters Feynman diagrams involving multiple temporal integrals over products of distributions, which are mathematically undefined. In addition, there are terms proportional to powers of Dirac δ-functions at the origin coming from the measure of path integration. We give simple rules for integrating products of distributions in such a way that the results ensure coordinate independence of the path integrals. The rules are derived by using equations of motion and partial integration, while keeping track of certain minimal features originating in the unique definition of all singular integrals in 1-ε dimensions. Our rules yield the same results as the much more cumbersome calculations in 1-ε dimensions where the limit ε→0 is taken at the end. They also agree with the rules found in an independent treatment on a finite time interval

  10. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    Science.gov (United States)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  11. Working toward integrated models of alpine plant distribution.

    Science.gov (United States)

    Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe

    2013-10-01

    Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.

  12. [Analysis of foreign experience of usage of automation systems of medication distribution in prevention and treatment facilities].

    Science.gov (United States)

    Miroshnichenko, Iu V; Umarov, S Z

    2012-12-01

    One way to increase the effectiveness and safety of patients' medication supply is the use of automated distribution systems, which substantially increase the efficiency and safety of medication supply, achieve significant savings in the material and financial resources spent on medication assistance, and make possible the systematic improvement of its accessibility and quality.

  13. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    OpenAIRE

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and s...

  14. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices.

    Science.gov (United States)

    Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S

    2013-02-15

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.

  15. A Cost Effective Security Technology Integrated with RFID Based Automated Toll Collection System

    Directory of Open Access Journals (Sweden)

    Rafiya Hossain

    2017-09-01

    Full Text Available Crime statistics and research on criminology show that under similar circumstances, crimes are more likely to occur in developing countries than in developed countries due to their lack of security measures. Transport crimes on highways and bridges are among the most common crimes in developing nations. Automation of various systems, such as the toll collection system, is being introduced in developing countries to avoid corruption in toll collection, decrease cost, and increase operational efficiency. The goal of this research is to find an integrated solution that enhances security along with the advantages of automated toll collection. Inspired by the availability of many security systems, this research presents a system that can block a specific vehicle or a particular type of vehicle at the toll booths based on directives from the law enforcement agencies. The heart of the system is based on RFID (Radio Frequency Identification) technology. In this system, by sending a text message, the law enforcement agency or the authority that controls the toll booths can prevent the barrier from being lifted even after deduction of the toll charge if the passing vehicle has a security issue. The designed system should help the effort of reducing transport crimes on the highways and bridges of developing countries.
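    The barrier-control logic this abstract describes can be sketched as follows; the tag IDs, balances, and message strings are illustrative assumptions, not the authors' implementation:

    ```python
    # Sketch of the booth decision rule: the barrier lifts only if the tag is
    # funded AND the vehicle is not on a law-enforcement blocklist that can be
    # updated by text message. All names and values here are hypothetical.

    accounts = {"TAG-1001": 50.0, "TAG-1002": 5.0}   # prepaid toll accounts
    blocklist = set()                                 # vehicles flagged by SMS directive
    TOLL = 10.0

    def receive_sms_directive(tag_id):
        """Law-enforcement directive: hold this vehicle at the booth."""
        blocklist.add(tag_id)

    def vehicle_at_booth(tag_id):
        if accounts.get(tag_id, 0.0) < TOLL:
            return "insufficient balance: barrier stays down"
        accounts[tag_id] -= TOLL          # toll is deducted either way...
        if tag_id in blocklist:           # ...but a flagged vehicle is still held
            return "blocked by directive: barrier stays down"
        return "barrier lifted"

    receive_sms_directive("TAG-1001")
    print(vehicle_at_booth("TAG-1001"))   # held even after the charge is deducted
    print(vehicle_at_booth("TAG-1002"))
    ```

    The key property claimed in the abstract is visible in the ordering: deduction happens before the blocklist check, so the barrier stays down for a flagged vehicle even after the toll charge.
    
    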

  16. The Space Station Module Power Management and Distribution automation test bed

    Science.gov (United States)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  17. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    Directory of Open Access Journals (Sweden)

    Shan Yang

    2016-01-01

    Full Text Available Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. The similarity of the equivalent model for inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. An integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation is then proposed. The proposed method lets the inverter-based distributed generation be equivalent to an Iθ bus, which makes it suitable for calculating the power flow of a distribution network with current-limited inverter-based distributed generation. The low-voltage ride-through capability of inverter-based distributed generation can be considered as well. Finally, tests of power flow and short circuit current calculation are performed on a 33-bus distribution network. The results from the proposed method are contrasted with those of the traditional method and the simulation method, which verifies the effectiveness of the integrated method suggested in this paper.
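    As a rough illustration of the idea (not the authors' algorithm), the sketch below runs a backward/forward-sweep power flow on a tiny hypothetical radial feeder, with the inverter-based DG modelled as a current-limited injection whose phase tracks the local voltage angle, i.e. an Iθ-style bus; all impedances and loads are invented per-unit values:

    ```python
    import cmath

    # Hypothetical 3-bus radial feeder (per-unit); bus 0 is the slack at 1.0 pu.
    z = [0.02 + 0.06j, 0.03 + 0.09j]          # branch i connects bus i to bus i+1
    s_load = [0.0, 0.5 + 0.2j, 0.4 + 0.15j]   # constant-power loads

    # Inverter-based DG at bus 2 as an Iθ-style bus: fixed current magnitude,
    # phase tied to the local voltage angle (unity power factor injection).
    i_dg_mag, dg_bus = 0.3, 2

    v = [1.0 + 0j] * 3
    for _ in range(20):                        # backward/forward sweep iterations
        # Backward sweep: node injection currents I = (S/V)*, then branch currents.
        i_node = [(s_load[k] / v[k]).conjugate() for k in range(3)]
        i_node[dg_bus] -= i_dg_mag * cmath.exp(1j * cmath.phase(v[dg_bus]))
        i_branch = [i_node[1] + i_node[2], i_node[2]]
        # Forward sweep: update voltages from the slack bus downstream.
        v[1] = v[0] - z[0] * i_branch[0]
        v[2] = v[1] - z[1] * i_branch[1]

    print([round(abs(vk), 4) for vk in v])     # voltage magnitudes along the feeder
    ```

    Because the DG contributes a fixed-magnitude current rather than a fixed power, the same equivalent model can be reused unchanged when driving the fault currents in a short-circuit calculation, which is the unification the paper exploits.
    
    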

  18. Regulatory Improvements for Effective Integration of Distributed Generation into Electricity Distribution Networks

    International Nuclear Information System (INIS)

    Scheepers, M.J.J.; Jansen, J.C.; De Joode, J.; Bauknecht, D.; Gomez, T.; Pudjianto, D.; Strbac, G.; Ropenus, S.

    2007-11-01

    The growth of distributed electricity supply of renewable energy sources (RES-E) and combined heat and power (CHP) - so-called distributed generation (DG) - can cause technical problems for electricity distribution networks. These integration problems can be overcome by reinforcing the network. Many European Member States apply network regulation that does not account for the impact of DG growth on network costs. Passing on network integration costs to the DG operator who is responsible for these extra costs may result in discrimination between different DG plants and between DG and large power generation. Therefore, in many regulatory systems distribution system operators (DSOs) are not compensated for the DG integration costs. The DG-GRID project analysed technical and economic barriers to the integration of distributed generation into electricity distribution networks. The project looked into the impact of high DG deployment on electricity distribution system costs and the impact on the financial position of the DSO. Several ways of improving network regulation in order to compensate DSOs for the increasing DG penetration were identified and tested. The DG-GRID project also looked into stimulating network innovations through economic regulation. The project was co-financed by the European Commission and carried out by nine European universities and research institutes. This report summarises the project results and is based on a number of DG-GRID reports that describe the conducted analyses and their results.

  19. The NIF DISCO Framework: facilitating automated integration of neuroscience content on the web.

    Science.gov (United States)

    Marenco, Luis; Wang, Rixin; Shepherd, Gordon M; Miller, Perry L

    2010-06-01

    This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination. DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are "harvested" on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource's content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) "LinkOut" to a resource's data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource's lexicon and ontology, 5) sharing a resource's database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research.
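    The harvest cycle described above might be sketched as follows; the resource names, file contents, and the fetch_file helper are illustrative stand-ins, not the actual NIF/DISCO interfaces:

    ```python
    import json

    # Sketch of a DISCO-style harvest: each participating resource maintains a
    # small description file locally; a central server fetches the files on a
    # schedule and updates its registry only when a file's content has changed.
    # fetch_file() stands in for an HTTP GET against the resource's own site.

    RESOURCE_FILES = {  # resource name -> locally maintained description file
        "NeuronDB": {"description": "Neuronal membrane properties",
                     "capabilities": ["LinkOut", "lexicon"]},
        "ModelDB":  {"description": "Computational neuroscience models",
                     "capabilities": ["RSS"]},
    }

    def fetch_file(resource):
        return json.dumps(RESOURCE_FILES[resource])

    def harvest(registry):
        """One harvest pass: pull every resource file, record what changed."""
        changed = []
        for name in RESOURCE_FILES:
            raw = fetch_file(name)
            if registry.get(name) != raw:   # update central copy only on change
                registry[name] = raw
                changed.append(name)
        return changed

    registry = {}
    print(harvest(registry))   # first pass registers every resource
    print(harvest(registry))   # second pass: nothing changed, nothing updated
    ```

    The design point this mirrors is that resource developers only edit files they host themselves; central NIF capabilities track those edits automatically at the next harvest.
    
    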

  20. Distribution Grid Integration Costs Under High PV Penetrations Workshop |

    Science.gov (United States)

    Workshop panels addressed the utility business model and structure (policies and regulations, revenue requirements, and investment practices) and future directions in grid integration cost-benefit analysis, including determining distribution grid integration costs and incorporating them into utility planning. All speakers were asked to include their opinions on future needs.

  1. An integrated drug prescription and distribution system: challenges and opportunities.

    Science.gov (United States)

    Lanssiers, R; Everaert, E; De Win, M; Van De Velde, R; De Clercq, H

    2002-01-01

    Using the hospital's drug prescription and distribution system as a guide, benefits and drawbacks of a medical activity management system that is tightly integrated with the supply chain management of a hospital will be discussed from the point of view of various participating healthcare actors.

  2. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  3. National Space Science Data Center data archive and distribution service (NDADS) automated retrieval mail system user's guide

    Science.gov (United States)

    Perry, Charleen M.; Vansteenberg, Michael E.

    1992-01-01

    The National Space Science Data Center (NSSDC) has developed an automated data retrieval request service utilizing our Data Archive and Distribution Service (NDADS) computer system. NDADS currently has selected project data written to optical disk platters with the disks residing in a robotic 'jukebox' near-line environment. This allows for rapid and automated access to the data with no staff intervention required. There are also automated help information and user services available that can be accessed. The request system permits an average-size data request to be completed within minutes of the request being sent to NSSDC. A mail message, in the format described in this document, retrieves the data and can send it to a remote site. Also listed in this document are the data currently available.

  4. Future power plant control integrates process and substation automation into one system; Zukunftsorientierte Kraftwerksleittechnik vereint Prozess- und Stationsautomatisierung

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany). Div. Energietechnik-Systeme

    2007-07-01

    The new IEC 61850 standard has been established for substation control systems. In future, IEC 61850 may also be widely used for electrical systems in power plants. IEC 61850 simplifies the integration of process and substation control systems in power plants by creating one automated system across manufacturers and thus makes a significant contribution to cost efficiency in operation and maintenance. (orig.)

  5. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00291854; The ATLAS collaboration; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment specific used resources and physical distributed computing capabilities. Being in production during LHC Runl AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computin...

  6. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  7. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  8. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  9. System Performance of an Integrated Airborne Spacing Algorithm with Ground Automation

    Science.gov (United States)

    Swieringa, Kurt A.; Wilson, Sara R.; Baxley, Brian T.

    2016-01-01

    The National Aeronautics and Space Administration's (NASA's) first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the Terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools to enable precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise spacing behind another aircraft. Recent simulations and IM algorithm development at NASA have focused on trajectory-based IM operations where aircraft equipped with IM avionics are expected to achieve a spacing goal, assigned by air traffic controllers, at the final approach fix. The recently published IM Minimum Operational Performance Standards describe five types of IM operations. This paper discusses the results and conclusions of a human-in-the-loop simulation that investigated three of those IM operations. The results presented in this paper focus on system performance and integration metrics. Overall, the IM operations conducted in this simulation integrated well with ground-based decision support tools, and certain types of IM operations were able to provide improved spacing precision at the final approach fix; however, some issues were identified that should be addressed prior to implementing IM procedures into real-world operations.
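    A heavily simplified sketch of the spacing idea follows; the gain, the speed limits, and the function name are assumptions for illustration, not the ATD-1 algorithm:

    ```python
    # Toy interval-management speed law: compare ownship's predicted time at the
    # final approach fix against the lead aircraft's time plus the assigned
    # spacing goal, then nudge speed to null the error, within a +/-10% envelope.

    def speed_command(own_eta_s, lead_eta_s, spacing_goal_s, current_speed_kt, gain=0.5):
        # Positive error: ownship arrives too late (gap too wide) -> speed up.
        spacing_error_s = own_eta_s - (lead_eta_s + spacing_goal_s)
        cmd = current_speed_kt + gain * spacing_error_s
        # Clamp to an operationally plausible band around the current speed.
        return max(min(cmd, current_speed_kt * 1.1), current_speed_kt * 0.9)

    # Ownship 10 s early on a 120 s spacing goal -> a small slow-down is commanded.
    print(speed_command(own_eta_s=1210, lead_eta_s=1100, spacing_goal_s=120,
                        current_speed_kt=240))
    ```
    
    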

  10. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is picked up and a methodology and algorithms for automated design of intelligent integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation in large volumes and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform renders so-called self-x sensor systems, where self-x stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  11. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    Science.gov (United States)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment specific used resources and physical distributed computing capabilities. Being in production during LHC Runl AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and data structures used by Distributed Computing applications and services are continuously evolving and trend to fit newer requirements from ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities, related to integration of new technologies recently become widely used in ATLAS Computing, like flexible computing utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocols declaration required for PandDA Pilot site movers and others. The improvements of information model and general updates are also shown, in particular we explain how other collaborations outside ATLAS could benefit the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
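    The catalogue role AGIS plays as an intermediate layer can be illustrated with a toy mapping; the structures and names below are invented for illustration and do not reflect the real AGIS schema or APIs:

    ```python
    # Toy model of an information-system layer that sits between experiment
    # software and harvested infrastructure data: physical resources (as might
    # come from BDII/GOCDB-like sources) on one side, experiment-specific names
    # layered on top, and a resolver translating between the two views.

    physical = {
        "CERN-PROD": {"storage": "EOSATLAS", "queues": ["grid_long", "grid_short"]},
        "BNL-ATLAS": {"storage": "dCache",   "queues": ["analysis"]},
    }

    experiment_view = {  # experiment-level queue names mapped onto sites
        "ANALY_CERN": {"site": "CERN-PROD", "queue": "grid_short"},
        "ANALY_BNL":  {"site": "BNL-ATLAS", "queue": "analysis"},
    }

    def resolve(panda_queue):
        """Translate an experiment queue name into its physical endpoint."""
        entry = experiment_view[panda_queue]
        site = physical[entry["site"]]
        assert entry["queue"] in site["queues"]   # consistency check on the mapping
        return {"queue": entry["queue"], "storage": site["storage"]}

    print(resolve("ANALY_CERN"))
    ```

    Decoupling the two dictionaries is the point: the experiment view can evolve (new Cloud or HPC entries, renamed queues) without clients ever touching the harvested physical data directly.
    
    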

  12. Integration of renewable generation and elastic loads into distribution grids

    CERN Document Server

    Ardakanian, Omid; Rosenberg, Catherine

    2016-01-01

    This brief examines the challenges of integrating distributed energy resources and high-power elastic loads into low-voltage distribution grids, as well as the potential for pervasive measurement. It explores the control needed to address these challenges and achieve various system-level and user-level objectives. A mathematical framework is presented for the joint control of active end-nodes at scale, and extensive numerical simulations demonstrate that proper control of active end-nodes can significantly enhance reliable and economical operation of the power grid.

  13. A distributed substation automation model based on the multi-agents technology; Um modelo distribuido de automacao de subestacoes baseado em tecnologia multiagentes

    Energy Technology Data Exchange (ETDEWEB)

    Geus, Klaus de; Milsztajn, Flavio; Kolb, Carlos Jose Johann; Dometerco, Jose Henrique; Souza, Alexandre Mendonca de; Braga, Ciro de Carvalho; Parolin, Emerson Luis; Frisch, Arlenio Carneiro; Fortunato Junior, Luiz Kiss; Erzinger Junior, Augusto; Jonack, Marco Antonio; Guiera, Anderson Juliano Azambuja [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)]. E-mail: klaus@copel.com; flaviomil@copel.com; kolb@copel.com; dometerc@copel.com; alexandre.mendonca@copel.com; ciro@copel.com; parolin@copel.com; arlenio@copel.com; luiz.kiss@copel.com; aerzinger@copel.com; jonack@copel.com; guiera@copel.com

    2006-10-15

    The main purpose of this paper is to analyse distributed computing technology which can be used in substation automation systems. Based on comparative performance results obtained in the laboratory, a specific model for distributed substation automation is proposed, considering the current model employed at COPEL - Companhia Paranaense de Energia. The proposed model is based on the multi-agents technology, which has lately received special attention in the development of distributed systems with local intelligence. (author)

  14. Developing an automated water emitting-sensing system, based on integral tensiometers placed in homogenous environment.

    Science.gov (United States)

    Dabach, Sharon; Shani, Uri

    2010-05-01

    As the population grows, irrigated agriculture is using more water and fertilizers to supply the growing food demand. However, the uptake by various plants is only 30 to 50% of the water applied. The remaining water flows to surface water and groundwater and causes their contamination by fertilizers or other toxins such as herbicides or pesticides. To improve the water use efficiency of crops and decrease the drainage below the root zone, irrigation water should be applied according to the plant demand. The aim of this work is to develop an automated irrigation system based on real-time feedback from an inexpensive and reliable integrated sensing system. This system will supply water to plants according to their demand, without any user interference during the entire growth season. To achieve this goal a sensor (Geo-Tensiometer) was designed and tested. This sensor has better contact with the surrounding soil, is more reliable and much cheaper than the ceramic cup tensiometer. A lysimeter experiment was conducted to evaluate a subsurface drip irrigation regime based on the Geo-Tensiometer and compare it to a daily irrigation regime. All of the drippers were wrapped in Geo-textile. By integrating the Geo-Tensiometer within the Geo-textile which surrounds the drippers, we created a homogeneous medium in the entire lysimeter in which the reading of the matric potential takes place. This medium, whose properties are set and known to us, encourages root growth. Root density in this medium is very high; therefore most of the plant water uptake is from this area. The irrigation system in treatment A irrigated when the matric potential reached a threshold which was set every morning automatically by the system. The daily treatment included a single irrigation each morning that was set to return 120% of the evapotranspiration of the previous day. All Geo-Tensiometers were connected to an automated washing system that flushed air trapped in the Geo
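    A minimal sketch of the threshold-based regime described for treatment A, with hysteresis so the valve closes once the medium rewets; the thresholds, readings, and names are illustrative assumptions, not the experiment's actual setpoints:

    ```python
    # Feedback irrigation on matric potential (kPa; more negative = drier soil):
    # open the valve when the tensiometer reading crosses the dry threshold,
    # close it again once the root-zone medium has rewetted.

    THRESHOLD_KPA = -25.0   # start irrigating when drier than this
    REWET_KPA = -10.0       # stop once rewetted to this level

    def control_step(matric_potential_kpa, valve_open):
        if not valve_open and matric_potential_kpa <= THRESHOLD_KPA:
            return True     # plant demand detected: start irrigating
        if valve_open and matric_potential_kpa >= REWET_KPA:
            return False    # medium rewetted: stop
        return valve_open   # otherwise keep the current state (hysteresis)

    readings = [-8.0, -18.0, -26.0, -20.0, -9.0, -12.0]  # simulated sensor trace
    valve = False
    for r in readings:
        valve = control_step(r, valve)
        print(r, "open" if valve else "closed")
    ```

    The two-threshold hysteresis is what lets such a loop run a whole season without user interference: a single threshold would chatter the valve open and shut around the setpoint.
    
    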

  15. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  16. Distributional, differential and integral problems: Equivalence and existence results

    Czech Academy of Sciences Publication Activity Database

    Monteiro, Giselle Antunes; Satco, B. R.

    2017-01-01

    Roč. 2017, č. 7 (2017), s. 1-26 ISSN 1417-3875 Institutional support: RVO:67985840 Keywords: derivative with respect to functions * distribution * Kurzweil-Stieltjes integral Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.926, year: 2016 http://www.math.u-szeged.hu/ejqtde/periodica.html?periodica=1&paramtipus_ertek=publication&param_ertek=4753

  17. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) - it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  18. Developing an Integration Infrastructure for Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Zinnecker, Alicia; Aretskin-Hariton, Eliot; Kratz, Jonathan

    2014-01-01

    Turbine engine control technology is poised to make the first revolutionary leap forward since the advent of full authority digital engine control in the mid-1980s. This change aims squarely at overcoming the physical constraints that have historically limited control system hardware on aero-engines to a federated architecture. Distributed control architecture allows complex analog interfaces existing between system elements and the control unit to be replaced by standardized digital interfaces. Embedded processing, enabled by high temperature electronics, provides for digitization of signals at the source and network communications resulting in a modular system at the hardware level. While this scheme simplifies the physical integration of the system, its complexity appears in other ways. In fact, integration now becomes a shared responsibility among suppliers and system integrators. While these are the most obvious changes, there are additional concerns about performance, reliability, and failure modes due to distributed architecture that warrant detailed study. This paper describes the development of a new facility intended to address the many challenges of the underlying technologies of distributed control. The facility is capable of performing both simulation and hardware studies ranging from component to system level complexity. Its modular and hierarchical structure allows the user to focus their interaction on specific areas of interest.

  19. Control strategies for power distribution networks with electric vehicles integration

    DEFF Research Database (Denmark)

    Hu, Junjie

    of electrical energy. A smart grid can also be defined as an electricity network that can intelligently integrate the actions of all users connected to it - generators, consumers and those that do both - in order to efficiently deliver sustainable, economic and secure electricity supplies. This thesis focuses...... of the market. To build a complete solution for integration of EVs into the distribution network, a price coordinated hierarchical scheduling system is proposed which can well characterize the involved actors in the smart grid. With this system, we demonstrate that it is possible to schedule the charging......Demand side resources, like electric vehicles (EVs), can become integral parts of a smart grid because instead of just consuming power they are capable of providing valuable services to power systems. EVs can be used to balance the intermittent renewable energy resources such as wind and solar...
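The price-coordinated charging idea can be illustrated with a greedy toy scheduler that fills the cheapest hours first (a hypothetical sketch; `schedule_charging` and its parameters are invented for illustration, not taken from the thesis):

```python
def schedule_charging(prices, energy_needed, max_per_hour):
    """Greedy price-based coordination sketch: charge in the cheapest hours
    first, up to the charger limit, until the required energy is delivered.

    prices: electricity price per hour; returns energy charged per hour.
    """
    plan = [0.0] * len(prices)
    remaining = energy_needed
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        plan[hour] = min(max_per_hour, remaining)
        remaining -= plan[hour]
    return plan
```

A real hierarchical scheduler would also respect grid constraints and the actors' roles, which this sketch ignores.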

  20. Using geospatial solutions to meet distribution integrity management requirements

    Energy Technology Data Exchange (ETDEWEB)

    McElroy, Robert A. [New Century Software, Inc., Fort Collins, CO (United States)

    2010-07-01

    In the United States, incidents on gas distribution pipelines kill on average 10 persons per year in addition to causing 40 serious injuries and millions of dollars of property damage. To remedy this situation, the US Department of Transportation's Pipeline and Hazardous Materials Safety Administration enacted new regulations requiring operators to develop distribution integrity management programs (DIMP) which must include: knowledge and identification of threats, evaluation of risk, identification and implementation of measures to address risks, performance measurement, periodic evaluation and improvement, and results reporting. The aim of this paper is to show how geographic information systems (GIS) can help operators meet each requirement of the DIMP regulations. This discussion showed that GIS can help in identifying and quantifying the threats to the distribution system and in assessing the consequences of an incident. Investing in GIS will not only help operators in complying with the regulations but will also help them make economically sound, risk-based decisions.
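The threat-identification and risk-evaluation steps of a DIMP program are often reduced to a likelihood-times-consequence score; a minimal sketch, with an invented function and arbitrary score scales (illustrative only, not from the paper):

```python
def rank_threats(threats):
    """Rank pipeline threats by a simple risk score.

    threats: list of (name, likelihood, consequence) tuples.
    Returns threat names sorted by descending likelihood * consequence.
    """
    return [name for name, likelihood, consequence in
            sorted(threats, key=lambda t: t[1] * t[2], reverse=True)]
```

GIS adds the spatial dimension to this scoring, e.g. weighting consequence by proximity to occupied structures.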

  1. Can pilots still fly? Role distribution and hybrid interaction in advanced automated aircraft

    OpenAIRE

    Weyer, Johannes

    2015-01-01

    Recent accidents of commercial airplanes have raised the question once more whether pilots can rely on automation in order to fly advanced aircraft safely. Although the issue of human-machine interaction in aviation has been investigated frequently, profound knowledge about pilots’ perceptions and attitudes is fragmentary and partly out-dated. The paper at hand presents the results of a pilot survey, which has been guided by a collaborative perspective of human-automation decision-making. It ...

  2. Integrated microreactor for enzymatic reaction automation: An easy step toward the quality control of monoclonal antibodies.

    Science.gov (United States)

    Ladner, Yoann; Mas, Silvia; Coussot, Gaelle; Bartley, Killian; Montels, Jérôme; Morel, Jacques; Perrin, Catherine

    2017-12-15

    The main purpose of the present work is to provide a fully integrated miniaturized electrophoretic methodology in order to facilitate the quality control of monoclonal antibodies (mAbs). This methodology, called D-PES (Diffusion-mediated Proteolysis combined with an Electrophoretic Separation), makes it possible to perform mAb tryptic digestion followed by electrophoretic separation of the proteolysis products in an automated manner. Tryptic digestion conditions were optimized regarding the influence of enzyme concentration and incubation time in order to achieve enzymatic digestion efficiency similar to that obtained with the classical (off-line) methodology. Then, the electrophoretic separation conditions, namely the nature of the background electrolyte (BGE), ionic strength and pH, were optimized. Successful and repeatable electrophoretic profiles of three mAb digests (Trastuzumab, Infliximab and Tocilizumab), comparable to the off-line digestion profiles, were obtained, demonstrating the feasibility and robustness of the proposed methodology. In summary, the proposed and optimized in-line approach opens a new, fast and easy way for the quality control of mAbs. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Microseismic event location using global optimization algorithms: An integrated and automated workflow

    Science.gov (United States)

    Lagos, Soledad R.; Velis, Danilo R.

    2018-02-01

    We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the different steps that lead to the microseismic events location from raw 3C data. First, we carry out the automatic detection, denoising and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for 2D and 3D usual scenarios of hydraulic fracturing processes. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
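For intuition, the classical grid search (GS) the authors benchmark against can be sketched as minimizing an arrival-time misfit over a 2D grid (a toy with a homogeneous velocity model and a known origin time; real workflows use layered velocity models, backazimuth constraints, and unknown origin times):

```python
import itertools
import math

def locate_grid_search(receivers, arrivals, velocity, xs, ys, t0=0.0):
    """Toy 2D grid search: pick the grid node minimizing the squared misfit
    between observed and predicted P-wave arrival times, assuming a
    homogeneous velocity model and a known origin time t0."""
    best, best_cost = None, float("inf")
    for x, y in itertools.product(xs, ys):
        cost = sum((t0 + math.hypot(x - rx, y - ry) / velocity - t) ** 2
                   for (rx, ry), t in zip(receivers, arrivals))
        if cost < best_cost:
            best, best_cost = (x, y), cost
    return best
```

VFSA and PSO replace the exhaustive node loop with stochastic sampling of the same misfit function, which is where the reported speed-ups come from.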

  4. Design and implementation of a highly integrated and automated in situ bioremediation system for petroleum hydrocarbons

    International Nuclear Information System (INIS)

    Dey, J.C.; Rosenwinkel, P.; Norris, R.D.

    1996-01-01

    The proposed sale of an industrial property required that an environmental investigation be conducted as part of the property transfer agreement. The investigation revealed petroleum hydrocarbon compounds (PHCs) in the subsurface. Light nonaqueous phase liquids (LNAPLs) varsol (a gasoline-like solvent), gasoline, and fuel oil were found across a three (3) acre area and were present as liquid phase PHCs, as dissolved phase PHCs, and as adsorbed phase PHCs in both saturated and unsaturated soils. Fuel oil was largely present in the unsaturated soils. Varsol represented the majority of the PHCs present. The presence of liquid phase PHCs suggested that any remedial action incorporate free phase recovery. The volatility of varsol and gasoline and the biodegradability of the PHCs present in the subsurface suggested that bioremediation, air sparging, and soil vapor extraction/bioventing were appropriate technologies for incorporation in a remedy. The imminent conversion of the impacted area to a retail facility required that any long term remedy be unobtrusive and require minimum activity across much of the impacted area. In the following sections the site investigation, selection and testing of remedial technologies, and design and implementation of an integrated and automated remedial system are discussed

  5. Automated integration of genomic physical mapping data via parallel simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, EcoRI restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, BAC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps on the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
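The column-rearrangement objective described above (minimize gaps across the object rows) can be attacked with a small single-machine simulated annealing sketch (a toy stand-in for the parallel server/client version in the abstract; the cooling schedule and swap move set are assumptions):

```python
import math
import random

def gap_count(matrix, order):
    """Objective: for each object row, count runs of 1s beyond the first.
    A gap-free row has all of its member clones contiguous in the ordering."""
    total = 0
    for row in matrix:
        runs, prev = 0, 0
        for j in order:
            if row[j] and not prev:
                runs += 1
            prev = row[j]
        total += max(runs - 1, 0)
    return total

def anneal_order(matrix, steps=20000, t0=2.0, seed=0):
    """Toy simulated annealing over column orderings: propose random column
    swaps, accept via the Metropolis rule, keep the best ordering seen."""
    rng = random.Random(seed)
    n = len(matrix[0])
    order = list(range(n))
    cost = gap_count(matrix, order)
    best, best_cost = order[:], cost
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9  # linear cooling
        i, j = rng.randrange(n), rng.randrange(n)
        order[i], order[j] = order[j], order[i]
        new = gap_count(matrix, order)
        if new <= cost or rng.random() < math.exp((cost - new) / temp):
            cost = new
            if cost < best_cost:
                best, best_cost = order[:], cost
        else:
            order[i], order[j] = order[j], order[i]  # revert the swap
    return best, best_cost
```

The real system additionally enforces FISH-derived ordering constraints on the moves and evaluates candidates across 40+ machines in parallel.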

  6. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at approximately 60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at approximately 70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with accuracy of 98%.

  7. Integrating distributed generation: Regulation and trends in three leading countries

    International Nuclear Information System (INIS)

    Anaya, Karim L.; Pollitt, Michael G.

    2015-01-01

    This paper explores the trends in the deployment and integration of distributed generation in Germany, Denmark and Sweden. The study concentrates on the regulation of renewable energy generation with a focus on grid access and connection mechanisms. The high rate of distributed generation penetration is mainly based on the early support that these countries gave to the expansion of renewable energy generation – mainly wind and solar – within their respective national policies. Germany and Denmark are the ones with the most sophisticated support schemes, which have shown a dynamic design over time. In terms of connections, Germany has the most favorable connection regime, which provides not only priority connection but also priority grid access for generation units that produce electricity from renewable energy sources. Sweden guarantees equal treatment among different technologies (i.e. a non-discrimination principle). High connection costs have been observed especially in Germany and Denmark. The costs of network upgrades are usually socialised across demand customers. However, integration issues should be taken into consideration in order to avoid expansion of distributed generation in a way which unnecessarily raises total system costs, via high connection costs. -- Highlights: •Examination of the DG connection arrangements in Denmark, Germany and Sweden. •Sophisticated subsidy schemes for DG contrast with socialization of connection costs. •No evidence of novel business models for connecting DG units smartly

  8. GENIUS : An integrated environment for supporting the design of generic automated negotiators

    NARCIS (Netherlands)

    Lin, R.; Kraus, S.; Baarslag, T.; Tykhonov, D.; Hindriks, K.; Jonker, C.M.

    2012-01-01

    The design of automated negotiators has been the focus of abundant research in recent years. However, due to difficulties involved in creating generalized agents that can negotiate in several domains and against human counterparts, many automated negotiators are domain specific and their behavior

  9. Consistent integrated automation. Optimized power plant control by means of IEC 61850; Durchgaengig automatisieren. Optimierte Kraftwerksleittechnik durch die Norm IEC 61850

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany). Geschaeftsbereich Power Generation

    2007-07-01

    Today's power plants are highly automated. All subsystems of large thermal power plants can be controlled from a central control room. The electrical systems are an important part. In future, the new standard IEC 61850 will improve the integration of electrical systems into power plant automation, supporting the reduction of operation and maintenance costs. (orig.)

  10. A Federated Enterprise Architecture and MBSE Modeling Framework for Integrating Design Automation into a Global PLM Approach

    OpenAIRE

    Vosgien, Thomas; Rigger, Eugen; Schwarz, Martin; Shea, Kristina

    2017-01-01

    Part 1: PLM Maturity, Implementation and Adoption; International audience; PLM and Design Automation (DA) are two interdependent and necessary approaches to increase the performance and efficiency of product development processes. Often, DA systems’ usability suffers due to a lack of integration in industrial business environments stemming from the independent consideration of PLM and DA. This article proposes a methodological and modeling framework for developing and deploying DA solutions w...

  11. Distributed Energy Resources and Dynamic Microgrid: An Integrated Assessment

    Science.gov (United States)

    Shang, Duo Rick

    The overall goal of this thesis is to improve understanding of the benefits of DERs to both utilities and electricity end-users when integrated into the power distribution system. To achieve this goal, a series of two studies was conducted to assess the value of DERs when integrated with new power paradigms. First, the arbitrage value of DERs was examined in markets with time-variant electricity pricing rates (e.g., time of use, real time pricing) under a smart grid distribution paradigm. This study uses a stochastic optimization model to estimate the potential profit from electricity price arbitrage over a five-year period. The optimization process involves two types of PHEVs (PHEV-10 and PHEV-40) under three scenarios with different assumptions on technology performance, the electricity market and PHEV owner types. The simulation results indicate that expected arbitrage profit is not a viable option to engage PHEVs in dispatching and in providing ancillary services without more favorable policy and PHEV battery technologies. Subsidy or a change in electricity tariff or both are needed. Second, it examined the concept of the dynamic microgrid as a measure to improve distribution resilience, and estimated the prices of this emerging service. An economic load dispatch (ELD) model is developed to estimate the market-clearing price in a hypothetical community with a single-bid auction electricity market. The results show that the electricity market clearing price on the dynamic microgrid is predominantly decided by the power output and cost of electricity of each type of DG. In circumstances where CHP is the only source, the electricity market clearing price in the island is even cheaper than the on-grid electricity price at normal times. Integration of PHEVs in the dynamic microgrid will increase electricity market clearing prices. This demonstrates that the dynamic microgrid is an economically viable alternative to enhance grid resilience.
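The single-bid auction clearing used in the dynamic-microgrid study follows the usual merit-order rule: stack offers by ascending price and let the marginal generator set the price. A minimal sketch (the offer data and function name are illustrative, not from the thesis):

```python
def clearing_price(offers, demand):
    """Merit-order market clearing: sort offers by ascending price and
    accept them until demand is met; the marginal offer sets the price.

    offers: list of (price, quantity) tuples.
    """
    supplied = 0.0
    for price, quantity in sorted(offers):
        supplied += quantity
        if supplied >= demand:
            return price
    raise ValueError("insufficient supply to clear the market")
```

This is why adding PHEV load raises the clearing price: higher demand pushes the margin onto more expensive units.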

  12. An Innovative Reconfigurable Integrated Converter Topology Suitable for Distributed Generation

    Directory of Open Access Journals (Sweden)

    Renato Rizzo

    2012-09-01

    Full Text Available The electricity market and environmental concerns, with wide utilization of renewable sources, have increased the diffusion of distributed generation units, changing the operation of distribution grids from passive networks to microgrids. A microgrid includes a cluster of electrical loads, energy storage devices and microsources, which provide both power and heat to their local area. A microgrid usually has one connection point to the utility grid through power electronic converters placed at customers’ sites. This paper analyses a Reconfigurable Integrated Converter (RIC) used for a domestic microgrid with inputs from the AC mains and photovoltaic arrays, and two DC outputs at different voltage levels. A RIC as a dual-boost DC-DC converter is proposed, modelled and analysed in the paper. The advantages of such a topology in comparison with traditional boost converters are outlined. Reported simulation results give evidence of the controllability of this converter and the capability of achieving the desired voltage outputs with reduced ripple.
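For reference, each boost stage in such a dual-boost converter targets the ideal continuous-conduction steady-state gain Vout = Vin / (1 − D); a small sketch (idealized textbook relation, ignoring losses and the RIC's reconfiguration logic):

```python
def boost_output(v_in, duty):
    """Ideal boost converter steady-state output in continuous conduction:
    Vout = Vin / (1 - D), where D is the duty cycle in [0, 1)."""
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    return v_in / (1.0 - duty)
```

Two stages driven at different duty cycles give the two DC output levels the abstract mentions.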

  13. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML) as a framework and supporting the configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  14. Reliability assessment of distribution system with the integration of renewable distributed generation

    International Nuclear Information System (INIS)

    Adefarati, T.; Bansal, R.C.

    2017-01-01

    Highlights: • Addresses impacts of renewable DG on the reliability of the distribution system. • Multi-objective formulation for maximizing the cost saving with integration of DG. • Uses Markov model to study the stochastic characteristics of the major components. • The investigation is done using modified RBTS bus test distribution system. • Proposed approach is useful for electric utilities to enhance the reliability. - Abstract: Recent studies have shown that renewable energy resources will contribute substantially to future energy generation owing to the rapid depletion of fossil fuels. Wind and solar energy resources are major sources of renewable energy that have the ability to reduce the energy crisis and the greenhouse gases emitted by the conventional power plants. Reliability assessment is one of the key indicators to measure the impact of the renewable distributed generation (DG) units in the distribution networks and to minimize the cost that is associated with power outage. This paper presents a comprehensive reliability assessment of the distribution system that satisfies the consumer load requirements with the penetration of wind turbine generator (WTG), electric storage system (ESS) and photovoltaic (PV). A Markov model is proposed to assess the stochastic characteristics of the major components of the renewable DG resources as well as their influence on the reliability of a conventional distribution system. The results obtained from the case studies have demonstrated the effectiveness of using WTG, ESS and PV to enhance the reliability of the conventional distribution system.
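The two-state Markov component model referenced above yields the familiar steady-state availability MTTF / (MTTF + MTTR); a minimal sketch for a single component and a series chain (an illustration of the modeling idea, not the paper's full multi-state model):

```python
def steady_state_availability(mttf, mttr):
    """Two-state (up/down) Markov component model: steady-state
    availability = MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

def series_availability(components):
    """A series system is up only when every component is up, so the
    availabilities multiply. components: list of (mttf, mttr) tuples."""
    availability = 1.0
    for mttf, mttr in components:
        availability *= steady_state_availability(mttf, mttr)
    return availability
```

Reliability indices such as SAIDI/SAIFI are then derived from component availabilities like these combined over the network topology.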

  15. User-friendly Establishment of Trust in Distributed Home Automation Networks

    DEFF Research Database (Denmark)

    Solberg Hjorth, Theis; Torbensen, Rune; Madsen, Per Printz

    2014-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up...... these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented...

  16. Trust in automation: integrating empirical evidence on factors that influence trust.

    Science.gov (United States)

    Hoff, Kevin Anthony; Bashir, Masooda

    2015-05-01

    We systematically review recent empirical research on factors that influence trust in automation to present a three-layered trust model that synthesizes existing knowledge. Much of the existing research on factors that guide human-automation interaction is centered around trust, a variable that often determines the willingness of human operators to rely on automation. Studies have utilized a variety of different automated systems in diverse experimental paradigms to identify factors that impact operators' trust. We performed a systematic review of empirical research on trust in automation from January 2002 to June 2013. Papers were deemed eligible only if they reported the results of a human-subjects experiment in which humans interacted with an automated system in order to achieve a goal. Additionally, a relationship between trust (or a trust-related behavior) and another variable had to be measured. Altogether, 101 papers, containing 127 eligible studies, were included in the review. Our analysis revealed three layers of variability in human-automation trust (dispositional trust, situational trust, and learned trust), which we organize into a model. We propose design recommendations for creating trustworthy automation and identify environmental conditions that can affect the strength of the relationship between trust and reliance. Future research directions are also discussed for each layer of trust. Our three-layered trust model provides a new lens for conceptualizing the variability of trust in automation. Its structure can be applied to help guide future research and develop training interventions and design procedures that encourage appropriate trust. © 2014, Human Factors and Ergonomics Society.

  17. Integration of Small Solar tower Systems into Distributed Power Islands

    Energy Technology Data Exchange (ETDEWEB)

    Romero, M.; Marcos, M. J.; Tellez, F. M.; Blanco, M.; Fernandez, V.; Baonza, F.; Berger, S. [Ciemat, Madrid (Spain)

    2000-07-01

    One of the short-term priorities for renewable energies in Europe is their integration for local power supply into communities and energy islands (blocks of buildings, new neighborhoods in residential areas, shopping centers, hospitals, recreational areas, eco-parks, small rural areas or isolated ones such as islands or mountain communities). Following this strategy, the integration of small tower fields into so-called MIUS (Modular Integrated Utility Systems) is proposed. This application strongly influences field concepts leading to modular multi-tower systems able to more closely track demand, meet reliability requirements with fewer megawatts of installed power and spread construction costs over time after output has begun. In addition, integration into single-cycle high-efficiency gas turbines plus waste-heat applications clearly increments the solar share. The chief questions are whether solar towers can be redesigned for such distributed markets and the keys to their feasibility. This paper includes the design and performance analysis of a 1.36-MW plant and integration in the MIUS system, as well as the expected cost of electricity and a sensitivity analysis of the small tower plant's performance with design parameters like heliostat configuration and tower height. A practical application is analyzed for a shopping center with 85% power demand during day-time by using a hybrid solar tower and a gas turbine producing electricity and waste heat for hot water and heating and cooling of spaces. The operation mode proposed is covering night demand with power from the grid and solar-gas power island mode during 14 hours daytime with a maximum power production of 1.36 MW. (Author) 26 refs.

  18. Integration of Small Solar Tower Systems Into Distributed Power Islands

    International Nuclear Information System (INIS)

    Romero, M.; Marcos, M. J.; Tellez, F. M.; Blanco, M.; Fernandez, V.; Baonza, F.; Berger, S.

    1999-01-01

    One of the short-term priorities for renewable energies in Europe is their integration for local power supply into communities and energy islands (blocks of buildings, new neighborhoods in residential areas, shopping centers, hospitals, recreational areas, eco-parks, small rural areas or isolated ones such as islands or mountain communities). Following this strategy, the integration of small tower fields into so-called MIUS (Modular Integrated Utility Systems) is proposed. This application strongly influences field concepts leading to modular multi-tower systems able to more closely track demand, meet reliability requirements with fewer megawatts of installed power and spread construction costs over time after output has begun. In addition, integration into single-cycle high-efficiency gas turbines plus waste-heat applications clearly increments the solar share. The chief questions are whether solar towers can be redesigned for such distributed markets and the keys to their feasibility. This paper includes the design and performance analysis of a 1.36-MW plant and integration in the MIUS system, as well as the expected cost of electricity and a sensitivity analysis of the small tower plant's performance with design parameters like heliostat configuration and tower height. A practical application is analyzed for a shopping center with 85% power demand during day-time by using a hybrid solar tower and a gas turbine producing electricity and waste heat for hot water and heating and cooling of spaces. The operation mode proposed is covering night demand with power from the grid and solar-gas power island mode during 14 hours daytime with a maximum power production of 1.36 MW. (Author) 26 refs

  19. Optimal Solar PV Arrays Integration for Distributed Generation

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Li, Xueping [University of Tennessee, Knoxville (UTK)

    2012-01-01

    Solar photovoltaic (PV) systems hold great potential for distributed energy generation by installing PV panels on rooftops of residential and commercial buildings. Yet challenges arise along with the variability and non-dispatchability of the PV systems that affect the stability of the grid and the economics of the PV system. This paper investigates the integration of PV arrays for distributed generation applications by identifying a combination of buildings that will maximize solar energy output and minimize system variability. Particularly, we propose mean-variance optimization models to choose suitable rooftops for PV integration based on Markowitz mean-variance portfolio selection model. We further introduce quantity and cardinality constraints to result in a mixed integer quadratic programming problem. Case studies based on real data are presented. An efficient frontier is obtained for sample data that allows decision makers to choose a desired solar energy generation level with a comfortable variability tolerance level. Sensitivity analysis is conducted to show the tradeoffs between solar PV energy generation potential and variability.
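The rooftop-selection idea above can be sketched as a small mean-variance program: minimize the variance of combined PV output subject to a floor on expected yield, with capacity shares that are non-negative and sum to one. All figures below (rooftop yields, target level) are hypothetical illustrations, not data from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical monthly energy yields (kWh) for four candidate rooftops.
yields = np.array([
    [310, 290, 305, 320, 300, 295],   # rooftop A: steady
    [250, 340, 260, 330, 255, 345],   # rooftop B: variable
    [400, 150, 390, 160, 410, 155],   # rooftop C: highly variable
    [280, 285, 290, 275, 288, 282],   # rooftop D: steady
], dtype=float)

mu = yields.mean(axis=1)   # expected output per rooftop
cov = np.cov(yields)       # covariance of outputs across rooftops

target = 290.0             # desired expected energy level (illustrative)

def portfolio_variance(w):
    return w @ cov @ w

# Markowitz-style selection: minimize variance subject to a floor on
# expected output, full allocation, and non-negative capacity shares.
res = minimize(
    portfolio_variance,
    x0=np.full(4, 0.25),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 4,
    constraints=[
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: w @ mu - target},
    ],
)
w = res.x
```

The paper's quantity and cardinality constraints would turn this continuous sketch into the mixed integer quadratic program it describes; sweeping `target` over a range traces out the efficient frontier mentioned in the abstract.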

  20. INTEGRATION OF INFORMATIONAL COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE MAIN FUNCTIONS OF THE TECHNICAL CONTROL DEPARTMENT

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2010-01-01

    Full Text Available It is shown that automation of some functions of the technical control department makes it possible to record defects, reclamations and technology failures, and to produce the necessary reporting forms and quality certificates for production.

  1. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    Science.gov (United States)

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculations of spiking solutions and the matrix solutions preparation scheme, the actual spiking and matrix solutions preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of assays of varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
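The curve-fitting step such a platform automates is typically a four-parameter logistic (4PL) regression. The sketch below fits synthetic dose-response data with SciPy; the model form is standard, but the concentrations, responses, and parameter bounds are hypothetical, not the platform's actual settings.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Synthetic 10-point concentration series (nM) and noisy responses.
conc = np.logspace(-1, 3, 10)
rng = np.random.default_rng(0)
resp = four_pl(conc, 5.0, 100.0, 25.0, 1.2) + rng.normal(0.0, 1.0, conc.size)

# Bounds keep ic50 and hill positive so the power term stays real-valued.
popt, _ = curve_fit(
    four_pl, conc, resp,
    p0=[1.0, 90.0, 10.0, 1.0],
    bounds=([0.0, 50.0, 0.1, 0.1], [20.0, 150.0, 500.0, 5.0]),
)
bottom, top, ic50, hill = popt
```

In a full pipeline this fit would run per compound (up to 32 per plate in the platform described), with the recovered `ic50` feeding the reporting step.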

  2. Foundations & principles of distributed manufacturing elements of manufacturing networks, cyber-physical production systems and smart automation

    CERN Document Server

    Kühnle, Hermann

    2015-01-01

    The book presents a coherent description of distributed manufacturing, providing a solid base for further research on the subject as well as smart implementations in companies. It provides a guide for those researching and working in a range of fields, such as smart manufacturing, cloud computing, RFID tracking, distributed automation, cyber physical production and global design anywhere, manufacture anywhere solutions. Foundations & Principles of Distributed Manufacturing anticipates future advances in the fields of embedded systems, the Internet of Things and cyber physical systems, outlining how adopting these innovations could rapidly bring about improvements in key performance indicators, which could in turn generate competition pressure by rendering successful business models obsolete. In laying the groundwork for powerful theoretical models, high standards for the homogeneity and soundness of the suggested setups are applied. The book especially elaborates on the upcoming competition in online manu...

  3. Distribution automation and control support; Analysis and interpretation of DAC working group results for use in project planning

    Science.gov (United States)

    Klock, P.; Evans, D.

    1979-01-01

    The Executive Summary and Proceedings of the Working Group Meeting were analyzed to identify specific projects appropriate for Distribution Automation and Control (DAC) RD&D. Specific projects that should be undertaken in the DAC RD&D program were recommended. The projects are presented under broad categories of work selected based on ESC's interpretation of the results of the Working Group Meeting. Some of the projects are noted as utility industry projects. The ESC recommendations regarding program management are presented. Utility versus Government management responsibilities are noted.

  4. BOA: Framework for Automated Builds

    CERN Document Server

    Ratnikova, N

    2003-01-01

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in large collaborations, where the large number of developers, multiple platforms and distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of the existing versions.

  5. BOA: Framework for automated builds

    International Nuclear Information System (INIS)

    Ratnikova, N.

    2003-01-01

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in large collaborations, where the large number of developers, multiple platforms and distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows users to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of the existing versions

  6. Automated collection of imaging and phenotypic data to centralized and distributed data repositories.

    Science.gov (United States)

    King, Margaret D; Wood, Dylan; Miller, Brittny; Kelly, Ross; Landis, Drew; Courtney, William; Wang, Runtang; Turner, Jessica A; Calhoun, Vince D

    2014-01-01

    Accurate data collection at the ground level is vital to the integrity of neuroimaging research. Similarly important is the ability to connect and curate data in order to make it meaningful and sharable with other investigators. Collecting data, especially with several different modalities, can be time consuming and expensive. These issues have driven the development of automated collection of neuroimaging and clinical assessment data within COINS (Collaborative Informatics and Neuroimaging Suite). COINS is an end-to-end data management system. It provides a comprehensive platform for data collection, management, secure storage, and flexible data retrieval (Bockholt et al., 2010; Scott et al., 2011). It was initially developed for the investigators at the Mind Research Network (MRN), but is now available to neuroimaging institutions worldwide. Self Assessment (SA) is an application embedded in the Assessment Manager (ASMT) tool in COINS. It is an innovative tool that allows participants to fill out assessments via the web-based Participant Portal. It eliminates the need for paper collection and data entry by allowing participants to submit their assessments directly to COINS. Instruments (surveys) are created through ASMT and include many unique question types and associated SA features that can be implemented to help the flow of assessment administration. SA provides an instrument queuing system with an easy-to-use drag-and-drop interface for research staff to set up participants' queues. After a queue has been created for the participant, they can access the Participant Portal via the internet to fill out their assessments. This allows them the flexibility to participate from home, a library, on site, etc. The collected data is stored in a PostgreSQL database at MRN. This data is only accessible by users who have explicit permission to access the data through their COINS user accounts and access to the MRN network. This allows for high volume data collection and

  7. User-friendly establishment of trust in distributed home automation networks

    DEFF Research Database (Denmark)

    Hjorth, Theis Solberg; Madsen, Per Printz; Torbensen, Rune

    2012-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to setup...... these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented...... of predefined pictograms. This method is designed to scale from smart-phones and tablets down to low-resource embedded systems. The presented approach is supported by an extensive literature study, and the ease of use and feasibility of the method has been indicated through a preliminary user study...

  8. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with

  9. Integrating packing and distribution problems and optimization through mathematical programming

    Directory of Open Access Journals (Sweden)

    Fabio Miguel

    2016-06-01

    Full Text Available This paper analyzes the integration of two combinatorial problems that frequently arise in production and distribution systems. One is the Bin Packing Problem (BPP), which involves finding an ordering of some objects of different volumes to be packed into the minimal number of containers of the same or different size. An optimal solution to this NP-Hard problem can be approximated by means of meta-heuristic methods. On the other hand, we consider the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW), which is a variant of the Travelling Salesman Problem (again an NP-Hard problem) with extra constraints. Here we model those two problems in a single framework and use an evolutionary meta-heuristic to solve them jointly. Furthermore, we use data from a real world company as a test-bed for the method introduced here.
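As an illustration of the bin-packing half of the integrated problem, the classic first-fit decreasing heuristic gives a fast approximate solution (the item volumes and bin capacity below are made up for the example):

```python
def first_fit_decreasing(volumes, capacity):
    """Sort items by decreasing volume; place each into the first bin
    with enough remaining room, opening a new bin when none fits."""
    free = []     # remaining capacity of each open bin
    packing = []  # items placed in each bin
    for v in sorted(volumes, reverse=True):
        for i in range(len(free)):
            if v <= free[i]:
                free[i] -= v
                packing[i].append(v)
                break
        else:
            free.append(capacity - v)
            packing.append([v])
    return packing

packing = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
# Two bins suffice here: [8, 2] and [4, 4, 1, 1]
```

An evolutionary meta-heuristic like the paper's would explore orderings beyond the decreasing-volume one while simultaneously scoring the routing side.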

  10. Predict-first experimental analysis using automated and integrated magnetohydrodynamic modeling

    Science.gov (United States)

    Lyons, B. C.; Paz-Soldan, C.; Meneghini, O.; Lao, L. L.; Weisberg, D. B.; Belli, E. A.; Evans, T. E.; Ferraro, N. M.; Snyder, P. B.

    2018-05-01

    An integrated-modeling workflow has been developed for the purpose of performing predict-first analysis of transient-stability experiments. Starting from an existing equilibrium reconstruction from a past experiment, the workflow couples together the EFIT Grad-Shafranov solver [L. Lao et al., Fusion Sci. Technol. 48, 968 (2005)], the EPED model for the pedestal structure [P. B. Snyder et al., Phys. Plasmas 16, 056118 (2009)], and the NEO drift-kinetic-equation solver [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)] (for bootstrap current calculations) in order to generate equilibria with self-consistent pedestal structures as the plasma shape and various scalar parameters (e.g., normalized β, pedestal density, and edge safety factor [q95]) are changed. These equilibria are then analyzed using automated M3D-C1 extended-magnetohydrodynamic modeling [S. C. Jardin et al., Comput. Sci. Discovery 5, 014002 (2012)] to compute the plasma response to three-dimensional magnetic perturbations. This workflow was created in conjunction with a DIII-D experiment examining the effect of triangularity on the 3D plasma response. Several versions of the workflow were developed, and the initial ones were used to help guide experimental planning (e.g., determining the plasma current necessary to maintain the constant edge safety factor in various shapes). Subsequent validation with the experimental results was then used to revise the workflow, ultimately resulting in the complete model presented here. We show that quantitative agreement was achieved between the M3D-C1 plasma response calculated for equilibria generated by the final workflow and equilibria reconstructed from experimental data. A comparison of results from earlier workflows is used to show the importance of properly matching certain experimental parameters in the generated equilibria, including the normalized β, pedestal density, and q95. On the other hand, the details of the pedestal

  11. Integrating Xgrid into the HENP distributed computing model

    International Nuclear Information System (INIS)

    Hajdu, L; Lauret, J; Kocoloski, A; Miller, M

    2008-01-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology

  12. Integration of distributed generation in the power system

    CERN Document Server

    Bollen, Math H J

    2011-01-01

    "The integration of new sources of energy like wind power, solar-power, small-scale generation, or combined heat and power in the power grid is something that impacts a lot of stakeholders: network companies (both distribution and transmission), the owners and operators of the DG units, other end-users of the power grid (including normal consumers like you and me) and not in the least policy makers and regulators. There is a lot of misunderstanding about the impact of DG on the power grid, with one side (including mainly some but certainly not all, network companies) claiming that the lights will go out soon, whereas the other side (including some DG operators and large parks of the general public) claiming that there is nothing to worry about and that it's all a conspiracy of the large production companies that want to protect their own interests and keep the electricity price high. The authors are of the strong opinion that this is NOT the way one should approach such an important subject as the integration...

  13. Integrating Xgrid into the HENP distributed computing model

    Science.gov (United States)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  14. Integrated Production-Distribution Scheduling Problem with Multiple Independent Manufacturers

    Directory of Open Access Journals (Sweden)

    Jianhong Hao

    2015-01-01

    Full Text Available We consider the nonstandard parts supply chain with a public service platform for machinery integration in China. The platform assigns orders placed by a machinery enterprise to multiple independent manufacturers who produce nonstandard parts and makes a production schedule and a batch delivery schedule for each manufacturer in a coordinated manner. Each manufacturer has only one plant with parallel machines and is located far away from the other manufacturers. Orders are first processed at the plants and then directly shipped from the plants to the enterprise in order to be finished before a given deadline. We study the above integrated production-distribution scheduling problem with multiple manufacturers to maximize a weighted sum of the profits of the manufacturers under the constraints that all orders are finished before the deadline and the profit of each manufacturer is not negative. Based on an analysis of the optimality conditions, we formulate the problem as a mixed integer programming model and use CPLEX to solve it.
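A minimal sketch of the assignment decision in such a model, solved here by brute-force enumeration rather than CPLEX; the processing times, profits, weights, and deadline are all hypothetical toy values, and each plant is treated as a single machine rather than the paper's parallel-machine setting.

```python
from itertools import product

orders = ["o1", "o2", "o3"]
# Processing time and profit of each order at each manufacturer (toy data).
proc = {"m1": {"o1": 3, "o2": 2, "o3": 4},
        "m2": {"o1": 2, "o2": 3, "o3": 3}}
profit = {"m1": {"o1": 5, "o2": 4, "o3": 6},
          "m2": {"o1": 4, "o2": 5, "o3": 5}}
weight = {"m1": 1.0, "m2": 1.0}
deadline = 6  # all orders must finish by this time

best, best_val = None, float("-inf")
for assign in product(proc, repeat=len(orders)):
    load = dict.fromkeys(proc, 0)
    gain = dict.fromkeys(proc, 0)
    for o, m in zip(orders, assign):
        load[m] += proc[m][o]
        gain[m] += profit[m][o]
    # Feasibility: deadline met at every plant, no manufacturer loses money.
    if all(load[m] <= deadline for m in proc) and all(g >= 0 for g in gain.values()):
        val = sum(weight[m] * gain[m] for m in proc)
        if val > best_val:
            best, best_val = assign, val
```

The MIP formulation replaces this exponential enumeration with binary assignment variables, machine-level sequencing, and batch delivery decisions.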

  15. Distributional aspects of emissions in climate change integrated assessment models

    International Nuclear Information System (INIS)

    Cantore, Nicola

    2011-01-01

    The recent failure of the Copenhagen negotiations shows that concrete actions are needed to create the conditions for a consensus over global emission reduction policies. A wide coalition of countries in international climate change agreements could be facilitated by the perceived fairness, among rich and poor countries, of the abatement sharing at the international level. In this paper I use two popular climate change integrated assessment models to investigate the path of, and decompose the components and sources of, future inequality in the emissions distribution. Results prove to be consistent with previous empirical studies, are robust to model comparison, and show that gaps in GDP across world regions will still play a crucial role in explaining different countries' contributions to global warming. - Research highlights: → I implement a scenario analysis with two global climate change models. → I analyse inequality in the distribution of emissions. → I decompose emissions inequality components. → I find that GDP per capita is the main Kaya identity source of emissions inequality. → Current rich countries will mostly remain responsible for emissions inequality.
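The Kaya identity named in the highlights factors a region's emissions as population × GDP per capita × energy intensity of GDP × carbon intensity of energy. A toy two-region computation (all figures illustrative, not taken from the assessment models) shows how a GDP-per-capita gap can dominate per-capita emissions inequality:

```python
# Region: (population, GDP per capita, energy/GDP, CO2/energy) -- toy units.
regions = {
    "rich": (1000, 40000, 5e-3, 0.06),
    "poor": (5000, 5000, 8e-3, 0.08),
}

def kaya_emissions(pop, gdp_per_cap, energy_intensity, carbon_intensity):
    """Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)."""
    return pop * gdp_per_cap * energy_intensity * carbon_intensity

emissions = {r: kaya_emissions(*f) for r, f in regions.items()}
per_capita = {r: emissions[r] / regions[r][0] for r in regions}
# Despite higher energy and carbon intensities, the poor region emits far
# less per person: the 8x GDP-per-capita gap dominates the identity.
```

A full decomposition exercise like the paper's would apply an inequality index (e.g. Theil) to such per-capita terms across many regions and over time.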

  16. A Case Study of Reverse Engineering Integrated in an Automated Design Process

    Science.gov (United States)

    Pescaru, R.; Kyratsis, P.; Oancea, G.

    2016-11-01

    This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a consumer product, specifically a footwear-type product that has a complex shape with many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and to the preferences for the chosen model, which leads to the development of customized products.

  17. Integrated automation of the New Waddell Dam performance data acquisition system

    International Nuclear Information System (INIS)

    Welch, L.R.; Fields, P.E.

    1999-01-01

    New Waddell Dam, a key feature of the US Bureau of Reclamation's Central Arizona Project, had elements of its dam safety data acquisition system incorporated into the design and construction. The instrumentation array is a reflection of the dam's large size and foundation complexity. Much of the instrumentation is automated. This automation was accomplished while maintaining independent communication connections to major divisions of the instrument array. Fiber optic cables are used to provide high-quality data, free from voltage surges that could originate in a nearby powerplant switchyard or from lightning. The system has been working well but there are concerns about a lack of continued equipment manufacturer support

  18. Hexicon 2: automated processing of hydrogen-deuterium exchange mass spectrometry data with improved deuteration distribution estimation.

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L; Hamprecht, Fred A; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.

  19. Hexicon 2: Automated Processing of Hydrogen-Deuterium Exchange Mass Spectrometry Data with Improved Deuteration Distribution Estimation

    Science.gov (United States)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L.; Hamprecht, Fred A.; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.

  20. Towards intelligent automation of power plant design and operations: The role of interactive simulations and distributed expert systems

    International Nuclear Information System (INIS)

    Otaduy, P.J.

    1992-01-01

    The design process of a power plant can be viewed as machine-chromosome engineering: when the final layout is implemented, the lifetime operating characteristics, constraints, strengths, and weaknesses of the resulting power-plant specimen are durably determined. Hence, the safety, operability, maneuverability, availability, maintenance requirements, and costs of a power plant are directly related to the goodness of its electromechanical genes. This paper addresses the desirability of incorporating distributed computing, distributed object management, and multimedia technologies into power plant engineering, in particular, design and operations. The promise these technologies hold for enhancing the quality and amount of engineering knowledge available, concurrently, online, to plant designers, maintenance crews, and operators is put into perspective. The role that advanced interactive simulations and expert systems will play in the intelligent automation of power plant design and operations is discussed

  1. Optically induced dielectropheresis sorting with automated medium exchange in an integrated optofluidic device resulting in higher cell viability.

    Science.gov (United States)

    Lee, Gwo-Bin; Wu, Huan-Chun; Yang, Po-Fu; Mai, John D

    2014-08-07

    We demonstrated the integration of a microfluidic device with an optically induced dielectrophoresis (ODEP) device such that the critical medium replacement process was performed automatically and the cells could be subsequently manipulated by using digitally projected optical images. ODEP has been demonstrated to generate sufficient forces for manipulating particles/cells by projecting a light pattern onto photoconductive materials which creates virtual electrodes. The production of the ODEP force usually requires a medium that has a suitable electrical conductivity and an appropriate dielectric constant. Therefore, a 0.2 M sucrose solution is commonly used. However, this requires a complicated medium replacement process before one is able to manipulate cells. Furthermore, the 0.2 M sucrose solution is not suitable for the long-term viability of cells. In comparison to conventional manual processes, our automated medium replacement process only took 25 minutes. Experimental data showed that there was up to a 96.2% recovery rate for the manipulated cells. More importantly, the survival rate of the cells was greatly enhanced due to this faster automated process. This newly developed microfluidic chip provided a promising platform for the rapid replacement of the cell medium and this was also the first time that an ODEP device was integrated with other active flow control components in a microfluidic device. By improving cell viability after cell manipulation, this design may contribute to the practical integration of ODEP modules into other lab-on-a-chip devices and biomedical applications in the future.

  2. Studying the Impact of Distributed Solar PV on Power Systems using Integrated Transmission and Distribution Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Himanshu [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krad, Ibrahim [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-24

    This paper presents the results of a distributed solar PV impact assessment study that was performed using a synthetic integrated transmission (T) and distribution (D) model. The primary objective of the study was to present a new approach for distributed solar PV impact assessment, in which, along with detailed models of transmission and distribution networks, consumer loads were modeled using the physics of end-use equipment, and distributed solar PV was geographically dispersed and connected to the secondary distribution networks. The highlights of the study results were (i) an increase in the Area Control Error (ACE) at high penetration levels of distributed solar PV; and (ii) differences in distribution voltage profiles and voltage regulator operations between integrated T&D and distribution-only simulations.
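The Area Control Error mentioned in (i) has a standard textbook definition: the tie-line interchange deviation minus a frequency-bias term. A hedged sketch of that definition (the function name and all numbers are illustrative, not from the study):

```python
def area_control_error(tie_actual_mw, tie_sched_mw, freq_hz,
                       bias_mw_per_01hz, sched_freq_hz=60.0):
    """Textbook ACE: tie-line flow deviation minus the frequency-bias term.

    bias_mw_per_01hz is the frequency bias B, conventionally negative
    and expressed in MW per 0.1 Hz.
    """
    return (tie_actual_mw - tie_sched_mw) \
        - 10.0 * bias_mw_per_01hz * (freq_hz - sched_freq_hz)

# Over-generation raises both tie flow and frequency: ACE comes out positive.
ace = area_control_error(105.0, 100.0, 60.01, -50.0)
```

A balancing area's regulation control acts to drive this quantity back toward zero; higher PV penetration increases its variability.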

  3. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  4. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex...... sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study...

  5. The Automated DC Parameter Testing of GaAs MESFETs Using the Singer Automatic Integrated Circuit Test System.

    Science.gov (United States)

    1980-09-01

    AFIT/EE/GE/80-7. The Automated DC Parameter Testing of GaAs MESFETs Using the Singer Automatic Integrated Circuit Test System. Thomas L. Harper, 1st Lt, USAF, Graduate Electrical Engineering, September 1980. Preface: "This report is in support of the ongoing effort in..." (the remainder of the scanned record is unrecoverable OCR output).

  6. Automated detection of fluorescent cells in in-resin fluorescence sections for integrated light and electron microscopy.

    Science.gov (United States)

    Delpiano, J; Pizarro, L; Peddie, C J; Jones, M L; Griffin, L D; Collinson, L M

    2018-04-26

    Integrated array tomography combines fluorescence and electron imaging of ultrathin sections in one microscope, and enables accurate high-resolution correlation of fluorescent proteins to cell organelles and membranes. Large numbers of serial sections can be imaged sequentially to produce aligned volumes from both imaging modalities, thus producing enormous amounts of data that must be handled and processed using novel techniques. Here, we present a scheme for automated detection of fluorescent cells within thin resin sections, which could then be used to drive automated electron image acquisition from target regions via 'smart tracking'. The aim of this work is to aid in optimization of the data acquisition process through automation, freeing the operator to work on other tasks and speeding up the process, while reducing data rates by only acquiring images from regions of interest. This new method is shown to be robust against noise and able to deal with regions of low fluorescence. © 2018 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of the Royal Microscopical Society.
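Detectors of this kind are often built around a threshold-plus-connected-components core. The sketch below is a generic illustration of that core using 4-connected flood fill; it is not the authors' algorithm, which is explicitly more robust to noise and low fluorescence:

```python
from collections import deque

def detect_rois(image, threshold):
    """Label 4-connected components of pixels at or above threshold.

    image: 2D list of intensities. Returns a list of ROIs, each a list of
    (row, col) pixel coordinates. Illustrative sketch only.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    rois = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                roi, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    roi.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                rois.append(roi)
    return rois
```

Each returned ROI could then seed a higher-resolution electron image acquisition at that location.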

  7. Appropriate Automation-Integrating Technical, Human, Organisational, Economic and Cultural Factors

    NARCIS (Netherlands)

    Martin, T.; Kiwinen, J.; Rijnsdorp, J.E.; Rijnsdorp, J.E.; Rodd, M.G.; Rouse, W.B.

    1991-01-01

    Automation technology, including digital computer and communication techniques, is being applied in an ever-increasing range of private and public spheres, and reaching third world cultures not previously exposed to such technology. It is engineers' responsibility to consider the direct and indirect

  8. The integrated business information system: using automation to monitor cost-effectiveness of park operations

    Science.gov (United States)

    Dick Stanley; Bruce Jackson

    1995-01-01

    The cost-effectiveness of park operations is often neglected because information is laborious to compile. The information, however, is critical if we are to derive maximum benefit from scarce resources. This paper describes an automated system for calculating cost-effectiveness ratios with minimum effort using data from existing data bases.

  9. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for all electrical substation protection and control systems. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading-edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a LAN station that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  10. Integrating Xgrid into the HENP distributed computing model

    Energy Technology Data Exchange (ETDEWEB)

    Hajdu, L; Lauret, J [Brookhaven National Laboratory, Upton, NY 11973 (United States); Kocoloski, A; Miller, M [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)], E-mail: kocolosk@mit.edu

    2008-07-15

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  11. Energy Systems Integration: Demonstrating Distribution Feeder Voltage Control

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    Overview fact sheet about the Smarter Grid Solutions Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project at the Energy Systems Integration Facility. INTEGRATE is part of the U.S. Department of Energy's Grid Modernization Initiative.

  12. Energy Systems Integration: Demonstrating Distributed Grid-Edge Control Hierarchy

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    Overview fact sheet about the OMNETRIC Group Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project at the Energy Systems Integration Facility. INTEGRATE is part of the U.S. Department of Energy's Grid Modernization Initiative.

  13. Integrated Tools for Future Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  14. SSTAC/ARTS review of the draft Integrated Technology Plan (ITP). Volume 8: Aerothermodynamics Automation and Robotics (A/R) systems sensors, high-temperature superconductivity

    Science.gov (United States)

    1991-01-01

    Viewgraphs of briefings presented at the SSTAC/ARTS review of the draft Integrated Technology Plan (ITP) on aerothermodynamics, automation and robotics systems, sensors, and high-temperature superconductivity are included. Topics covered include: aerothermodynamics; aerobraking; aeroassist flight experiment; entry technology for probes and penetrators; automation and robotics; artificial intelligence; NASA telerobotics program; planetary rover program; science sensor technology; direct detector; submillimeter sensors; laser sensors; passive microwave sensing; active microwave sensing; sensor electronics; sensor optics; coolers and cryogenics; and high temperature superconductivity.

  15. SSTAC/ARTS review of the draft Integrated Technology Plan (ITP). Volume 8: Aerothermodynamics Automation and Robotics (A/R) systems sensors, high-temperature superconductivity

    International Nuclear Information System (INIS)

    1991-06-01

    Viewgraphs of briefings presented at the SSTAC/ARTS review of the draft Integrated Technology Plan (ITP) on aerothermodynamics, automation and robotics systems, sensors, and high-temperature superconductivity are included. Topics covered include: aerothermodynamics; aerobraking; aeroassist flight experiment; entry technology for probes and penetrators; automation and robotics; artificial intelligence; NASA telerobotics program; planetary rover program; science sensor technology; direct detector; submillimeter sensors; laser sensors; passive microwave sensing; active microwave sensing; sensor electronics; sensor optics; coolers and cryogenics; and high temperature superconductivity

  16. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  17. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its applications in synthesis, viz. auto-sampling and inline monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest possible control strategies whose implementation would help to reliably transfer a laboratory-scale synthesis strategy to pilot scale at its optimum conditions. Owing to the vast literature on multistep synthesis, we have classified the literature and identified case studies based on a few criteria, viz. type of reaction, heating methods, processes involving in-line separation units, telescopic synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers a broad range of the multistep synthesis literature.

  18. Analog integrated circuit design automation placement, routing and parasitic extraction techniques

    CERN Document Server

    Martins, Ricardo; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for analog layout design automation. After discussing the placement and routing problem in electronic design automation (EDA), the authors overview a variety of automatic layout generation tools, as well as the most recent advances in analog layout-aware circuit sizing. The discussion includes different methods for automatic placement (a template-based Placer and an optimization-based Placer), a fully-automatic Router and an empirical-based Parasitic Extractor. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the quality of their designs, or use them as starting point for a new tool. All the methods described are applied to practical examples for a 130nm design process, as well as placement and routing benchmark sets. Introduces readers to hierarchical combination of Pareto fronts of placements; Presents electromigration-aware routing with multilayer multiport terminal structures...

  19. Automated quantitative 3D analysis of aorta size, morphology, and mural calcification distributions

    Energy Technology Data Exchange (ETDEWEB)

    Kurugol, Sila, E-mail: sila.kurugol@childrens.harvard.edu; Come, Carolyn E.; Diaz, Alejandro A.; Ross, James C.; Washko, George R.; San Jose Estepar, Raul [Brigham and Women’s Hospital and Harvard Medical School, Boston, Massachusetts 02115 (United States); Kinney, Greg L.; Black-Shinn, Jennifer L.; Hokanson, John E. [Colorado School of Public Health, University of Colorado Denver, Aurora, Colorado 80045 (United States); Budoff, Matthew J. [Los Angeles Biomedical Research Center at Harbor and UCLA Medical Center, Torrance, California 90502 (United States)

    2015-09-15

    Purpose: The purpose of this work is to develop a fully automated pipeline to compute aorta morphology and calcification measures in large cohorts of CT scans that can be used to investigate the potential of these measures as imaging biomarkers of cardiovascular disease. Methods: The first step of the automated pipeline is aorta segmentation. The algorithm the authors propose first detects an initial aorta boundary by exploiting cross-sectional circularity of aorta in axial slices and aortic arch in reformatted oblique slices. This boundary is then refined by a 3D level-set segmentation that evolves the boundary to the location of nearby edges. The authors then detect the aortic calcifications with thresholding and filter out the false positive regions due to nearby high intensity structures based on their anatomical location. The authors extract the centerline and oblique cross sections of the segmented aortas and compute the aorta morphology and calcification measures of the first 2500 subjects from COPDGene study. These measures include volume and number of calcified plaques and measures of vessel morphology such as average cross-sectional area, tortuosity, and arch width. Results: The authors computed the agreement between the algorithm and expert segmentations on 45 CT scans and obtained a closest point mean error of 0.62 ± 0.09 mm and a Dice coefficient of 0.92 ± 0.01. The calcification detection algorithm resulted in an improved true positive detection rate of 0.96 compared to previous work. The measurements of aorta size agreed with the measurements reported in previous work. The initial results showed associations of aorta morphology with calcification and with aging. These results may indicate aorta stiffening and unwrapping with calcification and aging. Conclusions: The authors have developed an objective tool to assess aorta morphology and aortic calcium plaques on CT scans that may be used to provide information about the presence of cardiovascular
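The Dice coefficient reported above is straightforward to compute once the two segmentations are represented as voxel sets; a minimal sketch with toy voxel sets (not the study's data):

```python
def dice_coefficient(a, b):
    """Dice overlap between two segmentations given as sets of voxel indices."""
    if not a and not b:
        return 1.0  # convention: two empty segmentations agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy example: two 3-voxel segmentations sharing 2 voxels.
overlap = dice_coefficient({(0, 0), (0, 1), (1, 0)},
                           {(0, 1), (1, 0), (1, 1)})
```

A Dice score of 0.92, as obtained against the expert segmentations, indicates that the automated and manual aorta boundaries overlap almost completely.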

  20. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

    Abstract. Background: Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting genes' position and structure and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous software tools, algorithms and methods under the supervision of a biologist. Automating these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still require substantial input from biologists to supervise and check the results at various steps. Results: Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows it, for example, to make key decisions, check intermediate results or refine the dataset). The quality of the results produced by FIGENIX is comparable to those obtained by expert biologists, with a drastic gain in time and avoidance of the errors that arise from manual data manipulation. Conclusion: The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could be easily adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, and the annotation of regulatory elements and other genomic features of interest.

  1. Integration of biotechnology, visualisation technology and robot technology for automated mass propagation af elite trees

    DEFF Research Database (Denmark)

    Find, Jens

    for the production of Christmas trees and Sitka spruce has gained renewed interest as a fast growing species for the production biofuels. These species are used as model systems for the development of automated plant production based on robot and visualisation technology. The commercial aspect of the project aims at......: 1) the market for cloned elite plants in the forestry sector and 2) the market for robot technology in the production of plants for the forestry sector....

  2. Automated mineralogy and petrology - applications of TESCAN Integrated Mineral Analyzer (TIMA)

    Czech Academy of Sciences Publication Activity Database

    Hrstka, Tomáš; Gottlieb, P.; Skála, Roman; Breiter, Karel; Motl, D.

    2018-01-01

    Roč. 63, č. 1 (2018), s. 47-63 ISSN 1802-6222 Grant - others:AV ČR(CZ) StrategieAV21/4 Program:StrategieAV Institutional support: RVO:67985831 Keywords : TIMA * Automated SEM/EDS * applied mineralogy * modal analysis * artificial intelligence * neural networks Subject RIV: DB - Geology ; Mineralogy OBOR OECD: Mineralogy Impact factor: 0.609, year: 2016

  3. Intelligent Assistants for Distributed Knowledge Acquisition, Integration, Validation, and Maintenance

    National Research Council Canada - National Science Library

    Tecuci, Gheorghe; Boicu, Mihai

    2008-01-01

    This research has developed an integrated set of tools, called Disciple 2008 learning agent shell, for continuous acquisition of knowledge directly from subject matter experts, and for the integration...

  4. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  5. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  6. Market Integration Dynamics and Asymptotic Price Convergence in Distribution

    NARCIS (Netherlands)

    A. García-Hiernaux (Alfredo); D.E. Guerrero (David); M.J. McAleer (Michael)

    2015-01-01

    This paper analyzes the market integration process of nominal prices, develops a model to analyze market integration, and presents a test of increasing market integration. A distinction is made between the economic concepts of price convergence in mean and variance. When both types of

  7. An automated system for studying the power distribution of electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Filarowski, C.A.

    1994-12-01

    Precise welds with an electron beam welder are difficult to reproduce because the factors affecting the electron beam current density distribution are not easily controlled. One method for measuring the power density distribution in EB welds uses computer tomography to reconstruct an image of the current density distribution. This technique uses many separate pieces of hardware and software packages to obtain and then reconstruct the data; consequently, transferring this technology between different machines and operators is difficult. Consolidating all of the hardware and software into one machine to execute the same tasks will allow for real-time measurement of the EB power density distribution and will provide a facilitated means for transferring welding procedures between different machines and operators, thereby enhancing the reproducibility of electron beam welds.

  8. 75 FR 5244 - Pipeline Safety: Integrity Management Program for Gas Distribution Pipelines; Correction

    Science.gov (United States)

    2010-02-02

    ... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...

  9. Integrated automated nanomanipulation and real-time cellular surface imaging for mechanical properties characterization

    Science.gov (United States)

    Eslami, Sohrab; Zareian, Ramin; Jalili, Nader

    2012-10-01

    Surface microscopy of individual biological cells is essential for determining the patterns of cell migration to study tumor formation or metastasis. This paper presents a correlated theoretical and experimental technique to automatically assess the biophysical and mechanical properties of, and acquire live images of, biological cells of interest in studying cancer. In the theoretical part, a distributed-parameters model as the comprehensive representation of the microcantilever is presented along with a model of the contact force as a function of the indentation depth and the mechanical properties of the biological sample. Analysis of the transfer function of the whole system in the frequency domain is carried out to characterize the stiffness and damping coefficients of the sample. In the experimental section, unlike conventional atomic force microscope techniques, which typically use a laser to determine the deflection of the microcantilever's tip, a piezoresistive microcantilever serving as a force sensor is implemented to produce the appropriate voltage and measure the deflection of the microcantilever. A micromanipulator robotic system is integrated with MATLAB® and programmed to automatically control the microcantilever mounted on the tip of the micromanipulator to acquire the topography of biological samples, including human corneal cells. For this purpose, human primary corneal fibroblasts are extracted, adhered to a sterilized culture dish, and prepared for topographical imaging. The proposed methodology allows an approach to obtain quality 2D images of cells that is comparatively cost-effective and extendable to obtaining 3D images of individual cells. The characterized mechanical properties of the human corneal cell are furthermore established by comparing and validating the phase shifts of the theoretical and experimental frequency responses.

  10. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    Science.gov (United States)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components, a computer-based face-recognition system and a real-time automated video surveillance system. A computerbased face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance the recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for an improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pantilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, researches in automated surveillance systems with multiple PTZ cameras have become increasingly important. Most existing algorithms require the prior knowledge of intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on the knowledge of intrinsic parameters of PTZ camera and relative positions. 
Experimental results demonstrate that our proposed algorithm presents substantially
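The record does not give the authors' polynomial model, so the following is only a hypothetical sketch of the general idea: fitting a least-squares polynomial mapping from one PTZ camera's pan/tilt readings to another's. Function names, the polynomial degree, and the data layout are assumptions, not the paper's implementation.

```python
import numpy as np

def _design_matrix(src, degree):
    """Bivariate polynomial design matrix in (pan, tilt)."""
    pan, tilt = src[:, 0], src[:, 1]
    cols = [(pan ** i) * (tilt ** j)
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def fit_pan_tilt_mapping(src, dst, degree=2):
    """Fit camera-A (pan, tilt) -> camera-B (pan, tilt) by least squares.

    src, dst: (N, 2) arrays of corresponding pan/tilt observations.
    """
    coeffs, *_ = np.linalg.lstsq(_design_matrix(src, degree), dst, rcond=None)
    return coeffs

def apply_mapping(src, coeffs, degree=2):
    """Predict camera-B pan/tilt from camera-A readings."""
    return _design_matrix(src, degree) @ coeffs
```

With enough corresponding observations, no intrinsic calibration of either camera is needed; the mapping is learned directly from paired readings.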

  11. An integrated control for overvoltage mitigation in the distribution network

    NARCIS (Netherlands)

    Viyathukattuva Mohamed Ali, M.M.; Nguyen, P.H.; Kling, W.L.

    2014-01-01

    Increasing share of distributed renewable energy sources (DRES) in the distribution network raises new operational and power quality challenges, such as overvoltage in the network feeders. This power quality challenge limits the penetration of DRES in the distribution network. This paper addresses a
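The record's abstract is truncated, so the paper's actual control scheme is unknown. As a purely illustrative example of one common overvoltage-mitigation mechanism, here is a linear droop curve that curtails a DRES unit's active power as local voltage rises (all thresholds and names are assumptions):

```python
def curtail_active_power(v_pu, p_avail_kw, v_start=1.05, v_max=1.10):
    """Linear active-power droop for overvoltage mitigation.

    Below v_start (per unit), inject all available power; above v_max,
    curtail fully; in between, scale down linearly.
    """
    if v_pu <= v_start:
        return p_avail_kw
    if v_pu >= v_max:
        return 0.0
    fraction = (v_max - v_pu) / (v_max - v_start)
    return p_avail_kw * fraction
```

An integrated controller would typically coordinate such local droops with network-level measurements rather than act on local voltage alone.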

  12. Integrated automation system for a pilot plant for energy conversion using PEMFCs

    International Nuclear Information System (INIS)

    Culcer, Mihai; Iliescu, Mariana; Raceanu, Mircea; Stanciu, Vasile; Stefanescu, Ioan; Enache, Adrian; Lazaro, Pavel Gabriel; Lazaroiu, Gheorghe; Badea, Adrian

    2007-01-01

    Based on hydrogen and fuel cell research and technological capabilities achieved in the National R and D Programs, ICIT Rm. Valcea built an experimental-demonstrative pilot plant for energy conversion using hydrogen PEMFCs. This pilot plant consists of a fuel processor based on the steam methane reforming (SMR) process, a hydrogen purification unit, a PEM fuel cell stack (FCS) and a power electronics unit. The paper deals with the dedicated control system that provides automated data acquisition, manual or on-line operational control, gas management, humidification, and temperature and flow control. (authors)

  13. The use of software agents and distributed objects to integrate enterprises: Compatible or competing technologies?

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-04-01

    Distributed object and software agent technologies are two integration methods for connecting enterprises. The two technologies have overlapping goals--interoperability and architectural support for integrating software components--though to date little or no integration of the two technologies has been made at the enterprise level. The primary difference between these two technologies is that distributed object technologies focus on the problems inherent in connecting distributed heterogeneous systems whereas software agent technologies focus on the problems involved with coordination and knowledge exchange across domain boundaries. This paper addresses the integration of these technologies in support of enterprise integration across organizational and geographic boundaries. The authors discuss enterprise integration issues, review their experiences with both technologies, and make recommendations for future work. Neither technology is a panacea. Good software engineering techniques must be applied to integrate an enterprise because scalability and a distributed software development team are realities.

  14. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    Science.gov (United States)

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high-quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and

  15. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum-analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with component models from different manufacturers and is constructed in modular form for easy expandability. A more robust peak identification method, with improved stability, is achieved by applying additional smoothing to the calculated slope before peaks are identified. For element identification, an improved main-lines analysis method, which checks all candidate elements at each spectral peak to avoid omitting elements without strong spectral lines, is applied to the tested LIBS samples; this method also increases identification speed. Practical applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and perform filtering, peak identification, qualitative analysis, etc. on spectral data. (paper)
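The record describes smoothing the calculated slope before peak identification. A minimal sketch of that idea (the smoothing window, thresholds, and function names are assumptions, not the paper's algorithm) locates peaks as positive-to-negative zero crossings of a moving-average-smoothed first derivative:

```python
import numpy as np

def find_peaks_smoothed_slope(y, window=5, min_height=0.0):
    """Find peak indices in a spectrum `y`.

    The first derivative is smoothed with a moving average before
    detecting +/- sign changes, which suppresses noise-induced
    spurious peaks; `min_height` filters out low-intensity crossings.
    """
    slope = np.diff(y)
    kernel = np.ones(window) / window
    slope = np.convolve(slope, kernel, mode="same")  # smooth the slope
    peaks = []
    for i in range(1, len(slope)):
        if slope[i - 1] > 0 and slope[i] <= 0 and y[i] >= min_height:
            peaks.append(i)
    return peaks
```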

  16. The Development of PIPA: An Integrated and Automated Pipeline for Genome-Wide Protein Function Annotation

    National Research Council Canada - National Science Library

    Yu, Chenggang; Zavaljevski, Nela; Desai, Valmik; Johnson, Seth; Stevens, Fred J; Reifman, Jaques

    2008-01-01

    .... With the existence of many programs and databases for inferring different protein functions, a pipeline that properly integrates these resources will benefit from the advantages of each method...

  17. Passivity-Based Automated Design of Stable Multi-Feedback Distributed Power Delivery Systems

    Science.gov (United States)

    2017-03-01

    A passivity-based stability criterion guides the automated design of a stable multi-feedback distributed power delivery system, determining the number and location of power supplies for different current-load scenarios.

  18. Demand side integration aspects in active distribution planning

    DEFF Research Database (Denmark)

    Silvestro, Federico; Baitch, Alex; Pilo, Fabrizio

    2013-01-01

    will be implemented in practice? How will regulatory frameworks and connection agreements evolve to support ADSs? The present work provides some information about the closer and closer integration between network planning and Demand Side Integration that is foreseen in the future and shows the necessity to develop...

  19. MannDB – A microbial database of automated protein sequence analyses and evidence integration for protein characterization

    Directory of Open Access Journals (Sweden)

    Kuczmarski Thomas A

    2006-10-01

    Full Text Available Abstract Background MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. Description MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from GenBank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the GenBank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO.
Conclusion MannDB comprises a large number of genomes and comprehensive protein

  20. Integrating security in a group oriented distributed system

    Science.gov (United States)

    Reiter, Michael; Birman, Kenneth; Gong, LI

    1992-01-01

    A distributed security architecture is proposed for incorporation into group oriented distributed systems, and in particular, into the Isis distributed programming toolkit. The primary goal of the architecture is to make common group oriented abstractions robust in hostile settings, in order to facilitate the construction of high performance distributed applications that can tolerate both component failures and malicious attacks. These abstractions include process groups and causal group multicast. Moreover, a delegation and access control scheme is proposed for use in group oriented systems. The focus is the security architecture; particular cryptosystems and key exchange protocols are not emphasized.

  1. SU-E-T-427: Feasibility Study for Evaluation of IMRT Dose Distribution Using Geant4-Based Automated Algorithms

    International Nuclear Information System (INIS)

    Choi, H; Shin, W; Testa, M; Min, C; Kim, J

    2015-01-01

    Purpose: For intensity-modulated radiation therapy (IMRT) treatment planning validation using Monte Carlo (MC) simulations, a precise and automated procedure is necessary to evaluate the patient dose distribution. The aim of this study is to develop an automated algorithm for IMRT simulations using DICOM files and to evaluate the patient dose based on 4D simulation using the Geant4 MC toolkit. Methods: The head of a clinical linac (Varian Clinac 2300 IX) was modeled in Geant4 along with particular components such as the flattening filter and the multi-leaf collimator (MLC). Patient information and the position of the MLC were imported from the DICOM-RT interface. For each position of the MLC, a step-and-shoot technique was adopted. PDDs and lateral profiles were simulated in a water phantom (50×50×40 cm³) and compared to measurement data. We used a lung phantom, and MC dose calculations were compared to the clinical treatment planning used at the Seoul National University Hospital. Results: In order to reproduce the measurement data, we tuned three free parameters: the mean and standard deviation of the primary electron beam energy and the beam spot size. For 6 MV these parameters were found to be 5.6 MeV, 0.2378 MeV and 1 mm FWHM, respectively. The average dose difference between measurements and simulations was less than 2% for PDDs and radial profiles. The lung phantom study showed fairly good agreement between MC and planning dose despite some unavoidable statistical fluctuation. Conclusion: The current feasibility study using the lung phantom shows the potential for IMRT dose validation with 4D MC simulations using the Geant4 toolkit. This research was supported by the Korea Institute of Nuclear Safety and the Development of Measurement Standards for Medical Radiation funded by the Korea Research Institute of Standards and Science. (KRISS-2015-15011032)
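The record reports an average dose difference below 2% between measurements and simulations. As a hedged illustration of how such a comparison metric might be computed (the paper's exact definition is not given; normalizing to the measured maximum is one common convention):

```python
import numpy as np

def mean_percent_dose_difference(measured, simulated):
    """Mean absolute dose difference between two depth-dose curves,
    expressed as a percentage of the measured maximum dose."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 100.0 * np.mean(np.abs(simulated - measured)) / measured.max()
```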

  2. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on the recent biomass distribution and biogeography of photosynthetic marine protists, with adequate temporal and spatial resolution, is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology developments, adaptations and evaluations, documented in a number of different publications, and of recently completed field testing whose results are introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. Resulting samples can either be preserved for later laboratory analyses or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). Level 3 involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next-generation sequencing (NGS) technology at level 4. An overall integrated dataset of the results based on the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time the cost of the observation is optimized with respect to analysis effort and time.

  3. Species distribution modeling based on the automated identification of citizen observations.

    Science.gov (United States)

    Botella, Christophe; Joly, Alexis; Bonnet, Pierre; Monestiez, Pascal; Munoz, François

    2018-02-01

    A species distribution model computed with automatically identified plant observations was developed and evaluated to contribute to future ecological studies. We used deep learning techniques to automatically identify opportunistic plant observations made by citizens through a popular mobile application. We compared species distribution modeling of invasive alien plants based on these data to inventories made by experts. The trained models have a reasonable predictive effectiveness for some species, but they are biased by the massive presence of cultivated specimens. The method proposed here allows for fine-grained and regular monitoring of some species of interest based on opportunistic observations. More in-depth investigation of the typology of the observations and the sampling bias should help improve the approach in the future.

  4. Developing and Integrating Advanced Movement Features Improves Automated Classification of Ciliate Species.

    Science.gov (United States)

    Soleymani, Ali; Pennekamp, Frank; Petchey, Owen L; Weibel, Robert

    2015-01-01

    Recent advances in tracking technologies such as GPS or video tracking systems describe the movement paths of individuals in unprecedented detail and are increasingly used in different fields, including ecology. However, extracting information from raw movement data requires advanced analysis techniques, for instance to infer behaviors expressed during a certain period of the recorded trajectory, or gender or species identity when data are obtained from remote tracking. In this paper, we address how different movement features affect the ability to automatically classify species identity, using a dataset of unicellular microbes (i.e., ciliates). Previously, morphological attributes and simple movement metrics, such as speed, were used for classifying ciliate species. Here, we demonstrate that adding advanced movement features, in particular those based on the discrete wavelet transform, to morphological features can improve classification. These results may have practical applications in the automated monitoring of wastewater facilities as well as environmental monitoring of aquatic systems.
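The record mentions movement features based on the discrete wavelet transform. A minimal sketch, assuming a Haar wavelet and per-level detail-coefficient energies as the feature vector (the paper's actual wavelet family and feature set are not specified in the record):

```python
import numpy as np

def haar_dwt_energy_features(signal, levels=3):
    """Per-level energy of Haar wavelet detail coefficients of a 1-D
    movement signal (e.g., a speed time series).

    Returns `levels` energies (sums of squared detail coefficients),
    a compact multi-scale feature vector for classification.
    """
    a = np.asarray(signal, dtype=float)
    features = []
    for _ in range(levels):
        if len(a) < 2:
            features.append(0.0)
            continue
        if len(a) % 2:                  # pad to even length
            a = np.append(a, a[-1])
        approx = (a[0::2] + a[1::2]) / np.sqrt(2)   # low-pass half
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)   # high-pass half
        features.append(float(np.sum(detail ** 2)))
        a = approx
    return features
```

Higher detail energies at fine levels indicate jittery, high-frequency movement; energy concentrated at coarse levels indicates smooth, directed motion.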

  5. Improved Automated Detection of Diabetic Retinopathy on a Publicly Available Dataset Through Integration of Deep Learning.

    Science.gov (United States)

    Abràmoff, Michael David; Lou, Yiyue; Erginay, Ali; Clarida, Warren; Amelon, Ryan; Folk, James C; Niemeijer, Meindert

    2016-10-01

    To compare the performance of a deep-learning-enhanced algorithm for automated detection of diabetic retinopathy (DR) with the previously published performance of the same algorithm without deep-learning components, the Iowa Detection Program (IDP), on the same publicly available set of fundus images and the previously reported consensus reference standard set by three US board-certified retinal specialists. We used the previously reported consensus reference standard of referable DR (rDR), defined as International Clinical Classification of Diabetic Retinopathy moderate or severe nonproliferative DR (NPDR), proliferative DR, and/or macular edema (ME). Neither the Messidor-2 images nor the three retinal specialists setting the Messidor-2 reference standard were used for training IDx-DR version X2.1. Sensitivity, specificity, negative predictive value, area under the curve (AUC), and their confidence intervals (CIs) were calculated. Sensitivity was 96.8% (95% CI: 93.3%-98.8%) and specificity was 87.0% (95% CI: 84.2%-89.4%), with 6/874 false negatives, resulting in a negative predictive value of 99.0% (95% CI: 97.8%-99.6%). No cases of severe NPDR, PDR, or ME were missed. The AUC was 0.980 (95% CI: 0.968-0.992). Sensitivity was not statistically different from the published IDP sensitivity, which had a CI of 94.4% to 99.3%, but specificity was significantly better than the published IDP specificity CI of 55.7% to 63.0%. A deep-learning-enhanced algorithm for the automated detection of DR achieves significantly better performance than a previously reported, otherwise essentially identical, algorithm that does not employ deep learning. Deep-learning-enhanced algorithms have the potential to improve the efficiency of DR screening and thereby prevent visual loss and blindness from this devastating disease.
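The metrics reported in this record follow standard definitions from the screening confusion matrix. A small helper (hypothetical counts in the test, not the study's data) shows how sensitivity, specificity, and negative predictive value are derived:

```python
def screening_metrics(tp, fp, tn, fn):
    """Standard screening metrics from confusion-matrix counts:
    sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    negative predictive value = TN/(TN+FN)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
    }
```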

  6. Distributed Leadership of School Curriculum Change: An Integrative Approach

    Science.gov (United States)

    Fasso, Wendy; Knight, Bruce Allen; Purnell, Ken

    2016-01-01

    Since its inception in 1999, the distributed leadership framework of Spillane, Halverson, and Diamond [2004. "Towards a Theory of Leadership Practice: A Distributed Perspective." "Journal of Curriculum Studies" 36 (1): 3-34. doi:10.1080/0022027032000106726] has supported research into leadership and change in schools. Whilst…

  7. Automated Break-Out Box for use with Low Cost Spacecraft Integration and Test, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Electrical checkout and testing is a critical part of the overall spacecraft integration and test flow. Verifying proper harness and connector signal interfaces is...

  8. Integration and Evaluation of Automated Pavement Distress Data in INDOT’s Pavement Management System

    Science.gov (United States)

    2017-05-01

    This study was in two parts. The first part established and demonstrated a framework for pavement data integration. This is critical for fulfilling the QC/QA needs of INDOT's pavement management system, because the precision of the physical location re...

  9. The nucleon-nucleon correlations and the integral characteristics of the potential distributions in nuclei

    International Nuclear Information System (INIS)

    Knyaz'kov, O.M.; Kukhtina, I.N.

    1989-01-01

    The integral characteristics of the potential distribution in nuclei, namely the volume integrals, moments and mean square radii, are studied in the framework of the semimicroscopic approach to the interaction of low-energy nucleons with nuclei, on the basis of the exchange nucleon-nucleon correlations and the density dependence of effective forces. The ratio of the normalized multipole moments of the potential and matter distributions is investigated. The energy dependence of the integral characteristics is analyzed. 15 refs.; 2 tabs

  10. Development of a semi-automated method for subspecialty case distribution and prediction of intraoperative consultations in surgical pathology

    Directory of Open Access Journals (Sweden)

    Raul S Gonzalez

    2015-01-01

    Full Text Available Background: In many surgical pathology laboratories, operating room schedules are prospectively reviewed to determine specimen distribution to different subspecialty services and to predict the number and nature of potential intraoperative consultations for which prior medical records and slides require review. At our institution, such schedules were manually converted into easily interpretable, surgical pathology-friendly reports to facilitate these activities. This conversion, however, was time-consuming and arguably a non-value-added activity. Objective: Our goal was to develop a semi-automated method of generating these reports that improved their readability while taking less time to perform than the manual method. Materials and Methods: A dynamic Microsoft Excel workbook was developed to automatically convert published operating room schedules into different tabular formats. Based on the surgical procedure descriptions in the schedule, a list of linked keywords and phrases was utilized to sort cases by subspecialty and to predict potential intraoperative consultations. After two trial-and-optimization cycles, the method was incorporated into standard practice. Results: The workbook distributed cases to appropriate subspecialties and accurately predicted intraoperative requests. Users indicated that they spent 1-2 h fewer per day on this activity than before, and team members preferred the formatting of the newer reports. Comparison of the manual and semi-automatic predictions showed that the mean daily difference in predicted versus actual intraoperative consultations underwent no statistically significant changes before and after implementation for most subspecialties. 
Conclusions: A well-designed, lean, and simple information technology solution to determine subspecialty case distribution and prediction of intraoperative consultations in surgical pathology is approximately as accurate as the gold standard manual method and requires less
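The core of the workbook described above is keyword matching against procedure descriptions. A minimal Python sketch of the same idea (the keyword lists and subspecialty names here are invented examples, not the institution's actual mapping):

```python
# Hypothetical keyword map; a real deployment would use the
# institution's curated list of linked keywords and phrases.
SUBSPECIALTY_KEYWORDS = {
    "gastrointestinal": ["colectomy", "whipple", "esophagectomy"],
    "gynecologic": ["hysterectomy", "oophorectomy"],
    "thoracic": ["lobectomy", "wedge resection"],
}

def assign_subspecialty(procedure_description, keyword_map=SUBSPECIALTY_KEYWORDS):
    """Return the first subspecialty whose keyword appears in the
    (case-insensitive) procedure description, else 'unassigned'."""
    text = procedure_description.lower()
    for subspecialty, keywords in keyword_map.items():
        if any(kw in text for kw in keywords):
            return subspecialty
    return "unassigned"
```

Unmatched cases fall through to "unassigned" for manual review, mirroring how a semi-automated workflow keeps a human in the loop.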

  11. Automation of innovative facade systems with integrated technical building equipment under consideration of comfort aspects; Automatisierung innovativer Fassadensysteme mit integrierter technischer Gebaeudeausruestung unter Beruecksichtigung von Behaglichkeitsaspekten

    Energy Technology Data Exchange (ETDEWEB)

    Hasert, Anita; Becker, Martin [Hochschule Biberach (Germany). Inst. fuer Gebaeude- und Energiesysteme (IGE)

    2012-07-01

    Facades are not only the shell of protected habitats and a boundary between a building's indoor climate and its environment. They are also converting from formerly passive elements into active building systems that perform various room-conditioning functions (heating, cooling, ventilation, lighting, and so on). The associated increased demands on system integration into facades require new solutions for the planning, implementation and operation of these innovative systems. To handle this increasing complexity intelligently, superior automation strategies for facade automation have to be developed. These strategies must coordinate the individual functions with each other and ensure functionality across building domains. A further criterion for the design of integrated facade systems is consideration of the user's comfort as well as the user's control options and acceptance. Within the research project AUTiFAS (Automation of innovative facade systems), different automation strategies for facade and room automation are examined on the basis of measurements and simulation analyses. For this purpose, an innovative facade element with a decentralized ventilation unit and an integrated sunshade was first integrated into a test room. The functionality and constructional tightness of the complete test stand were verified and matched to the requirements of the tests. With the objective of developing a standardized description of the control and regulation functions of cross-domain automation strategies, an automation library based on standard structures and forms of representation was developed using a test facade as an example. The standards DIN EN 15232 and IEC 61131 as well as the guidelines VDI 3813 and VDI 3814 provide the fundamentals. The developed automation strategies form the basis for the development of

  12. Power electronics for renewable and distributed energy systems a sourcebook of topologies, control and integration

    CERN Document Server

    Chakraborty, Sudipta; Kramer, William E

    2013-01-01

    While most books approach power electronics and renewable energy as two separate subjects, Power Electronics for Renewable and Distributed Energy Systems takes an integrative approach, discussing power electronic converter topologies, controls and integration that are specific to renewable and distributed energy system applications. An overview of power electronic technologies is followed by the introduction of various renewable and distributed energy resources, including photovoltaics, wind, small hydroelectric, fuel cells, microturbines and variable speed generation. Energy storage s

  13. Device for simulation of integral dose distribution in multifield radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Belyakov, E K; Voronin, V V; Kolosova, V F; Moskalev, A I; Marova, Yu M; Stavitskii, R V; Yarovoi, V S

    1974-11-15

    Described is a device for simulating the summed dose distribution in multifield radiation therapy; the device comprises a mechanical unit on which the emission sources and detectors are mounted, electromechanical scanning equipment, amplifiers, an adder, a position sensor and a recording instrument. The suggested device improves the accuracy of planning a patient's irradiation program in remote multifield radiation therapy, makes it possible to estimate the influence of irradiated-medium heterogeneity and beam shapers on the summed dose distribution, and provides the summed dose distribution in relative or absolute units. Additional filters simulating heterogeneity and the beam-shaping conditions of ionizing radiation may be mounted between the quantum emission sources and detectors, and an amplifier with a variable amplification factor may be placed between the adder and the recorder. Thus it is possible to obtain a summed dose distribution for static methods of remote radiation therapy with a high degree of accuracy (up to ±10%).
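The device's adder performs a weighted superposition of per-field doses. A software analogue of that operation (names and the weighting scheme are assumptions for illustration) is simply a weighted sum of per-field dose grids:

```python
import numpy as np

def sum_dose_distribution(field_doses, weights=None):
    """Weighted superposition of per-field dose grids.

    field_doses: sequence of arrays, one per radiation field,
    all with the same shape; weights default to 1.0 per field.
    """
    field_doses = [np.asarray(d, dtype=float) for d in field_doses]
    if weights is None:
        weights = [1.0] * len(field_doses)
    total = np.zeros_like(field_doses[0])
    for dose, w in zip(field_doses, weights):
        total += w * dose
    return total
```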

  14. Network integration of distributed generation: international research and development

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.

    2003-07-01

    This report provides information on privately and publicly funded research and development programmes in distributed generation (DG) in the USA, the European Union and Japan. Protection systems for the installation of DG, power electronics for the connection of DG to electricity distribution systems, reliability modelling, power quality issues, connection standards, and simulation and computer modelling are examined. The relevance of the programmes to the UK is considered.

  15. Oh and by the way, you get meter readings too : a look at distribution automation, intelligent grids, and demand side management : getting the real value from your AMI

    Energy Technology Data Exchange (ETDEWEB)

    Summerlin, T. [Gestalt, Camden, NJ (United States); Ferguson, P.D. [Newmarket Hydro Ltd., Newmarket, ON (Canada)

    2006-07-01

    The full value of smart metering programs will be realized when information and communication capabilities are used to enable distribution automation, intelligent grids, and sufficient data management that will transform interval meter data into useful information. By leveraging and managing meter data effectively, utilities can increase their operational efficiency, improve their understanding of customers' needs, and develop more effective demand side management programs. This presentation examined some of the changing priorities of advanced metering infrastructure (AMI) strategies, and provided details of meter data management (MDM) technologies developed to help utilities increase efficiency, cut costs and provide better service to their customers. An MDM system is the set of data bases and applications required to provide utilities with a solution for the data retention, analysis and storage repository gaps that will be created when monthly manual meter readings are replaced with AMI systems. In order to resolve storage, functionality and legacy integration gaps, MDM systems must be scalable systems that can support large and small quantities of meter data, and must also conform to industry standard data warehouse designs. Data structure in the systems must support both regulated and deregulated markets, and be capable of providing extensive graphical, tabular and Excel export of the metered and totalized data from the interval to system level. MDM systems can provide improved support for demand response decision-making processes; distribution planning and reliability; outage management; revenue assurance; forecasting; and curtailment. It was concluded that MDM systems can be used to improve processes and provide additional benefits well beyond the meter reading and billing process benefits originally identified by utilities as a primary goal of implementing AMI. refs., tabs., figs.
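One of the totalization tasks an MDM system performs is rolling interval meter reads up to higher levels, e.g. daily energy per meter. A minimal sketch (function name and the 15-minute/96-interval assumption are illustrative, not any vendor's API):

```python
def daily_totals(interval_reads, intervals_per_day=96):
    """Aggregate a flat list of 15-minute interval kWh reads into
    daily totals (96 intervals per day by default)."""
    days = []
    for start in range(0, len(interval_reads), intervals_per_day):
        days.append(sum(interval_reads[start:start + intervals_per_day]))
    return days
```

Production MDM systems layer validation, estimation, and editing (VEE) rules on top of such aggregation before the data reaches billing or planning applications.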

  16. Final Technical Report: Integrated Distribution-Transmission Analysis for Very High Penetration Solar PV

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.; Jones, Wesley; Biagioni, David; Baker, Kyri; Wu, Hongyu; Giraldez, Julieta; Sorensen, Harry; Lunacek, Monte; Merket, Noel; Jorgenson, Jennie; Hodge, Bri-Mathias [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]

    2016-01-29

    Transmission and distribution simulations have historically been conducted separately, echoing their division in grid operations and planning while avoiding inherent computational challenges. Today, however, rapid growth in distributed energy resources (DERs)--including distributed generation from solar photovoltaics (DGPV)--requires understanding the unprecedented interactions between distribution and transmission. To capture these interactions, especially for high-penetration DGPV scenarios, this research project developed a first-of-its-kind, high performance computer (HPC) based, integrated transmission-distribution tool, the Integrated Grid Modeling System (IGMS). The tool was then used in initial explorations of system-wide operational interactions of high-penetration DGPV.
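The transmission-distribution coupling such a tool must capture can be illustrated with a toy fixed-point exchange: the transmission side passes a boundary-bus voltage down, the distribution side passes net power back up, and the loop iterates to consistency. Both solver stubs below are invented stand-ins for illustration, not IGMS code:

```python
def transmission_solve(p_load):
    """Toy transmission step: boundary-bus voltage (pu) sags linearly with
    net load (MW). Stand-in for a full AC power flow."""
    return 1.05 - 0.02 * p_load

def distribution_solve(v_boundary, pv_mw):
    """Toy distribution step: net load at the boundary bus depends on the
    boundary voltage (voltage-dependent load) minus local PV generation."""
    base_load = 3.0  # MW, assumed
    return base_load * v_boundary ** 1.5 - pv_mw

def cosimulate(pv_mw, tol=1e-8, max_iter=100):
    """Fixed-point exchange: T sends voltage down, D sends net power up."""
    p = 0.0
    for _ in range(max_iter):
        v = transmission_solve(p)
        p_new = distribution_solve(v, pv_mw)
        if abs(p_new - p) < tol:
            return v, p_new
        p = p_new
    raise RuntimeError("co-simulation did not converge")

v, p = cosimulate(pv_mw=1.0)
print(round(v, 4), round(p, 4))
```

At scale the same exchange runs across HPC nodes, with one transmission model coupled to thousands of distribution feeder models, which is where the computational challenge noted in the abstract arises.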

  17. Systems for protection and automation of 22 kV distribution electric networks

    International Nuclear Information System (INIS)

    Horak, M.

    2012-01-01

    This article deals with a new concept of fault location in 22 kV overhead and cable distribution systems operated by ZSE Distribucia, a.s. In the past, localising a fault was a demanding and lengthy procedure, because the electric protection indicated only the affected output feeder. The exact fault point was then localised by switching operations in the field and by test switching in the faulted power line. The current development of electric facilities and the sharp fall in prices make it possible to broadly deploy simple digital metering devices equipped with an electric protection function. Once these devices are positioned densely along the individual power lines of the network, the fault position can be localised quite exactly and the electricity supply can be quickly restored. (Authors)
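In the simplest radial case, dense metering reduces fault localisation to finding the last device that saw fault current: every indicator upstream of the fault carries the through-fault current, every one downstream does not. A sketch with hypothetical device names:

```python
def locate_fault_section(devices):
    """Devices along a radial feeder, ordered from substation outward.
    Each entry: (name, saw_fault_current). The fault lies between the last
    device that detected fault current and the first one that did not."""
    last_seen = None
    for name, tripped in devices:
        if tripped:
            last_seen = name
        else:
            return (last_seen, name)
    return (last_seen, "end of line")

feeder = [("R1", True), ("R2", True), ("R3", False), ("R4", False)]
print(locate_fault_section(feeder))  # ('R2', 'R3')
```

Real schemes must additionally handle meshed sections, fuse-saving strategies, and indicator misoperation, but the density argument in the abstract is exactly this: more measurement points mean shorter candidate sections.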

  18. Development of an automation system for a distribution operation center; Desenvolvimento de um sistema de automacao para um Centro de Operacao da Distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Surur, Paulo Sergio Miguel

    1996-07-01

    The great problems caused by a deficient electric energy supply, particularly regarding quality in the distribution system, are widely known. Automation of the feeders and of the Distribution Operation Center improves quality mainly by reducing the restoration time of the lines during outages, thereby decreasing the amount of non-supplied energy. This paper presents an automation system for the COD (Distribution Operation Center) and the tests performed to evaluate its performance at a substation and at a primary-network manoeuvre switch. Considerations on the hardware, software and man-machine interface developed for the operator are presented to justify the choices adopted for this project. Software and hardware modules available in the Brazilian market were applied in this work. Tests of the system were carried out at a substation and in a laboratory, with satisfactory results. (author)

  19. Integrated and automated data analysis for neuronal activation studies using positron emission tomography. Methodology and applications

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Arimizu, Noboru; Koeppe, R.A.; Kuhl, D.E.

    1994-01-01

    A data analysis method was developed for neuronal activation studies using [15O] water positron emission tomography (PET). The method consists of several procedures including intra-subject head motion correction (co-registration), detection of the mid-sagittal plane of the brain, detection of the intercommissural (AC-PC) line, linear scaling and non-linear warping for anatomical standardization, pixel-by-pixel statistical analysis, and data display. All steps are performed in three dimensions and are fully automated. Each step was validated using a brain phantom, computer simulations, and data from human subjects, demonstrating accuracy and reliability of the procedure. The method was applied to human neuronal activation studies using vibratory and visual stimulations. The method detected significant blood flow increases in the primary sensory cortices as well as in other regions such as the secondary sensory cortex and cerebellum. The proposed method should enhance application of PET neuronal activation studies to the investigation of higher-order human brain functions. (author) 38 refs

  20. Automated Data Collection for Determining Statistical Distributions of Module Power Undergoing Potential-Induced Degradation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hacke, P.; Spataru, S.

    2014-08-01

    We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
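The extrapolation from stress temperature to 25 degrees C behind such a lifetime prediction is the standard Arrhenius acceleration-factor calculation. In the sketch below only the 0.85 eV activation energy comes from the abstract; the 45 h stress-test lifetime is a made-up illustration, not a measured value:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between a use temperature and a
    stress temperature: AF = exp(Ea/k * (1/T_use - 1/T_stress))."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

# With Ea = 0.85 eV, extrapolate a hypothetical 85 C stress-test lifetime
# to 25 C use conditions at the same relative humidity.
mttf_85c_h = 45.0  # hours, illustrative only
af = acceleration_factor(0.85, 25.0, 85.0)
print(round(af, 1), round(mttf_85c_h * af))
```

A 60-degree temperature gap with Ea = 0.85 eV gives an acceleration factor of a few hundred, which is why chamber tests of days to weeks can bound field lifetimes of years.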

  1. LXtoo: an integrated live Linux distribution for the bioinformatics community.

    Science.gov (United States)

    Yu, Guangchuang; Wang, Li-Gen; Meng, Xiao-Hua; He, Qing-Yu

    2012-07-19

    Recent advances in high-throughput technologies have dramatically increased biological data generation. However, many research groups lack computing facilities and specialists. This is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Unlike most existing live Linux distributions for bioinformatics, which limit their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high-performance computing. LXtoo aims to provide a well-supported computing environment tailored for bioinformatics research, reducing duplication of effort in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo.

  2. Successful integration requires data on disease distribution and utilization patterns.

    Science.gov (United States)

    1998-08-01

    Planning and development: Where should integrated networks locate or establish contracts with physician offices, hospitals, nursing homes or other facilities? A close look at demand and supply data is the only way to effectively determine your community's needs. Sources abound for such data, but there are a few things you need to be careful about when using national or regional information.

  3. Dynamic state estimation for distribution networks with renewable energy integration

    NARCIS (Netherlands)

    Nguyen, P.H.; Venayagamoorthy, G.K.; Kling, W.L.; Ribeiro, P.F.

    2013-01-01

    The massive integration of variable and unpredictable Renewable Energy Sources (RES) and new types of load consumptions increases the dynamic and uncertain nature of the electricity grid. Emerging interests have focused on improving the monitoring capabilities of network operators so that they can

  4. Collision integral and equilibrium distributions for a bounded plasma

    International Nuclear Information System (INIS)

    Zagorodnij, A.G.; Usenko, A.S.; Yakimenko, I.P.

    1985-01-01

    A kinetic equation of the Balescu-Lenard type for a multicomponent system of charged particles bounded by two plane-parallel surfaces is derived on the basis of the general theory of electromagnetic fluctuations in plasma. Equilibrium values of the collision integral for a plasma with arbitrary boundary configuration are calculated, and general relations describing the charged-particle density profiles in such systems are obtained
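For orientation, the textbook Lenard-Balescu collision term for a homogeneous plasma reads as follows (prefactors depend on the unit convention); the paper's contribution is the generalization of this structure to the bounded, plane-parallel geometry:

```latex
\left.\frac{\partial f_a}{\partial t}\right|_{\mathrm{coll}}
 = -\sum_b \frac{\partial}{\partial v_i} \int \mathrm{d}^3k\,\mathrm{d}^3v'\;
   \frac{8\pi^2 q_a^2 q_b^2}{m_a}\,
   \frac{k_i k_j\,\delta\!\left(\mathbf{k}\cdot(\mathbf{v}-\mathbf{v}')\right)}
        {k^4\,\lvert\varepsilon(\mathbf{k},\mathbf{k}\cdot\mathbf{v})\rvert^2}
   \left[\frac{f_b(\mathbf{v}')}{m_a}\frac{\partial f_a}{\partial v_j}
       - \frac{f_a(\mathbf{v})}{m_b}\frac{\partial f_b}{\partial v'_j}\right]
```

The dynamic screening through the dielectric function $\varepsilon$ is what distinguishes this equation from the Landau collision integral, and it is this screening that the boundary surfaces modify.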

  5. Enforcing Integrity of Agent Migration Paths by Distribution of Trust

    NARCIS (Netherlands)

    Warnier, M.E.; Oey, M.A.; Timmer, R.J.; Overeinder, B.J.; Brazier, F.M.

    2008-01-01

    Agent mobility is the ability of an agent to migrate from one location to another across a network. Though conceptually relatively straightforward, in practice security of mobile agents is a challenge: from transport layer security to preservation of integrity in open environments. This paper

  6. Distribution theory for Schrödinger’s integral equation

    NARCIS (Netherlands)

    Lange, R.J.

    2015-01-01

    Much of the literature on point interactions in quantum mechanics has focused on the differential form of Schrödinger's equation. This paper, in contrast, investigates the integral form of Schrödinger's equation. While both forms are known to be equivalent for smooth potentials, this is not true for

  7. Towards the Development of an Automated Learning Assistant for Vector Calculus: Integration over Planar Regions

    Science.gov (United States)

    Yaacob, Yuzita; Wester, Michael; Steinberg, Stanly

    2010-01-01

    This paper presents a prototype of a computer learning assistant ILMEV (Interactive Learning-Mathematica Enhanced Vector calculus) package with the purpose of helping students to understand the theory and applications of integration in vector calculus. The main problem for students using Mathematica is to convert a textbook description of a…

  8. Automated characterization of nerve fibers labeled fluorescently: determination of size, class and spatial distribution.

    Science.gov (United States)

    Prodanov, Dimiter; Feirabend, Hans K P

    2008-10-03

    Morphological classification of nerve fibers could help interpret the assessment of neural regeneration and the understanding of selectivity of nerve stimulation. Specific populations of myelinated nerve fibers can be investigated by retrograde tracing from a muscle followed by microscopic measurements of the labeled fibers at different anatomical levels. Gastrocnemius muscles of adult rats were injected with the retrograde tracer Fluoro-Gold. After a survival period of 3 days, cross-sections of spinal cords, ventral roots, sciatic, and tibial nerves were collected and imaged on a fluorescence microscope. Nerve fibers were classified using a variation-based criterion acting on the distribution of their equivalent diameters. The same criterion was used to classify the labeled axons using the size of the fluorescent marker. Measurements of the axons were paired to those of the entire fibers (axons+myelin sheaths) in order to establish the correspondence between so-established axonal and fiber classifications. It was found that nerve fibers in L6 ventral roots could be classified into four populations comprising two classes of Aalpha (denoted Aalpha1 and Aalpha2), Agamma, and an additional class of Agammaalpha fibers. Cut-off borders between Agamma and Agammaalpha fiber classes were estimated to be 5.00+/-0.09 microm (SEM); between Agammaalpha and Aalpha1 fiber classes to be 6.86+/-0.11 microm (SEM); and between Aalpha1 and Aalpha2 fiber classes to be 8.66+/-0.16 microm (SEM). Topographical maps of the nerve fibers that innervate the gastrocnemius muscles were constructed per fiber class for the spinal root L6. The major advantage of the presented approach consists of the combined indirect classification of nerve fiber types and the construction of topographical maps of so-identified fiber classes.
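The class assignment implied by the reported cut-off borders can be sketched as a simple threshold classifier; the function name and the treatment of exact boundary values are illustrative, and the SEMs of the cut-offs are ignored here:

```python
# Cut-off diameters (micrometres) between fibre classes, as reported above
# (mean values; SEMs omitted in this sketch).
CUTOFFS = [(5.00, "Agamma"), (6.86, "Agammaalpha"), (8.66, "Aalpha1")]

def classify_fiber(diameter_um):
    """Assign a fibre class from its equivalent diameter using the
    reported cut-off borders for the L6 ventral root."""
    for cutoff, label in CUTOFFS:
        if diameter_um < cutoff:
            return label
    return "Aalpha2"

print([classify_fiber(d) for d in (4.2, 6.0, 7.5, 9.1)])
# ['Agamma', 'Agammaalpha', 'Aalpha1', 'Aalpha2']
```

In the paper the cut-offs themselves are derived from the diameter distribution by a variation-based criterion rather than fixed a priori; the thresholds above are its output, not its input.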

  9. Organizational changes and automation: Towards a customer-oriented business organization for energy distribution companies: Part 1

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize towards more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on it. The restructuring principle is the same everywhere, but the way it is implemented differs widely

  10. Integrating smart grid solution into distribution network planning

    NARCIS (Netherlands)

    Grond, M.O.W.; Morren, J.; Slootweg, J.G.

    2013-01-01

    The planning of medium voltage (MV) distribution networks is a challenging optimization problem due to its scale, its inherent uncertainty, and non-linear nature. In the international technical literature, there are many different optimization models and methods available to approach this planning

  11. Leakage detection algorithm integrating water distribution networks hydraulic model

    CSIR Research Space (South Africa)

    Adedeji, K

    2017-06-01

    Full Text Available Water loss through leaking pipes is inexorable in water distribution networks (WDNs) and has been recognized as a major challenge facing the operation of municipal water services. This is strongly linked with financial costs due to economic loss...

  12. Integrating image processing and classification technology into automated polarizing film defect inspection

    Science.gov (United States)

    Kuo, Chung-Feng Jeffrey; Lai, Chun-Yu; Kao, Chih-Hsiang; Chiu, Chin-Hsun

    2018-05-01

    In order to improve the current manual inspection and classification process for polarizing film on production lines, this study proposes a high precision automated inspection and classification system for polarizing film, which is used for recognition and classification of four common defects: dent, foreign material, bright spot, and scratch. First, the median filter is used to remove the impulse noise in the defect image of polarizing film. The random noise in the background is smoothed by the improved anisotropic diffusion, while the edge detail of the defect region is sharpened. Next, the defect image is transformed by Fourier transform to the frequency domain, combined with a Butterworth high pass filter to sharpen the edge detail of the defect region, and brought back by inverse Fourier transform to the spatial domain to complete the image enhancement process. For image segmentation, the edge of the defect region is found by Canny edge detector, and then the complete defect region is obtained by two-stage morphology processing. For defect classification, the feature values, including maximum gray level, eccentricity, the contrast, and homogeneity of gray level co-occurrence matrix (GLCM) extracted from the images, are used as the input of the radial basis function neural network (RBFNN) and back-propagation neural network (BPNN) classifier, 96 defect images are then used as training samples, and 84 defect images are used as testing samples to validate the classification effect. The result shows that the classification accuracy by using RBFNN is 98.9%. Thus, our proposed system can be used by manufacturing companies for a higher yield rate and lower cost. The processing time of one single image is 2.57 seconds, thus meeting the practical application requirement of an industrial production line.
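The frequency-domain enhancement step (Fourier transform, Butterworth high-pass, inverse transform) can be sketched as below. This is an illustration of the technique, not the authors' implementation; the cut-off distance and filter order are illustrative choices:

```python
import numpy as np

def butterworth_highpass(shape, d0, order=2):
    """Frequency-domain Butterworth high-pass transfer function
    H(u, v) = 1 / (1 + (D0 / D)^(2n)), with D the distance from the
    centre of the shifted spectrum."""
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    d = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    d[d == 0] = 1e-6  # avoid division by zero at the DC term
    return 1.0 / (1.0 + (d0 / d) ** (2 * order))

def sharpen_edges(image, d0=10, order=2):
    """Attenuate low frequencies to emphasize edge detail of a grayscale
    defect image, then return to the spatial domain."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    filtered = spectrum * butterworth_highpass(image.shape, d0, order)
    return np.real(np.fft.ifft2(np.fft.ifftshift(filtered)))

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0  # toy "defect" region
out = sharpen_edges(img)
```

After this step the flat interior of the defect is suppressed and its edges stand out, which is what makes the subsequent Canny detection and morphology more reliable.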

  13. Creation of an Integrated Environment to Supply e-Learning Platforms with Office Automation Features

    Science.gov (United States)

    Palumbo, Emilio; Verga, Francesca

    2015-01-01

    Over the last years great efforts have been made within the University environment to implement e-learning technologies in the standard educational practice. These learning technologies distribute online educational multimedia contents through technological platforms. Even though specific e-learning tools for technical disciplines were already…

  14. Evaluating Maximum Photovoltaic Integration in District Distribution Systems Considering Optimal Inverter Dispatch and Cloud Shading Conditions

    DEFF Research Database (Denmark)

    Ding, Tao; Kou, Yu; Yang, Yongheng

    2017-01-01

    As photovoltaic (PV) integration increases in distribution systems, investigating the maximum allowable PV integration capacity for a district distribution system becomes necessary in the planning phase; an optimization model is thus proposed to evaluate the maximum PV integration capacity. However, the intermittency of solar PV energy (e.g., due to passing clouds) may affect the PV generation in the district distribution network. To address this issue, the voltage magnitude constraints under the cloud shading conditions should be taken into account in the optimization model.
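The paper formulates a full optimization, but a linearized rule of thumb illustrates why the voltage magnitude constraint caps PV hosting capacity: injected power raises the voltage at the point of connection roughly in proportion to the feeder impedance. All numbers below are illustrative:

```python
def voltage_rise_pu(p_mw, q_mvar, r_ohm, x_ohm, v_kv):
    """Approximate feeder voltage rise from a power injection:
    dV ~ (P*R + Q*X) / V^2 in per unit (common planning approximation)."""
    return (p_mw * r_ohm + q_mvar * x_ohm) / (v_kv ** 2)

def max_pv_mw(r_ohm, v_kv, dv_limit_pu=0.05):
    """Largest PV injection keeping the voltage rise within the limit,
    assuming unity power factor (Q = 0) and the linearized model above."""
    return dv_limit_pu * v_kv ** 2 / r_ohm

dv = voltage_rise_pu(2.0, 0.0, 3.0, 2.0, 11.0)  # 2 MW on an 11 kV feeder
print(round(dv, 4), round(max_pv_mw(3.0, 11.0), 3))
```

Optimal inverter dispatch relaxes this cap by absorbing reactive power (Q < 0 in the formula), which is one of the levers the paper's optimization model exploits.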

  15. Integration of offshore wind farms into the local distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Youssef, R.D. [and others

    2003-07-01

    This report summarises the results of a study developing static and dynamic models for a doubly-fed induction generator and the integration of the models into the commercially available and widely used power system analysis computer programme IPSA. Details are given of connection studies involving fixed speed, variable speed and double-fed induction machines; the development of optimal power flow and use of the Optimal Power Flow (OPF) tool; and voltage control studies. The system and offshore connection, connection studies and policies, technical problems, stability connection studies for wind farms with synchronous generators and transient stability connection studies for fixed speed and doubly-fed induction generators are discussed along with the integration of OPF into IPSA.

  16. Integration of XNAT/PACS, DICOM, and research software for automated multi-modal image analysis

    Science.gov (United States)

    Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.

    2013-03-01

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.

  17. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  18. An Integrative Behavioral Health Care Model Using Automated SBIRT and Care Coordination in Community Health Care.

    Science.gov (United States)

    Dwinnells, Ronald; Misik, Lauren

    2017-10-01

    Efficient and effective integration of behavioral health programs in a community health care practice emphasizes patient-centered medical home principles to improve quality of care. A prospective, 3-period, interrupted time series study was used to explore which of 3 different integrative behavioral health care screening and management processes were the most efficient and effective in prompting behavioral health screening, identification, interventions, and referrals in a community health practice. A total of 99.5% (P < .001) of medical patients completed behavioral health screenings; brief intervention rates nearly doubled to 83% (P < .001), and 100% (P < .001) of identified at-risk patients had referrals made using a combination of electronic tablets, the electronic medical record, and behavioral health care coordination.

  19. Integration of the Chinese HPC Grid in ATLAS Distributed Computing

    Science.gov (United States)

    Filipčič, A.; ATLAS Collaboration

    2017-10-01

    Fifteen Chinese High-Performance Computing sites, many of them on the TOP500 list of most powerful supercomputers, are integrated into a common infrastructure providing coherent access to a user through an interface based on a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI which translates the authorization and job submission protocols between the two environments. The ARC Computing Element (ARC-CE) forms the bridge using an extended batch system interface to allow job submission to SCEAPI. The ARC-CE was setup at the Institute for High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between ARC-CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte Carlo Simulation in SCEAPI and have been providing CPU power since fall 2015.

  20. Integration of the Chinese HPC Grid in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160; The ATLAS collaboration

    2016-01-01

    Fifteen Chinese High Performance Computing sites, many of them on the TOP500 list of most powerful supercomputers, are integrated into a common infrastructure providing coherent access to a user through an interface based on a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI which translates the authorization and job submission protocols between the two environments. The ARC Computing Element (ARC CE) forms the bridge using an extended batch system interface to allow job submission to SCEAPI. The ARC CE was setup at the Institute for High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between ARC CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte C...

  1. Integration of the Chinese HPC Grid in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081160

    2017-01-01

    Fifteen Chinese High-Performance Computing sites, many of them on the TOP500 list of most powerful supercomputers, are integrated into a common infrastructure providing coherent access to a user through an interface based on a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI which translates the authorization and job submission protocols between the two environments. The ARC Computing Element (ARC-CE) forms the bridge using an extended batch system interface to allow job submission to SCEAPI. The ARC-CE was setup at the Institute for High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between ARC-CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte C...

  2. Automated estimation of leaf distribution for individual trees based on TLS point clouds

    Science.gov (United States)

    Koma, Zsófia; Rutzinger, Martin; Bremer, Magnus

    2017-04-01

    parameters was evaluated as follows: i) the summed area of the collected leaves and of the point cloud, ii) the segmented leaf length-width ratio, and iii) the distribution of leaf area for the segmented and the reference leaves were compared, and the ideal parameter set was found. The results show that the leaves can be captured with the developed workflow and that the slope can be determined robustly for the segmented leaves. However, area, length and width values depend systematically on the angle and the distance from the scanner. To correct this systematic underestimation, more systematic measurements or LiDAR simulation are required for further detailed analysis. The results of the leaf segmentation algorithm show high potential for generating more precise tree models with correctly located leaves, providing more precise input models for biological modelling of LAI or atmospheric correction studies. The presented workflow can also be used to monitor changes in leaf angle due to sun irradiation, water balance, and the day-night rhythm.

  3. SEMI-AUTOMATED APPROACH FOR MAPPING URBAN TREES FROM INTEGRATED AERIAL LiDAR POINT CLOUD AND DIGITAL IMAGERY DATASETS

    Directory of Open Access Journals (Sweden)

    M. A. Dogon-Yaro

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. In contrast to predominant studies on tree extraction, conducted mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated dataset is a suitable technology and viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.

  4. Semi-Automated Approach for Mapping Urban Trees from Integrated Aerial LiDAR Point Cloud and Digital Imagery Datasets

    Science.gov (United States)

    Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. In contrast to predominant studies on tree extraction, conducted mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated dataset is a suitable technology and viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.

  5. Grid Integrated Distributed PV (GridPV) Version 2.

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.
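The PV-plant modeling step that such a toolbox performs can be illustrated with a minimal sketch. This is not GridPV's actual code, and all parameter values (plant rating, temperature coefficient, inverter efficiency) are illustrative assumptions:

```python
# Minimal sketch of a PV plant output model: AC power from plane-of-array
# irradiance, with a temperature derating and an inverter efficiency.
# All parameter values are illustrative assumptions, not GridPV defaults.

def pv_ac_power(irradiance_w_m2, cell_temp_c,
                p_dc_rated_w=1_000_000.0,   # 1 MW plant (assumed)
                gamma_per_c=-0.004,          # power temperature coefficient
                inverter_eff=0.96):
    """Return AC power (W) for given irradiance and cell temperature."""
    # DC power scales linearly with irradiance relative to STC (1000 W/m^2)
    p_dc = p_dc_rated_w * (irradiance_w_m2 / 1000.0)
    # Temperature derating relative to the 25 C STC cell temperature
    p_dc *= 1.0 + gamma_per_c * (cell_temp_c - 25.0)
    return max(0.0, p_dc) * inverter_eff

# A clear noon vs. a cloudy afternoon for the assumed 1 MW plant
noon = pv_ac_power(1000.0, 45.0)
cloudy = pv_ac_power(300.0, 30.0)
print(f"noon: {noon/1e3:.0f} kW, cloudy: {cloudy/1e3:.0f} kW")
```

A time series of such outputs is what would then be attached to the feeder model to study voltage and loading impacts.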

  6. Integrating wind turbines into the Orcas Island distribution system

    Energy Technology Data Exchange (ETDEWEB)

    Zaininger, H.W. [Zaininger Engineering Co., Roseville, CA (United States)

    1998-09-01

This research effort consists of two years of wind data collection and analysis to investigate the possibility of strategically locating a megawatt (MW) scale wind farm near the end of an Orcas Power and Light Company (OPALCO) 25-kilovolt (kV) distribution circuit to defer the need to upgrade the line to 69 kV. The results of this study support the results of previous work in which another year of wind data collection and analysis was performed. Both this study and the previous study show that adding a MW-scale wind farm at the Mt. Constitution site is a feasible alternative to upgrading the OPALCO 25-kV distribution circuit to 69 kV.

  7. High-Penetration PV Integration Handbook for Distribution Engineers

    Energy Technology Data Exchange (ETDEWEB)

    Seguin, Rich [Electrical Distribution Design, Blacksburg, VA (United States); Woyak, Jeremy [Electrical Distribution Design, Blacksburg, VA (United States); Costyk, David [Electrical Distribution Design, Blacksburg, VA (United States); Hambrick, Josh [Electrical Distribution Design, Blacksburg, VA (United States); Mather, Barry [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-01

    This handbook has been developed as part of a five-year research project which began in 2010. The National Renewable Energy Laboratory (NREL), Southern California Edison (SCE), Quanta Technology, Satcon Technology Corporation, Electrical Distribution Design (EDD), and Clean Power Research (CPR) teamed together to analyze the impacts of high-penetration levels of photovoltaic (PV) systems interconnected onto the SCE distribution system. This project was designed specifically to leverage the experience that SCE and the project team would gain during the significant installation of 500 MW of commercial scale PV systems (1-5 MW typically) starting in 2010 and completing in 2015 within SCE’s service territory through a program approved by the California Public Utility Commission (CPUC).

  8. Review of Integration of Distributed Energy Resources (DERs) into Power Systems

    DEFF Research Database (Denmark)

    Wu, Qiuwei; Xu, Zhao

    2011-01-01

An overview of the integration of distributed energy resources (DER) into power systems has been presented in this report. Different aspects of the integration of DER into power systems have been reviewed and discussed, which are listed below: the needs of DER integration into power systems; various state-of-the-art DER integration concepts; and the relation of existing DER integration concepts to the EV system. The power balancing challenges of power systems brought by high penetration of intermittent DER have been discussed, especially wind power integration in the Danish context. The relevance of the integration of electric vehicles (EVs) to the DER integration concepts has been analyzed as well, based on the energy storage potential of EVs. Two main concepts for DER integration, the virtual power plant (VPP) and microgrids, are described and a comparison of the two concepts has been made.

  9. Distributed Energy Neural Network Integration System: Year One Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Regan, T.; Sinnock, H.; Davis, A.

    2003-06-01

    This report describes the work of Orion Engineering Corp. to develop a DER household controller module and demonstrate the ability of a group of these controllers to operate through an intelligent, neighborhood controller. The controllers will provide a smart, technologically advanced, simple, efficient, and economic solution for aggregating a community of small distributed generators into a larger single, virtual generator capable of selling power or other services to a utility, independent system operator (ISO), or other entity in a coordinated manner.

  10. Hardware and Software Integration in Project Development of Automated Controller System Using LABVIEW FPGA

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abd Manan; Mohd Sabri Minhat; Izhar Abu Hussin

    2014-01-01

The Field-Programmable Gate Array (FPGA) is a semiconductor device that can be programmed after manufacturing. Instead of being restricted to any predetermined hardware function, an FPGA allows the user to program product features and functions, adapt to new standards, and reconfigure hardware for specific applications even after the product has been installed in the field, hence the name field-programmable. This project developed a control system using LabVIEW FPGA. LabVIEW FPGA is easier to use because it is programmed by dragging and dropping icons, which are then integrated with the hardware inputs and outputs. (author)

  11. 77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines

    Science.gov (United States)

    2012-06-08

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...

  12. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  13. Comprehensive evaluation of impacts of distributed generation integration in distribution network

    Science.gov (United States)

    Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu

    2018-04-01

Distributed generation (DG), as a supplement to centralized renewable energy utilization, is becoming the focus of the development of renewable energy utilization. With the increasing proportion of DG in the distribution network, the network's power structure, power flow distribution, operation plans and protection are affected to some extent. According to the main impacts of DG, a comprehensive evaluation model of a distribution network with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities in the distribution network are calculated based on the IEEE RBTS-Bus 6 system, and the evaluation result is obtained by the analytic hierarchy process (AHP). The proposed model and method are verified to be effective and valid through a case study.
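The AHP step used to weight such evaluation indices can be sketched as a power iteration on a pairwise-comparison matrix. The comparison values below are invented for illustration and are not taken from the paper:

```python
# Sketch of AHP weight derivation by power iteration on a pairwise
# comparison matrix (values below are illustrative, not from the paper).

def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix, normalized to sum to 1 (the AHP criterion weights)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

# Three hypothetical criteria, e.g. power-flow impact vs. protection
# impact vs. network losses; entry [i][j] says how much more important
# criterion i is than criterion j.
pairwise = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The indices computed for each DG access capacity would then be combined with these weights to give the overall evaluation score.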

  14. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  15. Human-Automation Integration: Principle and Method for Design and Evaluation

    Science.gov (United States)

    Billman, Dorrit; Feary, Michael

    2012-01-01

Future space missions will increasingly depend on integration of complex engineered systems with their human operators. It is important to ensure that the systems that are designed and developed do a good job of supporting the needs of the work domain. Our research investigates methods for needs analysis. We included analysis of work products (plans for regulation of the space station) as well as work processes (tasks using current software), in a case study of Attitude Determination and Control Officers (ADCO) planning work. This allows comparing how well different designs match the structure of the work to be supported. Redesigned planning software that better matches the structure of work was developed and experimentally assessed. The new prototype enabled substantially faster and more accurate performance in plan revision tasks. This success suggests the approach to needs assessment and use in design and evaluation is promising, and merits investigation in future research.

  16. StorNet: Integrated Dynamic Storage and Network Resource Provisioning and Management for Automated Data Transfers

    International Nuclear Information System (INIS)

    Gu Junmin; Natarajan, Vijaya; Shoshani, Arie; Sim, Alex; Katramatos, Dimitrios; Liu Xin; Yu Dantong; Bradley, Scott; McKee, Shawn

    2011-01-01

StorNet is a joint project of Brookhaven National Laboratory (BNL) and Lawrence Berkeley National Laboratory (LBNL) to research, design, and develop an integrated end-to-end resource provisioning and management framework for high-performance data transfers. The StorNet framework leverages heterogeneous network protocols and storage types in a federated computing environment to provide the capability of predictable, efficient delivery of high-bandwidth data transfers for data intensive applications. The framework incorporates functional modules to perform such data transfers through storage and network bandwidth co-scheduling, storage and network resource provisioning, and performance monitoring, and is based on LBNL's BeStMan/SRM, BNL's TeraPaths, and ESnet's OSCARS systems.

  17. Effects of further integration of distributed generation on the electricity market

    NARCIS (Netherlands)

    Frunt, J.; Kling, W.L.; Myrzik, J.M.A.; Nobel, Frank; Klaar, D.A.M.

    2006-01-01

    Environmental concern leads to legislation to stimulate the further integration of renewable energy in the Dutch electricity supply system. Distributed generation is suited for the integration of renewable energy sources. Furthermore it can be used to generate both heat and electricity in a more

  18. Hierarchical predictive control scheme for distributed energy storage integrated with residential demand and photovoltaic generation

    NARCIS (Netherlands)

    Lampropoulos, I.; Garoufalis, P.; van den Bosch, P.P.J.; Kling, W.L.

    2015-01-01

    A hierarchical control scheme is defined for the energy management of a battery energy storage system which is integrated in a low-voltage distribution grid with residential customers and photovoltaic installations. The scope is the economic optimisation of the integrated system by employing

  19. An integrated, multi-vendor distributed data acquisition system

    International Nuclear Information System (INIS)

    Butner, D.N.; Drlik, M.; Meyer, W.H.; Moller, J.M.; Preckshot, G.G.

    1988-01-01

    A distributed data acquisition system that uses various computer hardware and software is being developed to support magnetic fusion experiments at Lawrence Livermore National Laboratory (LLNL). The experimental sequence of operations is controlled by a supervisory program, which coordinates software running on Digital Equipment Corporation (DEC) VAX computers, Hewlett-Packard (HP) UNIX-based workstations, and HP BASIC desktop computers. An interprocess communication system (IPCS) allows programs to communicate with one another in a standard manner regardless of program location in the network or of operating system and hardware differences. We discuss the design and implementation of this data acquisition system with particular emphasis on the coordination model and the IPCS. 5 refs., 3 figs

  20. Integration of Electric Vehicles in Low Voltage Danish Distribution Grids

    DEFF Research Database (Denmark)

    Pillai, Jayakrishnan Radhakrishna; Thøgersen, Paul; Møller, Jan

    2012-01-01

    Electric Vehicles (EVs) are considered as one of the important components of the future intelligent grids. Their role as energy storages in the electricity grid could provide local sustainable solutions to support more renewable energy. In order to estimate the extent of interaction of EVs...... in the electricity grid operation, a careful examination in the local electricity system is essential. This paper investigates the degree of EV penetration and its key influence on the low voltage distribution grids. Three detailed models of residential grids in Denmark are considered as test cases in this study...... it is shown that there is enough head-space on the transformer capacity which can be used to charge many EVs during a day. The overall transformer capability of handling EV loads varies between 6-40% for peak and minimum demand hours, which is dependent on the robustness of the grids. The voltage drops...

  1. Automated, Miniaturized and Integrated Quality Control-on-Chip (QC-on-a-Chip for Advanced Cell Therapy Applications

    Directory of Open Access Journals (Sweden)

    David eWartmann

    2015-09-01

The combination of microfabrication-based technologies with cell biology has laid the foundation for the development of advanced in vitro diagnostic systems capable of evaluating cell cultures under defined, reproducible and standardizable measurement conditions. In the present review we describe recent lab-on-a-chip developments for cell analysis and how these methodologies could improve standard quality control in the field of manufacturing cell-based vaccines for clinical purposes. We highlight in particular the regulatory requirements for advanced cell therapy applications using as an example dendritic cell-based cancer vaccines to describe the tangible advantages of microfluidic devices that overcome most of the challenges associated with automation, miniaturization and integration of cell-based assays. As its main advantage lab-on-a-chip technology allows for precise regulation of culturing conditions, while simultaneously monitoring cell relevant parameters using embedded sensory systems. State-of-the-art lab-on-a-chip platforms for in vitro assessment of cell cultures and their potential future applications for cell therapies and cancer immunotherapy are discussed in the present review.

  2. Power distribution automation

    CERN Document Server

    Das, Biswarup

    2016-01-01

    This comprehensive book provides a detailed description of all the major components of a DA system, including communication infrastructure and analysis tools, and includes extensive international case studies showing how the technology has been implemented in real-world situations.

  3. Attention, spatial integration, and the tail of response time distributions in Stroop task performance

    NARCIS (Netherlands)

    Roelofs, A.P.A.

    2012-01-01

    A few studies have examined selective attention in Stroop task performance through ex-Gaussian analyses of response time (RT) distributions. It has remained unclear whether the tail of the RT distribution in vocal responding reflects spatial integration of relevant and irrelevant attributes, as

  4. 76 FR 22944 - Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity Management...

    Science.gov (United States)

    2011-04-25

    ... oversight program and operating conditions as well as the evolutionary process that distribution system... 20590. Hand Delivery: Docket Management System, Room W12-140, on the ground floor of the West Building... PHMSA-2011-0084] Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity...

  5. Preventing Distribution Grid Congestion by Integrating Indirect Control in a Hierarchical Electric Vehicles Management System

    DEFF Research Database (Denmark)

    Hu, Junjie; Si, Chengyong; Lind, Morten

    2016-01-01

    In this paper, a hierarchical management system is proposed to integrate electric vehicles (EVs) into a distribution grid. Three types of actors are included in the system: Distribution system operators (DSOs), Fleet operators (FOs) and EV owners. In contrast to a typical hierarchical control sys...

  6. Active integration of electric vehicles in the distribution network - theory, modelling and practice

    DEFF Research Database (Denmark)

    Knezovic, Katarina

    an attractive asset for the distribution system operator (DSO). This thesis investigates how EVs can mitigate the self-induced adverse effects and actively help the distribution grid operation, either autonomously or in coordination, e.g., with an EV aggregator. The general framework for EV integration...

  7. Loaded dice in Monte Carlo : importance sampling in phase space integration and probability distributions for discrepancies

    NARCIS (Netherlands)

    Hameren, Andreas Ferdinand Willem van

    2001-01-01

    Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions are a help in the analysis of the efficiency of the Quasi Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of

  8. Network Regulation and Support Schemes - How Policy Interactions Affect the Integration of Distributed Generation

    DEFF Research Database (Denmark)

    Ropenus, Stephanie; Jacobsen, Henrik; Schröder, Sascha Thorsten

    2011-01-01

This article seeks to investigate the interactions between the policy dimensions of support schemes and network regulation and how they affect distributed generation. Firstly, the incentives of distributed generators and distribution system operators are examined. Frequently there exists a trade-off between the incentives for these two market agents to facilitate the integration of distributed generation. Secondly, the interaction of these policy dimensions is analyzed, including case studies based on five EU Member States. Aspects of operational nature and investments in grid and distributed...

  9. Wind integration in self-regulating electric load distributions

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Simon; Wang, Dan; Crawford, Curran; Djilali, Ned [University of Victoria, Department of Mechanical Engineering, Institute for Integrated Energy Systems, STN CSC, Victoria, BC (Canada)

    2012-12-15

    The purpose of this paper is to introduce and assess an alternative method of mitigating short-term wind energy production variability through the control of electric loads. In particular, co-located populations of electric vehicles and heat pumps are targeted to provide regulation-based ancillary services, as the inherent operational flexibility and autonomous device-level control strategy associated with these load-types provide an ideal platform to mitigate enhanced variability within the power system. An optimal control strategy capable of simultaneously balancing these grid-side objectives with those typically expected on the demand-side is introduced. End-use digital communication hardware is used to track and control population dynamics through the development of online aggregate load models equivalent to conventional dispatchable generation. The viability of the proposed load control strategy is assessed through model-based simulations that explicitly track end-use functionality of responsive devices within a power systems analysis typically implemented to observe the effects of integrated wind energy systems. Results indicate that there is great potential for the proposed method to displace the need for increased online regulation reserve capacity in systems considering a high penetration of wind energy, thereby allowing conventional generation to operate more efficiently. (orig.)

  10. Dynamic autofocus for continuous-scanning time-delay-and-integration image acquisition in automated microscopy.

    Science.gov (United States)

    Bravo-Zanoguera, Miguel E; Laris, Casey A; Nguyen, Lam K; Oliva, Mike; Price, Jeffrey H

    2007-01-01

Efficient image cytometry of a conventional microscope slide means rapid acquisition and analysis of 20 gigapixels of image data (at 0.3-microm sampling). The voluminous data motivate increased acquisition speed to enable many biomedical applications. Continuous-motion time-delay-and-integrate (TDI) scanning has the potential to speed image acquisition while retaining sensitivity, but the challenge of implementing high-resolution autofocus operating simultaneously with acquisition has limited its adoption. We develop a dynamic autofocus system for this need using: 1. a "volume camera," consisting of nine fiber optic imaging conduits to charge-coupled device (CCD) sensors, that acquires images in parallel from different focal planes, 2. an array of mixed analog-digital processing circuits that measure the high spatial frequencies of the multiple image streams to create focus indices, and 3. a software system that reads and analyzes the focus data streams and calculates best focus for closed feedback loop control. Our system updates autofocus at 56 Hz (or once every 21 microm of stage travel) to collect sharply focused images sampled at 0.3x0.3 microm(2)/pixel at a stage speed of 2.3 mm/s. The system, tested by focusing in phase contrast and imaging long fluorescence strips, achieves high-performance closed-loop image-content-based autofocus in continuous scanning for the first time.
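The focus-index idea (score each focal plane by its high-spatial-frequency energy, then select the sharpest) can be sketched in a few lines. The tiny synthetic image stack below is an illustrative stand-in for the paper's nine CCD focal planes:

```python
# Sketch of image-content-based autofocus: score each focal plane by the
# energy of its high spatial frequencies (here, squared horizontal pixel
# differences, a simple high-pass measure) and pick the plane with the
# highest score. The synthetic stack stands in for the nine CCD planes.

def focus_index(image):
    """Sum of squared horizontal pixel differences over the image."""
    return sum((row[x + 1] - row[x]) ** 2
               for row in image for x in range(len(row) - 1))

def best_focus(stack):
    """Index of the sharpest plane in a list of images."""
    scores = [focus_index(img) for img in stack]
    return scores.index(max(scores))

# Three tiny planes: blurred, sharp (strong edges), blurred
stack = [
    [[10, 11, 12, 13], [10, 11, 12, 13]],   # gentle gradient: low score
    [[0, 50, 0, 50], [50, 0, 50, 0]],       # strong edges: high score
    [[12, 13, 14, 15], [12, 13, 14, 15]],
]
print(best_focus(stack))  # the middle plane (index 1) is sharpest
```

In the actual system this scoring runs in analog-digital hardware per channel, and the software loop closes feedback on the stage at 56 Hz.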

  11. Methodology for prioritizing projects considering the generation, transmission and distribution integrated planning and the financial restraints; Metodologia para priorizacao de projetos, considerando o planejamento integrado de geracao, transmissao e distribuicao e as restricoes financeiras

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Denis Claudio Cruz de; Andrade, Eduardo Leopoldino de; Pimentel, Elson Luiz de Almeida; Pinto, Everton Barroso [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)

    1995-12-31

This technical report presents a methodology for economic evaluation and prioritization of work designs in the electric system, and the experience of CEMIG, an electric power utility of the State of Minas Gerais, Southeast Brazil, in defining its transmission expansion plan. The concept of integrated expansion projects, involving generation, transmission, distribution, automation and telecommunication works, is presented and discussed. 3 refs., 3 figs., 1 tab.

  12. Automation and integration of components for generalized semantic markup of electronic medical texts.

    Science.gov (United States)

    Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.

  13. Congestion Control Algorithm in Distribution Feeders: Integration in a Distribution Management System

    Directory of Open Access Journals (Sweden)

    Tine L. Vandoorn

    2015-06-01

The increasing share of distributed energy resources poses a challenge to the distribution network operator (DNO) to maintain the current availability of the system while limiting investment costs. Related to this, there is a clear trend of DNOs trying to better monitor their grids by installing a distribution management system (DMS). This DMS enables the DNOs to remotely switch their network or better localize and solve faults. Moreover, the DMS can be used to centrally control the grid assets. Therefore, in this paper, a control strategy is discussed that can be implemented in the DMS for solving congestion problems posed by the increasing share of renewables in the grid. This control strategy controls wind turbines to avoid congestion while limiting the required investment costs, achieving a globally cost-efficient solution. Next to the application and objective of the control, the parameter tuning of the control algorithm is discussed.
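A minimal version of such feeder congestion control, curtailing wind output just enough to bring loading back under its limit, might look like the sketch below. The one-for-one relief assumption and all numeric values are illustrative, not taken from the paper:

```python
# Sketch of a DMS congestion-control rule: if feeder loading exceeds its
# thermal limit, curtail the wind-turbine setpoint just enough to bring
# loading back to the limit. Assumes a simplified radial feeder where
# 1 MW of curtailment relieves 1 MW of loading; values are illustrative.

def curtailment_setpoint(feeder_load_mw, wind_output_mw, feeder_limit_mw):
    """Return the new wind setpoint (MW) after congestion relief."""
    excess = feeder_load_mw - feeder_limit_mw
    if excess <= 0:
        return wind_output_mw          # no congestion: no curtailment
    # Curtail by the excess, bounded below by zero output
    return max(0.0, wind_output_mw - excess)

# Feeder at 11.5 MW against a 10 MW limit: curtail 1.5 of 4 MW of wind
print(curtailment_setpoint(11.5, 4.0, 10.0))  # leaves a 2.5 MW setpoint
```

A central DMS implementation would evaluate such a rule per feeder each control cycle, which is also where the parameter tuning the paper discusses (limits, deadbands, ramp rates) comes in.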

  14. Optimal distribution of integration time for intensity measurements in degree of linear polarization polarimetry.

    Science.gov (United States)

    Li, Xiaobo; Hu, Haofeng; Liu, Tiegen; Huang, Bingjing; Song, Zhanjie

    2016-04-04

We consider the degree of linear polarization (DOLP) polarimetry system, which performs two intensity measurements at orthogonal polarization states to estimate the DOLP. We show that if the total integration time of the intensity measurements is fixed, the variance of the DOLP estimator depends on the distribution of integration time between the two intensity measurements. Therefore, by optimizing the distribution of integration time, the variance of the DOLP estimator can be decreased. In this paper, we obtain the closed-form solution of the optimal distribution of integration time in an approximate way by employing the Delta method and the Lagrange multiplier method. According to the theoretical analyses and real-world experiments, it is shown that the variance of the DOLP estimator can be decreased for any value of DOLP. The method proposed in this paper can effectively decrease the measurement variance and thus statistically improve the measurement accuracy of the polarimetry system.
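The optimization can be illustrated numerically. Assuming shot-noise-limited measurements (variance of each intensity estimate proportional to I/t, an assumption of this sketch, not a statement of the paper's noise model), a grid search over the split of the total time T shows that the optimal split depends on the two intensities; the intensity values are invented for illustration:

```python
# Numerical sketch of optimal integration-time allocation for the DOLP
# estimator P = (I1 - I2) / (I1 + I2). Assumes shot-noise-limited
# estimates with Var(Ik_hat) = Ik / tk, and propagates variance with the
# Delta method: Var(P) ~ (dP/dI1)^2 Var(I1) + (dP/dI2)^2 Var(I2).
# Intensity values below are illustrative assumptions.

def dolp_variance(i1, i2, t1, t2):
    s = i1 + i2
    d1 = 2.0 * i2 / s ** 2      # dP/dI1
    d2 = -2.0 * i1 / s ** 2     # dP/dI2
    return d1 ** 2 * (i1 / t1) + d2 ** 2 * (i2 / t2)

def best_split(i1, i2, total_time, steps=9999):
    """Grid-search the t1 that minimizes Var(P) with t1 + t2 = T."""
    best = min((dolp_variance(i1, i2, t1, total_time - t1), t1)
               for t1 in (total_time * k / steps for k in range(1, steps)))
    return best[1]

# Unequal intensities: the optimal split is not 50/50; the brighter
# channel needs less time because its derivative weight is smaller.
t1_opt = best_split(i1=9.0, i2=1.0, total_time=1.0)
print(round(t1_opt, 3))
```

Under this noise model the grid search agrees with the analytical rule t1/T = sqrt(a)/(sqrt(a)+sqrt(b)) for minimizing a/t1 + b/t2, which is the kind of closed form the Lagrange multiplier method yields.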

  15. Optimal distribution of integration time for intensity measurements in Stokes polarimetry.

    Science.gov (United States)

    Li, Xiaobo; Liu, Tiegen; Huang, Bingjing; Song, Zhanjie; Hu, Haofeng

    2015-10-19

We consider the typical Stokes polarimetry system, which performs four intensity measurements to estimate a Stokes vector. We show that if the total integration time of the intensity measurements is fixed, the variance of the Stokes vector estimator depends on the distribution of the integration time among the four intensity measurements. Therefore, by optimizing the distribution of integration time, the variance of the Stokes vector estimator can be decreased. In this paper, we obtain the closed-form solution of the optimal distribution of integration time by employing the Lagrange multiplier method. According to the theoretical analysis and a real-world experiment, it is shown that the total variance of the Stokes vector estimator can be significantly decreased, by about 40% in the case discussed in this paper. The method proposed in this paper can effectively decrease the measurement variance and thus statistically improve the measurement accuracy of the polarimetric system.

  16. A Multiagent System-Based Protection and Control Scheme for Distribution System With Distributed-Generation Integration

    DEFF Research Database (Denmark)

    Liu, Z.; Su, Chi; Hoidalen, Hans

    2017-01-01

In this paper, a multi-agent system (MAS) based protection and control scheme is proposed to deal with diverse operation conditions in distribution systems due to distributed generation (DG) integration. Based on cooperation between the DG controller and relays, an adaptive protection and control algorithm is designed for converter-based wind turbine DG to limit the influence of infeed fault current. With the consideration of DG control modes, an adaptive relay setting strategy is developed to help protective relays adapt suitable settings to different operation conditions caused by the variations...

  17. Fuzzy comprehensive evaluation for grid-connected performance of integrated distributed PV-ES systems

    Science.gov (United States)

    Lv, Z. H.; Li, Q.; Huang, R. W.; Liu, H. M.; Liu, D.

    2016-08-01

Based on a discussion of the topology structure of integrated distributed photovoltaic (PV) power generation and energy storage (ES) systems, of single or mixed type, this paper focuses on analyzing the grid-connected performance of integrated distributed photovoltaic and energy storage (PV-ES) systems and proposes a comprehensive evaluation index system. A multi-level fuzzy comprehensive evaluation method based on grey correlation degree is then proposed, and the calculations for the weight matrix and fuzzy matrix are presented step by step. Finally, a distributed integrated PV-ES power generation system connected to a 380 V low-voltage distribution network is taken as the example, and some suggestions are made based on the evaluation results.
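The core fuzzy-composition step of such an evaluation can be sketched as a weight-vector times membership-matrix product. The criteria, weights and membership degrees below are invented for illustration and are not the paper's index system:

```python
# Sketch of one level of fuzzy comprehensive evaluation: combine a
# criterion weight vector W with a fuzzy membership matrix R (rows =
# criteria, columns = rating grades) via B = W * R, then pick the grade
# with the largest membership. All numbers are illustrative assumptions.

def fuzzy_evaluate(weights, membership):
    """Return the composite membership vector B = W * R."""
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

weights = [0.5, 0.3, 0.2]            # e.g. voltage, harmonics, reliability
membership = [                        # grades: good / fair / poor
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
    [0.6, 0.3, 0.1],
]
b = fuzzy_evaluate(weights, membership)
grades = ["good", "fair", "poor"]
print(grades[b.index(max(b))])        # grade with largest membership
```

In a multi-level scheme, the composite vector B of each sub-index group becomes one row of the membership matrix at the level above, with the grey correlation degree informing the weights.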

  18. An integrated DEA-COLS-SFA algorithm for optimization and policy making of electricity distribution units

    International Nuclear Information System (INIS)

    Azadeh, A.; Ghaderi, S.F.; Omrani, H.; Eivazy, H.

    2009-01-01

    This paper presents an integrated data envelopment analysis (DEA)-corrected ordinary least squares (COLS)-stochastic frontier analysis (SFA)-principal component analysis (PCA)-numerical taxonomy (NT) algorithm for performance assessment, optimization and policy making of electricity distribution units. Previous studies have generally used input-output DEA models for benchmarking and evaluation of electricity distribution units. However, this study proposes an integrated, flexible approach to rank the units and choose the best version of the DEA method for optimization and policy making purposes. It covers both static and dynamic aspects of the information environment due to the involvement of SFA, which is finally compared with the best DEA model through the Spearman correlation technique. The integrated approach yields improved ranking and optimization of electricity distribution systems. To illustrate the usability and reliability of the proposed algorithm, 38 electricity distribution units in Iran have been considered, ranked and optimized by the proposed algorithm of this study.
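The DEA component of such benchmarking studies is a linear program per unit. A minimal sketch of the classic input-oriented CCR envelopment model (toy single-input/single-output data, not the paper's 38 Iranian units):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR DEA efficiency of unit j0.
    X: (m_inputs, n_units), Y: (s_outputs, n_units)."""
    m, n = X.shape
    s = Y.shape[0]
    # decision vector: [theta, lambda_1 .. lambda_n]
    c = np.zeros(n + 1); c[0] = 1.0              # minimise theta
    A_ub = np.zeros((m + s, n + 1)); b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]                       # -theta * x_0 ...
    A_ub[:m, 1:] = X                              # ... + sum_j lambda_j x_j <= 0
    A_ub[m:, 1:] = -Y                             # sum_j lambda_j y_j >= y_0
    b_ub[m:] = -Y[:, j0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

X = np.array([[2.0, 4.0, 5.0]])   # one input, three hypothetical units
Y = np.array([[2.0, 2.0, 4.0]])   # one output
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])  # -> [1.0, 0.5, 0.8]
```

Unit 2 (input 4, output 2) is dominated by unit 1, hence efficiency 0.5; the COLS/SFA stages of the paper then fit parametric frontiers to the same data.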

  19. Advanced Communication and Control for Distributed Energy Resource Integration: Phase 2 Scientific Report

    Energy Technology Data Exchange (ETDEWEB)

    BPL Global

    2008-09-30

    The objective of this research project is to demonstrate sensing, communication, information and control technologies to achieve a seamless integration of multivendor distributed energy resource (DER) units at aggregation levels that meet individual user requirements for facility operations (residential, commercial, industrial, manufacturing, etc.) and further serve as resource options for electric and natural gas utilities. The fully demonstrated DER aggregation system with embodiment of communication and control technologies will lead to real-time, interactive, customer-managed service networks to achieve greater customer value. Work on this Advanced Communication and Control Project (ACCP) consists of a two-phase approach for an integrated demonstration of communication and control technologies to achieve a seamless integration of DER units to reach progressive levels of aggregated power output. Phase I involved design and proof-of-design, and Phase II involves real-world demonstration of the Phase I design architecture. The scope of work for Phase II of this ACCP involves demonstrating the Phase I design architecture in large scale real-world settings while integrating with the operations of one or more electricity supplier feeder lines. The communication and control architectures for integrated demonstration shall encompass combinations of software and hardware components, including: sensors, data acquisition and communication systems, remote monitoring systems, metering (interval revenue, real-time), local and wide area networks, Web-based systems, smart controls, energy management/information systems with control and automation of building energy loads, and demand-response management with integration of real-time market pricing. 
For Phase II, BPL Global shall demonstrate the Phase I design for integrating and controlling the operation of more than 10 DER units, dispersed at various locations in one or more Independent System Operator (ISO) Control Areas, at

  20. Smart thermal grid with integration of distributed and centralized solar energy systems

    International Nuclear Information System (INIS)

    Yang, Libing; Entchev, Evgueniy; Rosato, Antonio; Sibilio, Sergio

    2017-01-01

    Smart thermal grids (STGs) are able to perform the same function as classical grids, but are developed in order to make better use of distributed, possibly intermittent, thermal energy resources and to provide the required energy when needed through efficient resources utilization and intelligent management. District heating (DH) plays a significant role in the implementation of future smart energy systems. To fulfil its role, DH technologies must be further developed to integrate renewable resources, create low-temperature networks, and consequently to make existing or new DH networks ready for integration into future STGs. Solar heating is a promising option for low-temperature DH systems. Thermal energy storage (TES) can make the availability of the energy supply match the demand. An integration of centralized seasonal and distributed short-term thermal storages would facilitate an efficient recovery of the solar energy. This study, through modelling and simulation, investigates the impacts of such integration on the overall performance of a community-level solar DH system. The performance analysis results show that the solar DH system with integration of distributed and centralized seasonal TESs improves system overall efficiency, and reduces DH network heat losses, primary energy consumption and greenhouse gas emissions, in comparison to the one without integration. - Highlights: • STG should be designed to store energy in the most efficient way at the most effective location. • Integration of centralized seasonal and distributed TESs in a solar DH system is proposed. • Performance of such integrated solar DH system is evaluated and compared to the one without. • The integration results in reduction of primary energy consumption and GHG emission. • The integration improves the overall efficiency of the total solar energy system.

  1. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, where simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of necessary data at suitable frequency between different simulation

  2. Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid.

    Science.gov (United States)

    Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul

    2017-02-01

    Internet of Things (IoT) provides a generic infrastructure for different applications to integrate information communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, as one of the typical applications supported by IoT, denoted as a re-engineering and a modernization of the traditional power grid, aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing the power generation and transmission cost, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of the power cumulative cost and the service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, which has been traditionally used for carrying out traffic engineering in computer networks, to derive the bounds of both power supply and user demand to achieve a high service reliability to users. Through an extensive performance evaluation, our data show that integrating distributed energy resources conjointly with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide a sustainable service reliability to users in the power grid.
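Network calculus, mentioned in the abstract, bounds backlog and delay from an arrival curve and a service curve. A toy sketch with a token-bucket arrival curve and a rate-latency service curve (illustrative parameters; the paper applies the same machinery to power supply and demand rather than packets):

```python
import numpy as np

# Token-bucket arrival curve alpha(t) = sigma + rho*t and
# rate-latency service curve beta(t) = R * max(t - T, 0).
sigma, rho = 5.0, 2.0     # burst size, sustained rate
R, T = 4.0, 1.5           # service rate, service latency (rho <= R)

t = np.linspace(0.0, 50.0, 200001)
alpha = sigma + rho * t
beta = R * np.maximum(t - T, 0.0)

# Classical bounds: backlog <= sup_t [alpha(t) - beta(t)] = sigma + rho*T,
# delay <= T + sigma/R (horizontal deviation between the curves).
backlog_bound = np.max(alpha - beta)
delay_bound = T + sigma / R
print(backlog_bound, delay_bound)  # backlog ~ 8.0 (= sigma + rho*T), delay = 2.75
```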

  3. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  4. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integration into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after distributed photovoltaic (DPV) integration under different weather conditions, mainly sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in different typical weather conditions was acquired via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the results for the probabilistic distribution of harmonic voltage content as well as THD in typical weather conditions were compared.
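The Monte Carlo step described in the abstract can be sketched in miniature: draw PV output from a fitted weather-dependent distribution and propagate it to a THD estimate. Everything below (the Beta parameters, the harmonic ratios, the linear scaling of harmonics with output) is illustrative, not fitted data from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Per-unit PV output for one weather scenario, modelled with a Beta
# distribution (parameters are illustrative stand-ins for the paper's
# maximum-likelihood fits).
p = rng.beta(2.0, 5.0, size=100_000)

# Assume harmonic voltage magnitudes scale with output: 3% (5th), 2% (7th)
# and 1% (11th) of the fundamental at full output (illustrative ratios).
h = np.array([0.03, 0.02, 0.01])
V1 = 1.0
thd = np.sqrt(np.sum(np.outer(p, h) ** 2, axis=1)) / V1

# Empirical THD distribution across the Monte Carlo samples
print(np.percentile(thd, [50, 95]))
```

A full study would replace the linear scaling with a harmonic load flow per sample, as done on the IEEE 33-bus system in the paper.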

  5. Integrated operation of electric vehicles and renewable generation in a smart distribution system

    International Nuclear Information System (INIS)

    Zakariazadeh, Alireza; Jadid, Shahram; Siano, Pierluigi

    2015-01-01

    Highlights: • The contribution of electric vehicles to providing reserve capacity is analyzed. • Decentralized energy and reserve scheduling in a distribution system is presented. • The integrated operation of renewable generation and electric vehicles is proposed. - Abstract: Distribution system complexity is increasing mainly due to technological innovation, renewable Distributed Generation (DG) and responsive loads. This complexity makes the monitoring, control and operation of distribution networks difficult for Distribution System Operators (DSOs). In order to cope with this complexity, a novel method for the integrated operational planning of a distribution system is presented in this paper. The method introduces the figure of the aggregator, conceived as an intermediate agent between end-users and DSOs. In the proposed method, energy and reserve scheduling is carried out by both aggregators and the DSO. Moreover, Electric Vehicles (EVs) are considered as responsive loads that can participate in ancillary service programs by providing reserve to the system. The efficiency of the proposed method is evaluated on an 84-bus distribution test system. Simulation results show that the integrated scheduling of EVs and renewable generators can mitigate the negative effects related to the uncertainty of renewable generation.

  6. Generative Adversarial Networks Based Heterogeneous Data Integration and Its Application for Intelligent Power Distribution and Utilization

    Directory of Open Access Journals (Sweden)

    Yuanpeng Tan

    2018-01-01

    Full Text Available Heterogeneous characteristics of big data systems for intelligent power distribution and utilization have become more and more prominent, which brings new challenges for traditional data analysis technologies and restricts the comprehensive management of distribution network assets. In order to solve the problem that heterogeneous data resources of power distribution systems are difficult to utilize effectively, a novel generative adversarial networks (GANs) based heterogeneous data integration method for intelligent power distribution and utilization is proposed. In the proposed method, GANs theory is introduced to expand the distribution of complete data samples. Then, a so-called peak clustering algorithm is proposed to realize a finite open coverage of the expanded sample space and repair the incomplete samples, eliminating the heterogeneous characteristics. Finally, in order to realize the integration of the heterogeneous data for intelligent power distribution and utilization, the well-trained discriminator model of the GANs is employed to check the restored data samples. The simulation experiments verified the validity and stability of the proposed heterogeneous data integration method, which provides a novel perspective for the further data quality management of power distribution systems.

  7. DISCO - A concept of a system for integrated data base management in distributed data processing systems

    International Nuclear Information System (INIS)

    Holler, E.

    1980-01-01

    The development of data processing technology favors the trend towards distributed data processing systems: the large-scale integration of semiconductor devices has led to very efficient (approx. 10⁶ operations per second) and relatively cheap low-end computers being offered today, allowing distributed data processing systems to be installed with a total capacity approaching that of large-scale data processing plants at a tolerable investment expenditure. The technologies of communication and data banks have each reached a state of development justifying their routine application, as evidenced by the present efforts for standardization in both areas. The integration of both technologies in the development of systems for integrated distributed data bank management, however, is new territory for engineering. (orig.) [de]

  8. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    Science.gov (United States)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  9. System Integration of Distributed Power for Complete Building Systems: Phase 2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, R.

    2003-12-01

    This report describes NiSource Energy Technologies Inc.'s second year of a planned 3-year effort to advance distributed power development, deployment, and integration. Its long-term goal is to design ways to extend distributed generation into the physical design and controls of buildings. NET worked to meet this goal through advances in the implementation and control of combined heat and power systems in end-user environments and a further understanding of electric interconnection and siting issues. The specific objective of work under this subcontract is to identify the system integration and implementation issues of DG and develop and test potential solutions to these issues. In addition, recommendations are made to resolve identified issues that may hinder or slow the integration of integrated energy systems into the national energy picture.

  10. Integration of refrigerators into facility automation with the aid of bus systems; Integration der Kaeltetechnik in die Gebaeudeautomation mit Bus-Systemen

    Energy Technology Data Exchange (ETDEWEB)

    Baumgarth, S.; Heiser, M. [Fachhochschule Braunschweig-Wolfenbuettel, Wolfenbuettel (DE). Inst. fuer Verbrennungstechnik und Prozessautomation (IVP)

    2000-07-01

    The integration of refrigeration systems in building automation is illustrated by the examples of a ventilation system and a cooling ceiling. Cold is supplied by a cold-water unit in an energy-optimized way, which necessitates making consumption data from the user side available in the central refrigeration plant. So far, technical facilities have commonly been controlled by DDC systems and manufacturer-specific bus systems. The demand for open communication between different systems resulted in the development of vendor-neutral systems like FND and Profibus, which were followed by a multitude of other, less generally accepted systems. In the field of electrical installations, the European Installation Bus (EIB) gained general acceptance as a certified, open bus system which can be combined with DDC technology and integrated into in-house control systems. Another technology, developed in the USA, is the Local Operating Network (LON), which has a higher transmission rate and a higher information content for the various bus members. The contribution compares the two bus systems. [German] The integration of refrigeration systems in building automation is presented using the examples of a ventilation system and a cooling ceiling. The required cold is to be provided in an energy-optimized way by a cold-water unit; for this, information from the consumer area must be available in the refrigeration plant. Air-handling (RLT) systems were previously controlled using DDC technology and manufacturer-specific bus systems. When different plants were operated by DDC systems from different manufacturers, joint monitoring from a central control room was initially possible only via gateways. The demand for open communication between different systems gave rise to the first vendor-neutral developments, FND and Profibus. A multitude of further network and bus definitions and their communication protocols followed, most of which, however, failed to gain general acceptance. In the field of electrical installation, the

  11. A Gordeyev integral for electrostatic waves in a magnetized plasma with a kappa velocity distribution

    International Nuclear Information System (INIS)

    Mace, R.L.

    2003-01-01

    A Gordeyev-type integral for the investigation of electrostatic waves in magnetized plasma having a kappa or generalized Lorentzian velocity distribution is derived. The integral readily reduces, in the unmagnetized and parallel propagation limits, to simple expressions involving the Z_κ function. For propagation perpendicular to the magnetic field, it is shown that the Gordeyev integral can be written in closed form as a sum of two generalized hypergeometric functions, which permits easy analysis of the dispersion relation for electrostatic waves. Employing the same analytical techniques used for the kappa distribution, it is further shown that the well-known Gordeyev integral for a Maxwellian distribution can be written very concisely as a generalized hypergeometric function in the limit of perpendicular propagation. This expression, in addition to its mathematical conciseness, has other advantages over the traditional sum over modified Bessel functions form. Examples of the utility of these generalized hypergeometric series, especially how they simplify analyses of electrostatic waves propagating perpendicular to the magnetic field, are given. The new expression for the Gordeyev integral for perpendicular propagation is solved numerically to obtain the dispersion relations for the electrostatic Bernstein modes in a plasma with a kappa distribution
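The "traditional sum over modified Bessel functions" form mentioned in the abstract carries exponentially weighted terms I_n(λ)e^{-λ}; their normalisation identity Σ_n I_n(λ)e^{-λ} = 1 is the reason such sums converge quickly when truncated. A quick numerical check of that identity (not the paper's hypergeometric expression):

```python
from scipy.special import ive  # ive(n, x) = I_n(x) * exp(-x), scaled Bessel

# Check sum_{n=-inf}^{inf} I_n(lam) * exp(-lam) = 1, truncated at |n| <= N.
lam = 2.5   # illustrative value of the perpendicular argument lambda
N = 50
total = sum(ive(n, lam) for n in range(-N, N + 1))
print(total)  # ~ 1.0 to machine precision
```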

  12. INTEGRATING DISTRIBUTED WORK: COMPARING TASK DESIGN, COMMUNICATION, AND TACIT COORDINATION MECHANISMS

    DEFF Research Database (Denmark)

    Srikanth, K.; Puranam, P.

    2011-01-01

    We investigate coordination strategies in integrating distributed work. In the context of Business Process Offshoring (BPO), we analyze survey data from 126 offshored processes to understand both the sources of difficulty in integrating distributed work as well as how organizations overcome...... on tacit coordination-and theoretically articulate and empirically show that tacit coordination mechanisms are distinct from the well-known duo of coordination strategies: building communication channels or modularizing processes to minimize the need for communication. We discuss implications for the study...

  13. High-dimensional quantum key distribution based on multicore fiber using silicon photonic integrated circuits

    DEFF Research Database (Denmark)

    Ding, Yunhong; Bacco, Davide; Dalgaard, Kjeld

    2017-01-01

    is intrinsically limited to 1 bit/photon. Here we propose and experimentally demonstrate, for the first time, a high-dimensional quantum key distribution protocol based on space division multiplexing in multicore fiber using silicon photonic integrated lightwave circuits. We successfully realized three mutually......-dimensional quantum states, and enables breaking the information efficiency limit of traditional quantum key distribution protocols. In addition, the silicon photonic circuits used in our work integrate variable optical attenuators, highly efficient multicore fiber couplers, and Mach-Zehnder interferometers, enabling...

  14. Human automation integration

    NARCIS (Netherlands)

    Barnes, M.; Cosenzo, K.; Galster, s.; Hollnagel, E.; Miller, C.; Parasuraman, R.; Reising, J.; Taylor, R.; Breda, L. van

    2007-01-01

    Many versions of future concept of operations (CONOPS) rely heavily on UMVs. The pressure to take the human out of immediate control of these vehicles is being driven by several factors. These factors include a reduction in cost for the production and maintenance of the vehicle, operational

  15. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining......In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially......-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed physically-based, integrated hydrological models. A revised version of the generalized likelihood...
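The GLUE procedure referenced in this record can be illustrated generically: sample parameters, score each sample with an informal likelihood, keep the "behavioural" sets, and form likelihood-weighted estimates. Everything below is a toy stand-in (a linear model, a uniform prior, a Nash-Sutcliffe likelihood), not the paper's MCMC-based variant or its hydrological model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model y = a*x with synthetic observations (a_true = 2).
x = np.linspace(0.0, 1.0, 20)
y_obs = 2.0 * x + rng.normal(0.0, 0.05, x.size)

# 1) Monte Carlo sampling of the parameter from a uniform prior.
a = rng.uniform(0.0, 4.0, 5000)
sims = a[:, None] * x[None, :]

# 2) Informal likelihood: Nash-Sutcliffe efficiency of each parameter set.
sse = np.sum((sims - y_obs) ** 2, axis=1)
nse = 1.0 - sse / np.sum((y_obs - y_obs.mean()) ** 2)

# 3) Retain "behavioural" sets above a threshold; normalise to weights.
keep = nse > 0.5
w = nse[keep] / nse[keep].sum()

# 4) Likelihood-weighted posterior summary of the parameter.
a_mean = float(np.sum(w * a[keep]))
print(round(a_mean, 2))
```

The same weights would be used to form likelihood-weighted quantiles of the simulated outputs, which is how GLUE produces predictive uncertainty bands.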

  16. Distributed finite-time containment control for double-integrator multiagent systems.

    Science.gov (United States)

    Wang, Xiangyu; Li, Shihua; Shi, Peng

    2014-09-01

    In this paper, the distributed finite-time containment control problem for double-integrator multiagent systems with multiple leaders and external disturbances is discussed. In the presence of multiple dynamic leaders, by utilizing the homogeneous control technique, a distributed finite-time observer is developed for the followers to estimate the weighted average of the leaders' velocities at first. Then, based on the estimates and the generalized adding a power integrator approach, distributed finite-time containment control algorithms are designed to guarantee that the states of the followers converge to the dynamic convex hull spanned by those of the leaders in finite time. Moreover, as a special case of multiple dynamic leaders with zero velocities, the proposed containment control algorithms also work for the case of multiple stationary leaders without using the distributed observer. Simulations demonstrate the effectiveness of the proposed control algorithms.
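The containment objective can be demonstrated with a minimal simulation. The sketch below uses a *linear, asymptotic* containment law for two double-integrator followers and two stationary leaders (the special case noted at the end of the abstract); the paper's finite-time algorithm additionally applies fractional-power ("adding a power integrator") terms, which are omitted here:

```python
import numpy as np

# 1-D example: stationary leaders at 0 and 1; followers start outside [0, 1].
leaders = np.array([0.0, 1.0])
x = np.array([-3.0, 5.0])   # follower positions
v = np.zeros(2)             # follower velocities
dt, steps = 0.005, 20000    # explicit Euler integration

for _ in range(steps):
    u = np.zeros(2)
    for i in range(2):
        # neighbours: both leaders (zero velocity) and the other follower
        u[i] -= sum((x[i] - xl) + (v[i] - 0.0) for xl in leaders)
        j = 1 - i
        u[i] -= (x[i] - x[j]) + (v[i] - v[j])
    x = x + dt * v
    v = v + dt * u

print(x)  # both followers end inside the leaders' convex hull [0, 1]
```

With this symmetric graph both followers settle at the hull's midpoint; the finite-time law reaches the hull in a bounded settling time rather than asymptotically.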

  17. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  18. Management functions for a distributed system applied in power plants and substations automation; Funcoes de gerenciamento para um sistema distribuido aplicado na automacao de usinas e subestacoes

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Rogerio Sergio Neves de

    1988-04-01

    This work presents the management functions for a distributed computer control system developed at Centro de Pesquisas de Energia Eletrica (CEPEL) to automate hydroelectric power plants and extra-high-voltage power substations. The initial specifications are based on an architecture without redundancies, whose initial objective is to gain effective knowledge of the system's behaviour. The first part gives an introduction to dependability, with two aims: to present the basic concepts and terminology, and to present a set of techniques in the area of computing-system fault tolerance. In the second part, the control system is introduced in terms of its architecture and functionality. The third part presents the management activities for the distributed system together with a general model. Finally, fault diagnosis in the distributed system is discussed and a new diagnosis algorithm is presented. (author)

  20. Building Automation and Control Systems and Electrical Distribution Grids: A Study on the Effects of Loads Control Logics on Power Losses and Peaks

    Directory of Open Access Journals (Sweden)

    Salvatore Favuzza

    2018-03-01

    Full Text Available Growing home comfort is causing increasing energy consumption in residential buildings and a consequent stress in urban medium and low voltage distribution networks. Therefore, distribution system operators are obliged to manage problems related to the reliability of the electricity system and, above all, they must consider investments for enhancing the electrical infrastructure. The purpose of this paper is to assess how the reduction of building electricity consumption and the modification of the building load profile, due to load automation, combined with suitable load control programs, can improve network reliability and distribution efficiency. This paper proposes an extensive study on this issue, considering various operating scenarios with four load control programs with different purposes, the presence/absence of local generation connected to the buildings and different external thermal conditions. The study also highlights how different climatic conditions can influence the effects of the load control logics.

  1. Multisensor Distributed Track Fusion Algorithm Based on Strong Tracking Filter and Feedback Integration

    Institute of Scientific and Technical Information of China (English)

    YANG Guo-Sheng; WEN Cheng-Lin; TAN Min

    2004-01-01

    A new multisensor distributed track fusion algorithm is put forward based on combining the feedback integration with the strong tracking Kalman filter. Firstly, an effective tracking gate is constructed by taking the intersection of the tracking gates formed before and after feedback. Secondly, on the basis of the constructed effective tracking gate, probabilistic data association and strong tracking Kalman filter are combined to form the new multisensor distributed track fusion algorithm. At last, simulation is performed on the original algorithm and the algorithm presented.
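At the core of any distributed track fusion scheme is the combination of local estimates weighted by their covariances. A minimal sketch of that building block for two independent sensor tracks (the paper's algorithm layers strong-tracking filtering, gating and probabilistic data association on top of this; the numbers are illustrative):

```python
import numpy as np

# Two independent local track estimates (state, covariance) of one target.
x1, P1 = np.array([10.2]), np.array([[1.0]])   # sensor-1 estimate
x2, P2 = np.array([9.6]),  np.array([[0.5]])   # sensor-2 estimate

# Information-weighted fusion: P_f = (P1^-1 + P2^-1)^-1,
# x_f = P_f (P1^-1 x1 + P2^-1 x2). Valid under the independence assumption.
I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
Pf = np.linalg.inv(I1 + I2)
xf = Pf @ (I1 @ x1 + I2 @ x2)
print(float(xf[0]), float(Pf[0, 0]))  # fused state leans toward the lower-variance track
```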

  2. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  3. Integration of quantum key distribution and private classical communication through continuous variable

    Science.gov (United States)

    Wang, Tianyi; Gong, Feng; Lu, Anjiang; Zhang, Damin; Zhang, Zhengping

    2017-12-01

    In this paper, we propose a scheme that integrates quantum key distribution and private classical communication via continuous variables. The integrated scheme employs both quadratures of a weak coherent state, with encrypted bits encoded on the signs and Gaussian random numbers encoded on the values of the quadratures. The integration enables quantum and classical data to share the same physical and logical channel. Simulation results based on practical system parameters demonstrate that both classical communication and quantum communication can be implemented over distances of tens of kilometers, thus providing a potential solution for simultaneous transmission of quantum communication and classical communication.
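The sign/value encoding described in the abstract can be illustrated with a toy noiseless-channel model: the sign of a quadrature carries an encrypted classical bit, while its magnitude carries the Gaussian modulation used for key distribution. All parameters below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sender: classical bits on the *signs*, Gaussian key material on the *values*.
bits = rng.integers(0, 2, 16)            # encrypted classical bits
g = np.abs(rng.normal(0.0, 2.0, 16))     # |Gaussian| modulation magnitudes
quadratures = np.where(bits == 1, g, -g) # sign carries the bit

# Receiver (noiseless channel): sign -> classical bit, magnitude -> QKD data.
bits_rx = (quadratures > 0).astype(int)
values_rx = np.abs(quadratures)
print(np.array_equal(bits_rx, bits), np.allclose(values_rx, g))  # -> True True
```

In the actual scheme, channel loss and excess noise would corrupt both streams, which is why the paper's simulated distances are bounded at tens of kilometers.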

  4. Effects of age condition on the distribution and integrity of inorganic fillers in dental resin composites.

    Science.gov (United States)

    D'Alpino, Paulo Henrique Perlatti; Svizero, Nádia da Rocha; Bim Júnior, Odair; Valduga, Claudete Justina; Graeff, Carlos Frederico de Oliveira; Sauro, Salvatore

    2016-06-01

    The aim of this study was to evaluate the filler size distribution, the zeta potential, and the integrity of the silane-bonded filler surface in different types of restorative dental composites as a function of the material age condition. Filtek P60 (hybrid composite), Filtek Z250 (small-particle filled composite), Filtek Z350XT (nanofilled composite), and Filtek Silorane (silorane composite) (3M ESPE) were tested at different age conditions (i.e., fresh/new, aged, and expired). Composites were submitted to an accelerated aging protocol (Arrhenius model). Specimens were obtained by first diluting each composite in ethanol and then dispersing it in potassium chloride solution (0.001 mol%). Composite fillers were characterized for their zeta potential, mean particle size, and size distribution via poly-dispersion dynamic light scattering. The integrity of the silane-bonded surface of the fillers was characterized by FTIR. Material age significantly influenced the outcomes: zeta potential, filler characteristics, and silane integrity varied both after aging and after expiration. Silorane presented the broadest filler distribution and the lowest zeta potential. The nanofilled and silorane composites exhibited decreased peak intensities in the FTIR analysis, indicating a deficiency of silane integrity after aging or expiry. Regardless of the material condition, the hybrid and small-particle-filled composites were more stable over time, as no significant alteration in filler size distribution, diameter, or zeta potential occurred. A deficiency in silane integrity in the nanofilled and silorane composites appears to be affected by the material age condition. The material conditions tested in this study influenced the filler size distribution, the zeta potential, and the integrity of the silane adsorbed on fillers in the nanofilled and silorane composites. Thus, this may result in a decrease of the clinical performance of the aforementioned composites, in

  5. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    International Nuclear Information System (INIS)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven; Kavanaugh, James; Harold Li, H.; Altman, Michael; Gay, Hiram; Thorstad, Wade L.; Mutic, Sasa; Li, Hua; Anastasio, Mark A.; Low, Daniel A.

    2015-01-01

    Purpose: One of the most critical steps in radiation therapy treatment is accurate tumor and critical organ-at-risk (OAR) contouring. Both manual and automated contouring processes are prone to errors and to a large degree of inter- and intraobserver variability. These are often due to the limitations of imaging techniques in visualizing human anatomy as well as to inherent anatomical variability among individuals. Physicians/physicists have to reverify all the radiation therapy contours of every patient before using them for treatment planning, which is tedious, laborious, and still not an error-free process. In this study, the authors developed a general strategy based on novel geometric attribute distribution (GAD) models to automatically detect radiation therapy OAR contouring errors and facilitate the current clinical workflow. Methods: Considering the radiation therapy structures’ geometric attributes (centroid, volume, and shape), the spatial relationship of neighboring structures, as well as anatomical similarity of individual contours among patients, the authors established GAD models to characterize the interstructural centroid and volume variations, and the intrastructural shape variations of each individual structure. The GAD models are scalable and deformable, and constrained by their respective principal attribute variations calculated from training sets with verified OAR contours. A new iterative weighted GAD model-fitting algorithm was developed for contouring error detection. Receiver operating characteristic (ROC) analysis was employed in a unique way to optimize the model parameters to satisfy clinical requirements. A total of forty-four head-and-neck patient cases, each of which includes nine critical OAR contours, were utilized to demonstrate the proposed strategy. Twenty-nine out of these forty-four patient cases were utilized to train the inter- and intrastructural GAD models. These training data and the remaining fifteen testing data sets

  6. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven; Kavanaugh, James; Harold Li, H.; Altman, Michael; Gay, Hiram; Thorstad, Wade L.; Mutic, Sasa; Li, Hua, E-mail: huli@radonc.wustl.edu [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Anastasio, Mark A. [Department of Biomedical Engineering, Washington University, St. Louis, Missouri 63110 (United States); Low, Daniel A. [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States)

    2015-02-15

    Purpose: One of the most critical steps in radiation therapy treatment is accurate tumor and critical organ-at-risk (OAR) contouring. Both manual and automated contouring processes are prone to errors and to a large degree of inter- and intraobserver variability. These are often due to the limitations of imaging techniques in visualizing human anatomy as well as to inherent anatomical variability among individuals. Physicians/physicists have to reverify all the radiation therapy contours of every patient before using them for treatment planning, which is tedious, laborious, and still not an error-free process. In this study, the authors developed a general strategy based on novel geometric attribute distribution (GAD) models to automatically detect radiation therapy OAR contouring errors and facilitate the current clinical workflow. Methods: Considering the radiation therapy structures’ geometric attributes (centroid, volume, and shape), the spatial relationship of neighboring structures, as well as anatomical similarity of individual contours among patients, the authors established GAD models to characterize the interstructural centroid and volume variations, and the intrastructural shape variations of each individual structure. The GAD models are scalable and deformable, and constrained by their respective principal attribute variations calculated from training sets with verified OAR contours. A new iterative weighted GAD model-fitting algorithm was developed for contouring error detection. Receiver operating characteristic (ROC) analysis was employed in a unique way to optimize the model parameters to satisfy clinical requirements. A total of forty-four head-and-neck patient cases, each of which includes nine critical OAR contours, were utilized to demonstrate the proposed strategy. Twenty-nine out of these forty-four patient cases were utilized to train the inter- and intrastructural GAD models. These training data and the remaining fifteen testing data sets

  7. Multi-agent based modeling for electric vehicle integration in a distribution network operation

    DEFF Research Database (Denmark)

    Hu, Junjie; Morais, Hugo; Lind, Morten

    2016-01-01

    The purpose of this paper is to present a multi-agent based modeling technology for simulating and operating a hierarchical energy management of a power distribution system with focus on EVs integration. The proposed multi-agent system consists of four types of agents: i) Distribution system operator (DSO) technical agents and ii) DSO market agents, which both belong to the top layer of the hierarchy and whose roles are to manage the distribution network by avoiding grid congestions and using congestion prices to coordinate the energy scheduled; iii) Electric vehicle virtual power plant agents...

  8. Predicting cycle time distributions for integrated processing workstations : an aggregate modeling approach

    NARCIS (Netherlands)

    Veeger, C.P.L.; Etman, L.F.P.; Lefeber, A.A.J.; Adan, I.J.B.F.; Herk, van J.; Rooda, J.E.

    2011-01-01

    To predict cycle time distributions of integrated processing workstations, detailed simulation models are almost exclusively used; these models require considerable development and maintenance effort. As an alternative, we propose an aggregate model that is a lumped-parameter representation of the

  9. TUZ, Resonance Integrals in Unresolved Region, Various Temperature, From Porter-Thomas Distribution

    International Nuclear Information System (INIS)

    Kuncir, G.F.

    1969-01-01

    1 - Nature of physical problem solved: TUZ computes resonance integrals for a wide variety of temperatures, compositions, and geometries for the unresolved resonances. 2 - Method of solution: The resonances are considered to be defined by an average over the Porter-Thomas distribution of neutron widths
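The averaging described above uses the Porter-Thomas distribution, i.e. a chi-square distribution with one degree of freedom for the reduced neutron widths. A minimal numerical sketch of such an average (Monte Carlo rather than the quadrature actually used by TUZ; the function names are illustrative):

```python
import numpy as np

def porter_thomas_average(f, n=200_000, seed=0):
    """Average f(x) over the Porter-Thomas distribution: x is chi-square
    with one degree of freedom, normalized here to unit mean, so x is the
    neutron width divided by its average."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n) ** 2   # square of a standard normal => chi-square(1)
    return float(f(x).mean())
```

For instance, averaging a saturating width factor x/(x+1) over the distribution gives less than the value at the mean (0.5), reflecting the width fluctuations that resonance-integral codes must account for.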

  10. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogenous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real life example; the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource constraint project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...
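A centralized, single-process sketch of the breakout idea underlying DisBO, applied to graph coloring rather than project scheduling for brevity (all names and parameters are illustrative): violated constraints gain weight whenever local search reaches a quasi-local minimum, which reshapes the cost surface and lets the search escape:

```python
import random

def breakout_coloring(edges, n, colors=3, steps=500, seed=1):
    """Breakout local search: minimize the weighted count of violated
    constraints (monochromatic edges), raising weights at local minima."""
    rng = random.Random(seed)
    assign = [rng.randrange(colors) for _ in range(n)]
    weight = {e: 1 for e in edges}

    def cost(a):
        return sum(weight[e] for e in edges if a[e[0]] == a[e[1]])

    for _ in range(steps):
        base = cost(assign)
        if base == 0:
            return assign                      # all constraints satisfied
        best = None                            # best single-variable change
        for v in range(n):
            old = assign[v]
            for c in range(colors):
                if c == old:
                    continue
                assign[v] = c
                delta = cost(assign) - base
                if best is None or delta < best[0]:
                    best = (delta, v, c)
            assign[v] = old
        if best[0] < 0:
            _, v, c = best
            assign[v] = c                      # improving move exists
        else:                                  # quasi-local minimum: breakout
            for u, w in edges:
                if assign[u] == assign[w]:
                    weight[(u, w)] += 1
    return assign
```

In DisBO the same weight-update rule is executed by distributed agents that each own a subset of the variables and exchange only local information.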

  11. Distribution transformer lifetime analysis in the presence of demand response and rooftop PV integration

    Directory of Open Access Journals (Sweden)

    Behi Behnaz

    2017-01-01

    Many distribution transformers have already exceeded half of their expected service life of 35 years in the infrastructure of Western Power, the electric distribution company supplying the southwest of Western Australia. Therefore, a high investment in transformer replacement is anticipated in the near future. However, high renewable integration and demand response (DR) are promising resources to defer investment in infrastructure upgrades and extend the lifetime of transformers. This paper investigates the impact of rooftop photovoltaic (PV) integration and customer engagement through DR on the lifetime of transformers in electric distribution networks. To this aim, first, time series models of load, DR and PV are built for each year over a planning period. This load model is applied to a typical distribution transformer, for which the hot-spot temperature rise is modelled based on the relevant standard. Using this calculation platform, the loss of life and the actual age of the distribution transformer are obtained. Then, various scenarios including different levels of PV penetration and DR contribution are examined, and their impacts on the age of the transformer are reported. Finally, the equivalent loss of net present value of the distribution transformer is formulated and discussed. This formulation gives distribution network planners a means of analysing the contribution of PV and DR to lifetime extension of the distribution transformer. In addition, the provided model can be utilised in optimal investment analysis to find the best time for transformer replacement and the associated cost considering PV penetration and DR. The simulation results show that integration of PV and DR within a feeder can significantly extend the lifetime of transformers.
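The loss-of-life calculation referred to above is typically an Arrhenius-type aging model driven by the winding hot-spot temperature. A minimal sketch, assuming IEEE C57.91-style constants (B = 15000 K, 110 °C rated hot-spot) rather than the exact standard used by the authors:

```python
import math

def aging_acceleration_factor(hot_spot_c, ref_c=110.0, b=15000.0):
    """Arrhenius-type aging acceleration relative to the rated hot-spot
    temperature; equals 1 at ref_c, >1 above it, <1 below it."""
    return math.exp(b / (ref_c + 273.0) - b / (hot_spot_c + 273.0))

def loss_of_life_hours(hot_spot_series_c, dt_hours=1.0):
    """Equivalent insulation aging (in hours at rated temperature)
    accumulated over a hot-spot temperature time series."""
    return sum(aging_acceleration_factor(t) * dt_hours for t in hot_spot_series_c)
```

Feeding this function hourly hot-spot profiles for different PV/DR scenarios is one way to compare their effect on transformer age, as the paper does with its own standard-based model.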

  12. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation ampersand control are evident from variations of design features

  13. Effects of active conductance distribution over dendrites on the synaptic integration in an identified nonspiking interneuron.

    Directory of Open Access Journals (Sweden)

    Akira Takashima

    The synaptic integration in an individual central neuron is critically affected by how active conductances are distributed over its dendrites. It is well known that the dendrites of central neurons are richly endowed with voltage- and ligand-regulated ion conductances. Nonspiking interneurons (NSIs), almost exclusively characteristic of arthropod central nervous systems, do not generate action potentials and hence lack voltage-regulated sodium channels, yet they have a variety of voltage-regulated potassium conductances on their dendritic membrane, including one similar to the delayed-rectifier type potassium conductance. It remains unknown, however, how the active conductances are distributed over dendrites and how synaptic integration is affected by those conductances in NSIs and other invertebrate neurons in which the cell body is not included in the signal pathway from input synapses to output sites. In the present study, we quantitatively investigated, by computer simulation, the functional significance of the active conductance distribution pattern in the spatio-temporal spread of synaptic potentials over the dendrites of an identified NSI in the crayfish central nervous system. We systematically changed the distribution pattern of active conductances in the neuron's multicompartment model and examined how the synaptic potential waveform was affected by each distribution pattern. It was revealed that specific patterns of nonuniform distribution of potassium conductances were consistent with the waveform of compound synaptic potentials recorded physiologically in the major input-output pathway of the cell, while other patterns were not, suggesting that the possibility of nonuniform distribution of potassium conductances over the dendrite cannot be excluded, nor can the possibility of uniform distribution. Local synaptic circuits involving input and output synapses on the same branch or on the same side were found to be potentially affected under

  14. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    Science.gov (United States)

    2017-12-01

    Thesis title: "Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment." Author: Justin K. Davis, Lieutenant.

  15. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

    The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Given its closeness to actual practice, the triangular fuzzy number (TFN) is adopted in DIPPS to represent machine processing and transportation times. In order to solve this problem and obtain an optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm. Experiments show that EGA achieves satisfactory results in a very short time and demonstrates powerful performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).
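A minimal sketch of the triangular-fuzzy-number arithmetic implied here. The class and the centroid defuzzification rule are illustrative assumptions; the paper may combine and rank TFNs differently:

```python
class TFN:
    """Triangular fuzzy number (a, b, c): minimum, most likely, maximum."""

    def __init__(self, a, b, c):
        assert a <= b <= c
        self.a, self.b, self.c = a, b, c

    def __add__(self, other):
        # Fuzzy sum, e.g. chaining processing and transportation times.
        return TFN(self.a + other.a, self.b + other.b, self.c + other.c)

    def defuzzify(self):
        # Centroid of the triangle, used to compare candidate schedules.
        return (self.a + self.b + self.c) / 3.0
```

A genetic algorithm over such times can evaluate each chromosome by summing TFNs along the critical path and ranking makespans by their defuzzified values.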

  16. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  17. Gauss-Kronrod-Trapezoidal Integration Scheme for Modeling Biological Tissues with Continuous Fiber Distributions

    Science.gov (United States)

    Hou, Chieh; Ateshian, Gerard A.

    2015-01-01

    Fibrous biological tissues may be modeled using a continuous fiber distribution (CFD) to capture tension-compression nonlinearity, anisotropic fiber distributions, and load-induced anisotropy. The CFD framework requires spherical integration of weighted individual fiber responses, with fibers contributing to the stress response only when they are in tension. The common method for performing this integration employs the discretization of the unit sphere into a polyhedron with nearly uniform triangular faces (finite element integration or FEI scheme). Although FEI has proven to be more accurate and efficient than integration using spherical coordinates, it presents three major drawbacks: First, the number of elements on the unit sphere needed to achieve satisfactory accuracy becomes a significant computational cost in a finite element analysis. Second, fibers may not be in tension in some regions on the unit sphere, where the integration becomes a waste. Third, if tensed fiber bundles span a small region compared to the area of the elements on the sphere, a significant discretization error arises. This study presents an integration scheme specialized to the CFD framework, which significantly mitigates the first drawback of the FEI scheme, while eliminating the second and third completely. Here, integration is performed only over the regions of the unit sphere where fibers are in tension. Gauss-Kronrod quadrature is used across latitudes and the trapezoidal scheme across longitudes. Over a wide range of strain states, fiber material properties, and fiber angular distributions, results demonstrate that this new scheme always outperforms FEI, sometimes by orders of magnitude in the number of computational steps and relative accuracy of the stress calculation. PMID:26291492
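The latitude/longitude product rule can be sketched as follows, using Gauss-Legendre nodes across latitudes as a stand-in for the paper's Gauss-Kronrod rule (NumPy ships Gauss-Legendre but not Gauss-Kronrod) and a periodic trapezoidal rule across longitudes; restricting the integrand to tensed fibers is then just a `maximum(..., 0)` inside `f`:

```python
import numpy as np

def sphere_integrate(f, n_lat=32, n_lon=64):
    """Integrate f over the unit sphere: Gauss-Legendre quadrature in
    x = cos(theta) across latitudes, trapezoidal rule across longitudes.
    f receives an array of unit direction vectors of shape (3, n_lon)."""
    x, w = np.polynomial.legendre.leggauss(n_lat)            # nodes in [-1, 1]
    phi = np.linspace(0.0, 2 * np.pi, n_lon, endpoint=False)  # periodic grid
    total = 0.0
    for xi, wi in zip(x, w):
        s = np.sqrt(1.0 - xi ** 2)                            # sin(theta)
        n = np.stack([s * np.cos(phi), s * np.sin(phi), np.full_like(phi, xi)])
        total += wi * f(n).sum() * (2 * np.pi / n_lon)
    return total
```

For a tension-only fiber response one would pass, e.g., `lambda n: np.maximum(strain_along(n), 0.0)`, where `strain_along` is whatever fiber strain measure the constitutive model defines (hypothetical name).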

  18. A Gauss-Kronrod-Trapezoidal integration scheme for modeling biological tissues with continuous fiber distributions.

    Science.gov (United States)

    Hou, Chieh; Ateshian, Gerard A

    2016-01-01

    Fibrous biological tissues may be modeled using a continuous fiber distribution (CFD) to capture tension-compression nonlinearity, anisotropic fiber distributions, and load-induced anisotropy. The CFD framework requires spherical integration of weighted individual fiber responses, with fibers contributing to the stress response only when they are in tension. The common method for performing this integration employs the discretization of the unit sphere into a polyhedron with nearly uniform triangular faces (finite element integration or FEI scheme). Although FEI has proven to be more accurate and efficient than integration using spherical coordinates, it presents three major drawbacks: First, the number of elements on the unit sphere needed to achieve satisfactory accuracy becomes a significant computational cost in a finite element (FE) analysis. Second, fibers may not be in tension in some regions on the unit sphere, where the integration becomes a waste. Third, if tensed fiber bundles span a small region compared to the area of the elements on the sphere, a significant discretization error arises. This study presents an integration scheme specialized to the CFD framework, which significantly mitigates the first drawback of the FEI scheme, while eliminating the second and third completely. Here, integration is performed only over the regions of the unit sphere where fibers are in tension. Gauss-Kronrod quadrature is used across latitudes and the trapezoidal scheme across longitudes. Over a wide range of strain states, fiber material properties, and fiber angular distributions, results demonstrate that this new scheme always outperforms FEI, sometimes by orders of magnitude in the number of computational steps and relative accuracy of the stress calculation.

  19. Organizational and methodical approaches to disclosure of income distribution in integrated reporting

    Directory of Open Access Journals (Sweden)

    Legenchyk S.F.

    2017-08-01

    The essence of integrated reporting is described. Preconditions for and problems with introducing integrated reporting are presented, along with the information requests of integrated reporting users. The necessity of reflecting all types of capital, namely natural, social, human and intellectual, in integrated reporting is revealed. The main advantages of compiling integrated accounts for the enterprise are established: a broader perspective on the activity; improvement of accounting policy as a result of integrating the principles of sustainable development into the activity; and increased trust of workers and consumers in the safety of technological processes and products for the environment. For wider introduction of integrated reporting, it is necessary to develop methodological provisions for accounting in accordance with the principles of sustainable development, to ensure the reliability of the indices obtained, and to select an optimal list of indices that can meet the information needs of all users, in particular investors, the state, auditors, society, creditors, consumers, employees, management personnel, academics, and the media. The main tasks and principles of compiling integrated reporting are presented. Orientation to the future, materiality, demand, integrity, reliability, completeness, periodicity, consistency, timeliness, interpretability, and comparability are suggested as the principles of integrated reporting. The necessity of an external audit to verify integrated reporting data is highlighted. An algorithm for profit distribution depending on the chosen enterprise development strategy is proposed. When directing net profits to the development of production, the authors identify three strategies: insufficient upgrades, upgrades at the level of wear, and advanced renewal of non-current assets.

  20. Distributed XQuery-Based Integration and Visualization of Multimodality Brain Mapping Data.

    Science.gov (United States)

    Detwiler, Landon T; Suciu, Dan; Franklin, Joshua D; Moore, Eider B; Poliakov, Andrew V; Lee, Eunjung S; Corina, David P; Ojemann, George A; Brinkley, James F

    2009-01-01

    This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too "heavyweight" for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a "lightweight" distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP) accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts.

  1. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
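A toy sketch of the microcontroller's temperature regulation loop described above. The bang-bang dead band, the 65 °C LAMP target, and the first-order thermal constants are all illustrative assumptions, not values from the paper:

```python
def control_step(temp_c, target_c=65.0, hysteresis_c=0.5):
    """Bang-bang decision for the resistor heaters from an NTC reading:
    True = heaters on, False = off, None = inside the dead band."""
    if temp_c < target_c - hysteresis_c:
        return True
    if temp_c > target_c + hysteresis_c:
        return False
    return None  # keep previous heater state

def simulate(steps=200, ambient=22.0, heat=3.0, cool=0.05):
    """Toy first-order thermal model of the reaction chamber: constant
    heater power input, Newtonian cooling toward ambient."""
    t, heater_on = ambient, True
    for _ in range(steps):
        decision = control_step(t)
        if decision is not None:
            heater_on = decision
        t += (heat if heater_on else 0.0) - cool * (t - ambient)
    return t
```

With these made-up constants the chamber settles into a narrow oscillation around the target, which is the behavior an isothermal LAMP reaction requires.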

  2. Impacts of optimal energy storage deployment and network reconfiguration on renewable integration level in distribution systems

    International Nuclear Information System (INIS)

    Santos, Sérgio F.; Fitiwi, Desta Z.; Cruz, Marco R.M.; Cabrita, Carlos M.P.; Catalão, João P.S.

    2017-01-01

    Highlights: • A dynamic and multi-objective stochastic mixed integer linear programming model is developed. • A new mechanism to quantify the impacts of network flexibility and ESS deployments on RES integration is presented. • Optimal integration of ESSs dramatically increases the level and the optimal exploitation of renewable DGs. • As high as 90% of RES integration level may be possible in distribution network systems. • Joint DG and ESS installations along with optimal network reconfiguration greatly contribute to voltage stability. - Abstract: Nowadays, there is a wide consensus about integrating more renewable energy sources (RESs) to address a multitude of global concerns, such as meeting an increasing demand for electricity, improving energy security, reducing heavy dependence on fossil fuels for energy production, and reducing the overall carbon footprint of power production. Framed in this context, the coordination of RES integration with energy storage systems (ESSs), along with the network's switching capability and/or reinforcement, is expected to significantly improve system flexibility, thereby increasing the capability of the system to accommodate large-scale RES power. Hence, this paper presents a novel mechanism to quantify the impacts of network switching and/or reinforcement as well as deployment of ESSs on the level of renewable power integrated in the system. To carry out this analysis, a dynamic and multi-objective stochastic mixed integer linear programming (S-MILP) model is developed, which jointly takes the optimal deployment of RES-based DGs and ESSs into account in coordination with distribution network reinforcement and/or reconfiguration. The IEEE 119-bus test system is used as a case study. Numerical results clearly show the capability of ESS deployment to dramatically increase the level of renewable DGs integrated in the system. Although case-dependent, the impact of network reconfiguration on RES power integration is not

  3. Some classes of multivariate infinitely divisible distributions admitting stochastic integral representations

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Maejima, M.; Sato, K.

    2006-01-01

    The class of distributions on R generated by convolutions of Γ-distributions and the class generated by convolutions of mixtures of exponential distributions are generalized to higher dimensions and denoted by T(Rd) and B(Rd). From the Lévy process {Xt(μ)} on Rd with distribution μ at t=1, Υ(μ) is defined as the distribution of the stochastic integral ∫01log(1/t)dXt(μ). This mapping is a generalization of the mapping Υ introduced by Barndorff-Nielsen and Thorbjørnsen in one dimension. It is proved that Υ(ID(Rd))=B(Rd) and Υ(L(Rd))=T(Rd), where ID(Rd) and L(Rd) are the classes of infinitely divisible distributions and of self-decomposable distributions on Rd, respectively. The relations with the mapping Φ from μ to the distribution at each time of the stationary process of Ornstein-Uhlenbeck type with background driving Lévy process {Xt(μ)} are studied. Developments of these results...

  4. Integrated production-distribution planning optimization models: A review in collaborative networks context

    Directory of Open Access Journals (Sweden)

    Beatriz Andres

    2017-01-01

    Full Text Available Researchers in the area of collaborative networks are increasingly aware of the value of collaborative approaches to planning processes, due to the advantages enterprises obtain when they perform integrated planning. Collaborative production-distribution planning among the supply network actors is considered a proper mechanism to support enterprises in dealing with the uncertainties and dynamicity of current markets. Enterprises, and especially SMEs, should be able to cope with continuous market changes by increasing their agility, and carrying out collaborative planning allows them to enhance their readiness and agility for facing market turbulences. However, SMEs have limited access to optimization tools for collaborative planning, reducing their ability to respond to the competition. The problem to solve is to provide SMEs with affordable solutions to support collaborative planning. In this regard, new optimisation algorithms are required in order to improve the collaboration among the supply network partners. As part of the H2020 Cloud Collaborative Manufacturing Networks (C2NET) research project, this paper presents a study on integrated production and distribution plans. The main objective of the research is to identify gaps in current optimization models proposed to address integrated planning, taking into account the requirements and needs of the industry. Thus, the needs of the companies belonging to the industrial pilots defined in the C2NET project are identified, analysing how these needs are covered by the optimization models proposed in the literature to deal with integrated production-distribution planning.

  5. The complete two-loop integrated jet thrust distribution in soft-collinear effective theory

    International Nuclear Information System (INIS)

    Manteuffel, Andreas von; Schabinger, Robert M.; Zhu, Hua Xing

    2014-01-01

    In this work, we complete the calculation of the soft part of the two-loop integrated jet thrust distribution in e+e− annihilation. This jet mass observable is based on the thrust cone jet algorithm, which involves a veto scale for out-of-jet radiation. The previously uncomputed part of our result depends in a complicated way on the jet cone size, r, and at intermediate stages of the calculation we actually encounter a new class of multiple polylogarithms. We employ an extension of the coproduct calculus to systematically exploit functional relations and represent our results concisely. In contrast to the individual contributions, the sum of all global terms can be expressed in terms of classical polylogarithms. Our explicit two-loop calculation enables us to clarify the small r picture discussed in earlier work. In particular, we show that the resummation of the logarithms of r that appear in the previously uncomputed part of the two-loop integrated jet thrust distribution is inextricably linked to the resummation of the non-global logarithms. Furthermore, we find that the logarithms of r which cannot be absorbed into the non-global logarithms in the way advocated in earlier work have coefficients fixed by the two-loop cusp anomalous dimension. We also show that in many cases one can straightforwardly predict potentially large logarithmic contributions to the integrated jet thrust distribution at L loops by making use of analogous contributions to the simpler integrated hemisphere soft function.

  6. On the relevance of efficient, integrated computer and network monitoring in HEP distributed online environment

    CERN Document Server

    Carvalho, D F; Delgado, V; Albert, J N; Bellas, N; Javello, J; Miere, Y; Ruffinoni, D; Smith, G

    1996-01-01

    Large Scientific Equipments are controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer ...

  7. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    Science.gov (United States)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large Scientific Equipments are controlled by Computer Systems whose complexity is growing driven, on the one hand by the volume and variety of the information, its distributed nature, the sophistication of its treatment and, on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems or Distributed Computer Control Systems (DCCS) for those systems dealing more with real time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer System.

  8. Delineating Hydrofacies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Song, Xuehang [Florida State Univ., Tallahassee, FL (United States); Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Xingyuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ye, Ming [Florida State Univ., Tallahassee, FL (United States); Dai, Zhenxue [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hammond, Glenn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This study develops a new framework of facies-based data assimilation for characterizing the spatial distribution of hydrofacies and estimating their associated hydraulic properties. This framework couples ensemble data assimilation with a transition probability-based geostatistical model via a parameterization based on a level set function. The nature of ensemble data assimilation makes the framework efficient and flexible to integrate with various types of observation data. The transition probability-based geostatistical model keeps the updated hydrofacies distributions under geological constraints. The framework is illustrated with a two-dimensional synthetic study that estimates the hydrofacies spatial distribution and the permeability of each hydrofacies from transient head data. Our results show that the proposed framework can characterize the hydrofacies distribution and associated permeability with adequate accuracy even with limited direct measurements of hydrofacies. Our study provides a promising starting point for hydrofacies delineation in complex real problems.
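
    The ensemble-update idea at the core of such frameworks can be shown in miniature. The following is a generic scalar ensemble Kalman analysis step with perturbed observations; it is a simplification that omits the paper's level-set parameterization and transition-probability constraints, and all names and numbers are illustrative.

```python
import random

def enkf_update(ensemble, obs, obs_err_sd, rng=random):
    """One ensemble Kalman analysis step for a scalar state observed directly
    (observation operator H = 1), using perturbed observations."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # forecast variance
    gain = var / (var + obs_err_sd ** 2)                    # Kalman gain
    return [x + gain * (obs + rng.gauss(0.0, obs_err_sd) - x) for x in ensemble]

random.seed(7)
prior = [random.gauss(0.0, 2.0) for _ in range(500)]  # forecast ensemble
posterior = enkf_update(prior, obs=3.0, obs_err_sd=1.0)

prior_mean = sum(prior) / len(prior)
post_mean = sum(posterior) / len(posterior)
```

Because the forecast spread (sd 2.0) exceeds the observation error (sd 1.0), the analysis ensemble is pulled most of the way toward the observation at 3.0 while its spread shrinks.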

  9. Modernizing Distribution System Restoration to Achieve Grid Resiliency Against Extreme Weather Events: An Integrated Solution

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Chen; Wang, Jianhui; Ton, Dan

    2017-07-07

    Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As the distribution grids still remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how the grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve the goal. The advantages of the solution include improving situational awareness of the system damage status and facilitating survivability for customers. The paper provides a comprehensive review of how the existing methodologies in the literature could be leveraged to achieve the key advantages. The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews

  10. Guidelines for Implementing Advanced Distribution Management Systems-Requirements for DMS Integration with DERMS and Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Chen [Argonne National Lab. (ANL), Argonne, IL (United States); Lu, Xiaonan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-08-01

    This guideline focuses on the integration of DMS with DERMS and microgrids connected to the distribution grid by defining generic and fundamental design and implementation principles and strategies. It starts by addressing the current status, objectives, and core functionalities of each system, and then discusses the new challenges and the common principles of DMS design and implementation for integration with DERMS and microgrids to realize enhanced grid operation reliability and quality power delivery to consumers while also achieving the maximum energy economics from the DER and microgrid connections.

  11. Multi-agents Based Modelling for Distribution Network Operation with Electric Vehicle Integration

    DEFF Research Database (Denmark)

    Hu, Junjie; Morais, Hugo; Zong, Yi

    2014-01-01

    Electric vehicles (EVs) can become an integral part of a smart grid because, instead of just consuming power, they are capable of providing valuable services to power systems. To integrate EVs smoothly into the power systems, a multi-agent system (MAS) with a hierarchical organization structure...... and its role is to manage the distribution network safely by avoiding grid congestions and using congestion prices to coordinate the energy schedules of VPPs. VPP agents belong to the middle level and their roles are to manage the charge periods of the EVs. EV agents sit in the bottom level...

  12. Determining integral density distribution in the Mach reflection of shock waves

    Science.gov (United States)

    Shevchenko, A. M.; Golubev, M. P.; Pavlov, A. A.; Pavlov, Al. A.; Khotyanovsky, D. V.; Shmakov, A. S.

    2017-05-01

    We present a method for and results of determination of the field of integral density in the structure of flow corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of flow was performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density on the path of probing radiation in one direction of 3D flow transillumination in the region of Mach interaction of shock waves were obtained for the first time.

  13. On the Efficiency of Connection Charges---Part II: Integration of Distributed Energy Resources

    OpenAIRE

    Munoz-Alvarez, Daniel; Garcia-Franco, Juan F.; Tong, Lang

    2017-01-01

    This two-part paper addresses the design of retail electricity tariffs for distribution systems with distributed energy resources (DERs). Part I presents a framework to optimize an ex-ante two-part tariff for a regulated monopolistic retailer who faces stochastic wholesale prices on the one hand and stochastic demand on the other. In Part II, the integration of DERs is addressed by analyzing their endogenous effect on the optimal two-part tariff and the induced welfare gains. Two DER integrat...

  14. Impact and Cost Evaluation of Electric Vehicle Integration on Medium Voltage Distribution Networks

    DEFF Research Database (Denmark)

    Wu, Qiuwei; Cheng, Lin; Pineau, Ulysse

    2013-01-01

    This paper presents the analysis of the impact of electric vehicle (EV) integration on medium voltage (MV) distribution networks and the cost evaluation of replacing the overloaded grid components. A number of EV charging scenarios have been studied. A 10 kV grid from the Bornholm Island...... in the city area has been used to carry out case studies. The case study results show that the secondary transformers are the bottleneck of the MV distribution networks and the increase of EV penetration leads to the overloading of secondary transformers. The cost of the transformer replacement has been...

  15. Efficient Integration of Old and New Research Tools for Automating the Identification and Analysis of Seismic Reference Events

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Robert; Rivers, Wilmer

    2005-01-25

    Any single computer program for seismic data analysis will not have all the capabilities needed to study reference events, since these detailed studies will be highly specialized. It may be necessary to develop and test new algorithms, and then these special codes must be integrated with existing software to use their conventional data-processing routines. We have investigated two means of establishing communications between the legacy and new codes: CORBA and XML/SOAP Web services. We have investigated making new Java code communicate with a legacy C-language program, geotool, running under Linux. Both methods were successful, but both were difficult to implement. C programs on UNIX/Linux are poorly supported for Web services, compared with the Java and .NET languages and platforms. Easier-to-use middleware will be required for scientists to construct distributed applications as easily as stand-alone ones. Considerable difficulty was encountered in modifying geotool, and this problem shows the need to use component-based user interfaces instead of large C-language codes, where changes to one part of the program may introduce side effects into other parts. We have nevertheless made bug fixes and enhancements to that legacy program, but it remains difficult to expand it through communications with external software.

  16. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost effective generation of reliable kinetic models useful for bioconversion process......-erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 μL scale. The derived kinetic parameters were then verified in a second round of experiments where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original......

  17. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  18. A methodological approach to automated and integrated manufacturing systems development

    Directory of Open Access Journals (Sweden)

    Marco Antonio Busetti de Paula

    2008-01-01

    Full Text Available This paper presents a design methodology applied to automated and integrated manufacturing systems. The methodology consists of a cyclic development in three stages (modeling, synthesis and implementation), repeated until the application required by the real system is accomplished, resulting in the design of the automated and integrated system. This form of development allows a continuous revision of the results obtained at each stage. To test and validate the methodology, an example is presented of the re-design of a manufacturing system prototype prompted by the need to introduce a new product.

  19. Use of spatially distributed time-integrated sediment sampling networks and distributed fine sediment modelling to inform catchment management.

    Science.gov (United States)

    Perks, M T; Warburton, J; Bracken, L J; Reaney, S M; Emery, S B; Hirst, S

    2017-11-01

    Under the EU Water Framework Directive, suspended sediment is omitted from environmental quality standards and compliance targets. This omission is partly explained by difficulties in assessing the complex dose-response of ecological communities. But equally, it is hindered by a lack of spatially distributed estimates of suspended sediment variability across catchments. In this paper, we demonstrate the inability of traditional, discrete sampling campaigns for assessing exposure to fine sediment. Sampling frequencies based on Environmental Quality Standard protocols, whilst reflecting typical manual sampling constraints, are unable to determine the magnitude of sediment exposure with an acceptable level of precision. Deviations from actual concentrations range between -35 and +20% based on the interquartile range of simulations. As an alternative, we assess the value of low-cost, suspended sediment sampling networks for quantifying suspended sediment transfer (SST). In this study of the 362 km² upland Esk catchment we observe that spatial patterns of sediment flux are consistent over the two year monitoring period across a network of 17 monitoring sites. This enables the key contributing sub-catchments of Butter Beck (SST: 1141 t km⁻² yr⁻¹) and Glaisdale Beck (SST: 841 t km⁻² yr⁻¹) to be identified. The time-integrated samplers offer a feasible alternative to traditional infrequent and discrete sampling approaches for assessing spatio-temporal changes in contamination. In conjunction with a spatially distributed diffuse pollution model (SCIMAP), time-integrated sediment sampling is an effective means of identifying critical sediment source areas in the catchment, which can better inform sediment management strategies for pollution prevention and control. Copyright © 2017 Elsevier Ltd. All rights reserved.
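
    The precision loss from infrequent grab sampling can be reproduced with a toy simulation; every number below is invented for illustration and is not the paper's data. A synthetic storm-driven concentration record is sampled at roughly monthly intervals, and the resulting estimates deviate strongly from the truth depending on how sampling falls relative to storm pulses.

```python
import math
import random

random.seed(42)

# Synthetic hourly suspended-sediment concentrations over one year:
# low baseflow punctuated by short, exponentially decaying storm pulses.
hours = 24 * 365
conc = []
for t in range(hours):
    phase = t % 720                               # a storm every ~30 days
    storm = 80.0 * math.exp(-phase / 12.0) if phase < 72 else 0.0
    conc.append(5.0 + storm + random.uniform(-1.0, 1.0))

true_mean = sum(conc) / hours

def grab_sample_mean(series, every_hours, offset=0):
    """Mean concentration seen by a fixed-interval grab-sampling scheme."""
    picked = series[offset::every_hours]
    return sum(picked) / len(picked)

# Roughly monthly sampling, tried at ten different start offsets:
monthly = [grab_sample_mean(conc, 730, off) for off in range(0, 730, 73)]
deviations = [100.0 * (m - true_mean) / true_mean for m in monthly]
```

Depending on the offset, the sparse scheme either repeatedly catches storm peaks (large positive bias) or misses them entirely (persistent negative bias), mirroring the precision problem the paper describes.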

  20. Research on data integration of Anshan 10 kV power distribution network system

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, T.; Dang, N.; Liu, Y. [North China Electric Power Univ., Beijing (China); Wang, Z. [Anshan Electric Co. Ltd., Anshan City (China)

    2009-03-11

    Variations in data formats among electric power supervisory control systems are preventing the development of an integrated power distribution database. The use of standard XML protocols makes web-based services platform- and language-independent, as they are based on open standards that allow software to communicate through a standardized XML messaging system. This article discussed the use of web-based services as a safe interface for accessing distribution system databases. Web services technology was used to pass through the firewall of individual data servers while avoiding direct contact with the actual database and ensuring the security of the data server. A virtual machine technology was then applied in a testing environment to demonstrate the method. The test results confirmed that the method can be used to integrate data from various power systems. 8 refs., 5 figs.
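
    The core idea, wrapping rows from heterogeneous control-system databases in one standard XML message so that consumers never touch the underlying database, can be sketched as follows; the element and field names are illustrative assumptions, not the Anshan system's schema.

```python
import xml.etree.ElementTree as ET

def to_xml_message(source, records):
    """Wrap rows from one control system's database in a common XML envelope,
    so that consumers parse one schema regardless of the source's internal format."""
    root = ET.Element("DistributionData", attrib={"source": source})
    for rec in records:
        node = ET.SubElement(root, "Feeder", attrib={"id": rec["id"]})
        ET.SubElement(node, "VoltageKV").text = str(rec["kv"])
        ET.SubElement(node, "LoadMW").text = str(rec["load"])
    return ET.tostring(root, encoding="unicode")

# A SCADA system's internal rows mapped onto the common message:
scada_rows = [{"id": "F-101", "kv": 10, "load": 3.2}]
xml_msg = to_xml_message("anshan-scada", scada_rows)

# A consumer parses the envelope without knowing the source database layout.
parsed = ET.fromstring(xml_msg)
```

A second system with a different internal layout would supply its own adapter producing the same envelope, which is what makes the exchange platform- and language-independent.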

  1. Development of an Integrated Approach to Routine Automation of Neutron Activation Analysis. Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2018-04-01

    Neutron activation analysis (NAA) is a powerful technique for determining bulk composition of major and trace elements. Automation may contribute significantly to keep NAA competitive for end-users. It provides opportunities for a larger analytical capacity and a shorter overall turnaround time if large series of samples have to be analysed. This publication documents and disseminates the expertise generated on automation in NAA during a coordinated research project (CRP). The CRP participants presented different cost-effective designs of sample changers for gamma-ray spectrometry as well as irradiation devices, and were able to construct and successfully test these systems. They also implemented, expanded and improved quality control and quality assurance as cross-cutting topical area of their automated NAA procedures. The publication serves as a reference of interest to NAA practitioners, experts, and research reactor personnel, but also to various stakeholders and users interested in basic research and/or services provided by NAA. The individual country reports are available on the CD-ROM attached to this publication.

  2. Integrated North Sea grids: The costs, the benefits and their distribution between countries

    International Nuclear Information System (INIS)

    Konstantelos, Ioannis; Pudjianto, Danny; Strbac, Goran; De Decker, Jan; Joseph, Pieter; Flament, Aurore; Kreutzkamp, Paul; Genoese, Fabio; Rehfeldt, Leif; Wallasch, Anna-Kathrin; Gerdes, Gerhard; Jafar, Muhammad; Yang, Yongtao; Tidemand, Nicolaj; Jansen, Jaap; Nieuwenhout, Frans; Welle, Adriaan van der; Veum, Karina

    2017-01-01

    A large number of offshore wind farms and interconnectors are expected to be constructed in the North Sea region over the coming decades, creating substantial opportunities for the deployment of integrated network solutions. Creating interconnected offshore grids that combine cross-border links and connections of offshore plants to shore offers multiple economic and environmental advantages for Europe's energy system. However, despite evidence that integrated solutions can be more beneficial than traditional radial connection practices, no such projects have been deployed yet. In this paper we quantify costs and benefits of integrated projects and investigate to which extent the cost-benefit sharing mechanism between participating countries can impede or encourage the development of integrated projects. Three concrete interconnection case studies in the North Sea area are analysed in detail using a national-level power system model. Model outputs are used to compute the net benefit of all involved stakeholders under different allocation schemes. Given the asymmetric distribution of costs and benefits, we recommend to consistently apply the Positive Net Benefit Differential mechanism as a starting point for negotiations on the financial closure of investments in integrated offshore infrastructure. - Highlights: • Three North Sea offshore grid case studies are analysed. • They are shown to have substantial net benefit over non-integrated network designs. • Asymmetric net benefit sharing between countries is shown to be a barrier. • Positive Net Benefit Differential method alleviates asymmetric benefits.

  3. Secure, Autonomous, Intelligent Controller for Integrating Distributed Emergency Response Satellite Operations

    Science.gov (United States)

    Ivancic, William D.; Paulsen, Phillip E.; Miller, Eric M.; Sage, Steen P.

    2013-01-01

    This report describes a Secure, Autonomous, and Intelligent Controller for Integrating Distributed Emergency Response Satellite Operations. It includes a description of current improvements to existing Virtual Mission Operations Center technology being used by the US Department of Defense and originally developed under NASA funding. The report also highlights a technology demonstration performed in partnership with the United States Geological Survey for Earth Resources Observation and Science, using DigitalGlobe® satellites to obtain space-based sensor data.

  4. Integration Platform As Central Service Of Data Replication In Distributed Medical System

    Directory of Open Access Journals (Sweden)

    Wiesław Wajs

    2007-01-01

    Full Text Available The paper presents the application of the Java Integration Platform (JIP) to data replication in the distributed medical system. After an introductory part on the medical system's architecture, the focus shifts to a comparison of different approaches that exist with regard to transferring data between the system's components. A description is given of the historical data processing and of the whole area of the JIP application to the medical system.

  5. Two alternate proofs of Wang's lune formula for sparse distributed memory and an integral approximation

    Science.gov (United States)

    Jaeckel, Louis A.

    1988-01-01

    In Kanerva's Sparse Distributed Memory, writing to and reading from the memory are done in relation to spheres in an n-dimensional binary vector space. Thus it is important to know how many points are in the intersection of two spheres in this space. Two proofs are given of Wang's formula for spheres of unequal radii, and an integral approximation for the intersection in this case.
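
    The count in question can be obtained by direct enumeration over binomial coefficients. The sketch below reads "sphere" as the Hamming ball of radius r (an assumption on our part) and is a brute-force count rather than Wang's closed-form lune formula itself; it is checked against exhaustive search in a small dimension.

```python
from math import comb
from itertools import product

def ball_intersection(n, d, r1, r2):
    """Number of points z in {0,1}^n with dist(z, x) <= r1 and dist(z, y) <= r2,
    where dist(x, y) = d.  Split the n coordinates into the d where x and y
    differ and the n-d where they agree: if z disagrees with x in a of the
    first group and b of the second, then dist(z, x) = a + b and
    dist(z, y) = (d - a) + b."""
    total = 0
    for a in range(d + 1):
        for b in range(n - d + 1):
            if a + b <= r1 and (d - a) + b <= r2:
                total += comb(d, a) * comb(n - d, b)
    return total

# Exhaustive check in a small space:
n, d, r1, r2 = 10, 4, 3, 5
x = (0,) * n
y = (1,) * d + (0,) * (n - d)
brute = sum(
    1
    for z in product((0, 1), repeat=n)
    if sum(zi != xi for zi, xi in zip(z, x)) <= r1
    and sum(zi != yi for zi, yi in zip(z, y)) <= r2
)
```

For n = 10, d = 4 the enumeration agrees with the 1024-point exhaustive check; a closed form such as Wang's collapses this double sum.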

  6. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years
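
    The look-by-look alerting step can be caricatured with a toy monitoring loop. The thresholds, counts, and alerting rule below are invented for illustration; the actual system applies a formal sequential algorithm with error-rate control, not this naive rate-ratio cutoff.

```python
def sequential_monitor(looks, rate_ratio_threshold=2.0, min_events=5):
    """Scan cumulative (exposed_events, exposed_persontime, comparator_events,
    comparator_persontime) tuples look by look; alert at the first look where
    the cumulative incidence-rate ratio exceeds the threshold with enough events."""
    for i, (e1, t1, e0, t0) in enumerate(looks, start=1):
        if e1 >= min_events and e0 > 0:
            rate_ratio = (e1 / t1) / (e0 / t0)
            if rate_ratio >= rate_ratio_threshold:
                return i, rate_ratio          # alert raised at look i
    return None, None                         # monitoring ends without an alert

# Cumulative counts over four quarterly looks (illustrative numbers only):
looks = [
    (1, 1000.0, 2, 4000.0),
    (3, 2100.0, 5, 8200.0),
    (8, 3300.0, 7, 12500.0),   # excess events accumulate in the exposed cohort
    (15, 4400.0, 9, 16800.0),
]
look, rr = sequential_monitor(looks)
```

In this toy sequence the minimum-event rule suppresses early unstable looks, and the alert fires at the third look; a production algorithm would additionally adjust for the repeated testing across looks.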

  7. A path integral methodology for obtaining thermodynamic properties of nonadiabatic systems using Gaussian mixture distributions

    Science.gov (United States)

    Raymond, Neil; Iouchtchenko, Dmitri; Roy, Pierre-Nicholas; Nooijen, Marcel

    2018-05-01

    We introduce a new path integral Monte Carlo method for investigating nonadiabatic systems in thermal equilibrium and demonstrate an approach to reducing stochastic error. We derive a general path integral expression for the partition function in a product basis of continuous nuclear and discrete electronic degrees of freedom without the use of any mapping schemes. We separate our Hamiltonian into a harmonic portion and a coupling portion; the partition function can then be calculated as the product of a Monte Carlo estimator (of the coupling contribution to the partition function) and a normalization factor (that is evaluated analytically). A Gaussian mixture model is used to evaluate the Monte Carlo estimator in a computationally efficient manner. Using two model systems, we demonstrate our approach to reduce the stochastic error associated with the Monte Carlo estimator. We show that the selection of the harmonic oscillators comprising the sampling distribution directly affects the efficiency of the method. Our results demonstrate that our path integral Monte Carlo method's deviation from exact Trotter calculations is dominated by the choice of the sampling distribution. By improving the sampling distribution, we can drastically reduce the stochastic error leading to lower computational cost.
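
    The paper's central point, that the choice of sampling distribution governs the stochastic error of a Monte Carlo estimator, can be illustrated with a one-dimensional importance-sampling toy (not the authors' nonadiabatic estimator): a Gaussian mixture proposal matched to a bimodal integrand yields identical importance weights and essentially zero variance, while a mismatched single-Gaussian proposal inflates the spread.

```python
import math
import random

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def target(x):
    """Bimodal, unnormalized integrand; its exact integral is 2*sqrt(2*pi)."""
    return math.exp(-0.5 * (x - 2.0) ** 2) + math.exp(-0.5 * (x + 2.0) ** 2)

def mixture_is(n, components, rng):
    """Importance-sampling estimate of the integral of target(x) using a Gaussian
    mixture proposal [(weight, mu, sd), ...]; returns (estimate, weight_std)."""
    vals = []
    weights = [c[0] for c in components]
    for _ in range(n):
        _, mu, sd = rng.choices(components, weights=weights)[0]
        x = rng.gauss(mu, sd)
        q = sum(w * gauss_pdf(x, m, s) for w, m, s in components)  # mixture density
        vals.append(target(x) / q)
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, math.sqrt(var)

rng = random.Random(3)
# Proposal proportional to the integrand: every importance weight is identical.
good, good_sd = mixture_is(2000, [(0.5, -2.0, 1.0), (0.5, 2.0, 1.0)], rng)
# Mismatched single-Gaussian proposal: same estimator, much larger weight spread.
bad, bad_sd = mixture_is(2000, [(1.0, 0.0, 2.0)], rng)
```

Here the matched mixture makes target(x)/q(x) constant, so the estimator's stochastic error collapses, which is the mechanism the paper exploits when tuning its Gaussian mixture model.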

  8. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    Science.gov (United States)

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

    The integration of heterogeneous electronic health record systems into an interoperable nationwide electronic health record system provides indisputable benefits in health care, such as superior health information quality, prevention of medical errors and cost savings. This paper proposes a semi-distributed system architecture for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches: the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system and the national public administration data communication network infrastructure, achieving EHR integration at an acceptable implementation cost.

  9. Policy and network regulation for the integration of distribution generation and renewables for electricity supply

    International Nuclear Information System (INIS)

    Ten Donkelaar, M.; Van Oostvoorn, F.

    2005-08-01

    This study analysed existing policy and regulation aimed at integrating an increased share of Distributed Generation (DG) into electricity supply systems in the European Union. It illustrates the state of the art and progress in the development of support mechanisms and network regulation for large-scale integration of DG. Through a benchmark study, a systematic comparison has been made of different DG support schemes and distribution network regulation in EU Member States against a predefined standard, the level playing field. This level playing field is defined as the situation where energy markets, policy and regulation provide neutral incentives to central versus distributed generation, resulting in an economically more efficient electricity supply to the consumer. In current regulation and policy, a certain discrepancy can be noticed between the actual regulation and policy support systems in a number of countries, the medium- to long-term targets, and the ideal situation described by the level playing field objective. Policies towards DG and RES are now mainly aimed at removing short-term barriers and increasing the production share of DG/RES, but often ignore the more complex barriers to integrating DG/RES created by the economic network regulation in current electricity markets.

  10. Using an Integrated Distributed Test Architecture to Develop an Architecture for Mars

    Science.gov (United States)

    Othon, William L.

    2016-01-01

    The creation of a crew-rated spacecraft architecture capable of sending humans to Mars requires the development and integration of multiple vehicle systems and subsystems. Important new technologies will be identified and matured within each technical discipline to support the mission. Architecture maturity also requires coordination with mission operations elements and ground infrastructure. During early architecture formulation, many of these assets will not be co-located and will require integrated, distributed testing to show that the technologies and systems are being developed in a coordinated way. When complete, technologies must be shown to function together to achieve mission goals. In this presentation, an architecture will be described that promotes and advances integration of disparate systems within JSC and across NASA centers.

  11. Integration of distributed plant lifecycle data using ISO 15926 and Web services

    International Nuclear Information System (INIS)

    Kim, Byung Chul; Teijgeler, Hans; Mun, Duhwan; Han, Soonhung

    2011-01-01

    Highlights: → The ISO 15926 parts that provide implementation methods are under development. → A prototype of an ISO 15926-based data repository called a facade was implemented. → The prototype facade has the advantages of data interoperability and integration. → These are obtained through the features of ISO 15926 and Web services. - Abstract: Considering the financial, safety, and environmental risks related to industrial installations, it is of paramount importance that all relevant lifecycle information is readily available. Parts of this lifecycle information are stored in a plethora of computer systems, often scattered around the world and in many native formats and languages. These parts can create a complete, holistic set of lifecycle data only when they are integrated together. At present, no software is available that can integrate these parts into one coherent, distributed, and up-to-date set. The ISO 15926 standard has been developed, and in part is still under development, to overcome this problem. In this paper, the authors discuss a prototype of an ISO 15926-based data repository, called a facade, and implement its Web services for storing the equipment data of a nuclear power plant and serving the data to interested organizations. This prototype is a proof-of-concept study of the ISO 15926 parts that are currently under development and that are expected to provide implementation methods for the integration of distributed plant systems.

  12. Study concerning the power plant control and safety equipment by integrated distributed systems

    International Nuclear Information System (INIS)

    Optea, I.; Oprea, M.; Stanescu, P.

    1995-01-01

    The paper deals with the trends existing in the field of nuclear control and safety equipment and systems, proposing a high-efficiency integrated system. In order to enhance the safety of the plant and the reliability of its structures, systems and components, we present a concept based on the latest computer technology: an open, distributed system connected by a local area network with high redundancy. A modern conception for the control and safety system is to integrate all the information related to the reactor protection, active engineered safeguard and auxiliary systems parameters, offering a fast flow of information between all the agencies concerned so that situations can be quickly assessed. The integrated distributed control is based on a high-performance operating system for real-time applications, flexible enough for transparent networking and modular enough for demanding configurations. The general design considerations for nuclear reactor instrumentation reliability and testing methods for real-time functions under dynamic regimes are presented. Taking into account the fast progress in information technology, we consider the replacement of the old instrumentation of Cernavoda-1 NPP by a modern integrated system an economical and efficient solution for the next units. (Author) 20 Refs

  13. Statistical distribution of components of energy eigenfunctions: from nearly-integrable to chaotic

    International Nuclear Information System (INIS)

    Wang, Jiaozi; Wang, Wen-ge

    2016-01-01

    We study the statistical distribution of components in the non-perturbative parts of energy eigenfunctions (EFs), in which the main bodies of the EFs lie. Our numerical simulations in five models show that deviation of the distribution from the prediction of random matrix theory (RMT) is useful in characterizing the process from nearly integrable to chaotic, in a way somewhat similar to the nearest-level-spacing distribution. The statistics of EFs, however, reveals some further properties, as described below. (i) In the process of approaching quantum chaos, the distribution of components shows a delay feature compared with the nearest-level-spacing distribution in most of the models studied. (ii) In the quantum chaotic regime, the distribution of components always shows a small but notable deviation from the prediction of RMT in models possessing classical counterparts, while the deviation can be almost negligible in models not possessing classical counterparts. (iii) In models whose Hamiltonian matrices possess a clear band structure, the tails of EFs show statistical behaviors obviously different from those in the main bodies, while the difference is smaller for Hamiltonian matrices without a clear band structure.
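The RMT baseline invoked here can be illustrated numerically: components of random unit vectors (the RMT surrogate for fully chaotic eigenfunctions) should be Gaussian once rescaled, so their excess kurtosis gives a simple one-number measure of deviation from the RMT prediction. A minimal sketch (not a reproduction of the paper's five models):

```python
import math
import random
import statistics

def rescaled_components(n_dim, n_vecs, seed=0):
    """Components of random unit vectors (an RMT surrogate for chaotic
    eigenfunctions), rescaled by sqrt(n_dim) so RMT predicts unit variance."""
    rng = random.Random(seed)
    comps = []
    for _ in range(n_vecs):
        v = [rng.gauss(0.0, 1.0) for _ in range(n_dim)]
        norm = math.sqrt(sum(x * x for x in v))
        comps.extend(math.sqrt(n_dim) * x / norm for x in v)
    return comps

def excess_kurtosis(xs):
    """Fourth-moment deviation from the Gaussian value 3; near 0 under RMT,
    systematically nonzero when components deviate from the RMT prediction."""
    m = statistics.fmean(xs)
    var = statistics.fmean([(x - m) ** 2 for x in xs])
    m4 = statistics.fmean([(x - m) ** 4 for x in xs])
    return m4 / var**2 - 3.0

comps = rescaled_components(n_dim=200, n_vecs=50)
```

Applied to the non-perturbative components of actual eigenfunctions, a statistic like this would track the delayed approach to the RMT value that the paper reports.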

  14. Delineating Facies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics with Level Set Transformation.

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn Edward; Song, Xuehang; Ye, Ming; Dai, Zhenxue; Zachara, John; Chen, Xingyuan

    2017-03-01

    A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distribution obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates for the first time ensemble data assimilation with traditional transition probability-based geostatistics. The concept of level set is introduced to build shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that our new method adequately captures the spatial pattern of facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response in hydraulic head field with better accuracy compared to data assimilation with no constraints on spatial continuity on facies.
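The level-set transformation between discrete facies indicators and continuous random variables can be illustrated in one dimension: map an indicator field to a signed distance from the nearest facies boundary, and recover the indicators by thresholding at zero. A simplified sketch of the shape-parameterization idea (the paper works on 2-D fields inside an ensemble data assimilation loop):

```python
def signed_distance_1d(indicator):
    """Level-set transform of a 1-D facies indicator field: distance to the
    nearest facies boundary, signed positive inside facies 1 and negative
    inside facies 0."""
    n = len(indicator)
    boundaries = [i + 0.5 for i in range(n - 1) if indicator[i] != indicator[i + 1]]
    phi = []
    for i, f in enumerate(indicator):
        d = min((abs(i - b) for b in boundaries), default=float("inf"))
        phi.append(d if f == 1 else -d)
    return phi

def indicator_from_level_set(phi):
    """Inverse map: threshold the continuous level-set variable at zero."""
    return [1 if p > 0 else 0 for p in phi]

ind = [0, 0, 1, 1, 0]
phi = signed_distance_1d(ind)   # continuous variables the ensemble can update
```

Because phi is continuous, an ensemble smoother can perturb it with the usual Gaussian updates, and thresholding maps each updated member back to a valid discrete facies field.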

  15. Distribution of costs induced by the integration of RES-E power

    International Nuclear Information System (INIS)

    Barth, Ruediger; Weber, Christoph; Swider, Derk J.

    2008-01-01

    This article focuses on the distribution of costs induced by the integration of electricity generation from renewable energy sources (RES-E). How these costs are distributed among market actors is crucial for the development of RES-E. For this purpose, the individual actors of electricity markets and several cost categories are identified. According to the defined cost structure, possible ways of distributing the individual cost categories among the relevant actors are described. Finally, an evaluation of the cost distribution treatments based on an economic analysis is given. Economic efficiency recommends that clearly attributable (shallow) grid connection costs as well as (deep) grid costs are charged to the corresponding RES-E producer, and that RES-E producers are also charged the regulating power costs. However, deep grid integration costs should be updated to reflect evolving scarcities. Regulating power costs should also reflect actual scarcity and thus be symmetric and based on real-time prices, taking into account the overall system imbalance. Moreover, the time span between the closure of the spot market and actual delivery should be as short as possible to enable accurate RES-E production forecasts.

  16. Integrated Cost-Benefit Assessment of Customer-Driven Distributed Generation

    Directory of Open Access Journals (Sweden)

    Čedomir Zeljković

    2014-06-01

    Full Text Available Distributed generation (DG has the potential to bring respectable benefits to electricity customers, distribution utilities and community in general. Among the customer benefits, the most important are the electricity bill reduction, reliability improvement, use of recovered heat, and qualifying for financial incentives. In this paper, an integrated cost-benefit methodology for assessment of customer-driven DG is presented. Target customers are the industrial and commercial end-users that are critically dependent on electricity supply, due to high consumption, high power peak demand or high electricity supply reliability requirements. Stochastic inputs are represented by the appropriate probability models and then the Monte Carlo simulation is employed for each investment alternative. The obtained probability distributions for the prospective profit are used to assess the risk, compare the alternatives and make decisions.
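The Monte Carlo step can be sketched as follows: stochastic inputs (here, energy price and outage hours) are drawn from probability models, a profit distribution is accumulated for one investment alternative, and the expected profit and probability of a loss are read off. All parameter values below are illustrative assumptions, not figures from the paper:

```python
import random
import statistics

def simulate_profit(alt, n_trials=5000, seed=1):
    """Monte Carlo sketch: draw stochastic inputs and accumulate the
    annual-profit distribution of one DG investment alternative."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        price = max(0.0, rng.gauss(alt["price_mean"], alt["price_sd"]))  # $/kWh
        outage_h = rng.expovariate(1.0 / alt["outage_hours_mean"])       # h/year
        profits.append(alt["self_gen_kwh"] * price               # bill reduction
                       + outage_h * alt["outage_cost_per_h"]     # avoided outage cost
                       + alt["incentive"]
                       - alt["annual_cost"])
    return profits

# Illustrative cogeneration alternative (all numbers hypothetical).
cogen = {"price_mean": 0.12, "price_sd": 0.03, "self_gen_kwh": 400_000,
         "outage_hours_mean": 8.0, "outage_cost_per_h": 2_000.0,
         "incentive": 5_000.0, "annual_cost": 45_000.0}
profits = simulate_profit(cogen)
expected_profit = statistics.fmean(profits)
loss_probability = statistics.fmean([p < 0 for p in profits])  # downside risk
```

Running the same simulation for each candidate alternative yields comparable profit distributions, from which risk-aware investment decisions can be made.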

  17. Numerical study on coolant flow distribution at the core inlet for an integral pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Lin; Peng, Min Jun; Xia, Genglei; Lv, Xing; Li, Ren [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, Harbin Engineering University, Harbin (China)

    2017-02-15

    When an integral pressurized water reactor is operated under low-power conditions, a once-through steam generator group operation strategy is applied. However, group operation causes nonuniform coolant flow distribution at the core inlet and in the lower plenum. To help the coolant mix more uniformly, a flow mixing chamber (FMC) has been designed. In this paper, computational fluid dynamics methods are used to investigate the effect of the FMC on coolant distribution. Velocity and temperature characteristics under different low-power conditions and for an optimized FMC configuration have been analyzed. The results illustrate that the FMC effectively improves the nonuniform coolant temperature distribution at the core inlet; at the same time, the FMC induces additional flow resistance in the downcomer and lower plenum.

  18. Integrated Scheduling of Production and Distribution with Release Dates and Capacitated Deliveries

    Directory of Open Access Journals (Sweden)

    Xueling Zhong

    2016-01-01

    Full Text Available This paper investigates an integrated scheduling of production and distribution model in a supply chain consisting of a single machine, a customer, and a sufficient number of homogeneous capacitated vehicles. In this model, the customer places a set of orders, each of which has a given release date. All orders are first processed nonpreemptively on the machine and then batch delivered to the customer. Two variations of the model with different objective functions are studied: one is to minimize the arrival time of the last order plus total distribution cost and the other is to minimize total arrival time of the orders plus total distribution cost. For the former one, we provide a polynomial-time exact algorithm. For the latter one, due to its NP-hard property, we provide a heuristic with a worst-case ratio bound of 2.
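The model's mechanics can be sketched with a simple dispatch rule: process orders nonpreemptively in release-date order on the single machine, and send out a capacitated vehicle whenever a full batch of finished orders is waiting. This is an illustrative heuristic showing how the objective (last arrival time plus distribution cost) is evaluated, not the paper's polynomial-time exact algorithm or its ratio-2 heuristic:

```python
def schedule_and_deliver(orders, capacity, trip_time, trip_cost):
    """Process (release_date, processing_time) orders nonpreemptively in
    release-date order on one machine; dispatch a vehicle whenever `capacity`
    finished orders are waiting, plus one final trip for any remainder.
    Returns (arrival time of the last order, total distribution cost)."""
    t = 0.0
    waiting = []       # completion times of undelivered orders
    departures = []
    for release, proc in sorted(orders):
        t = max(t, release) + proc   # wait for release, then process
        waiting.append(t)
        if len(waiting) == capacity:
            departures.append(t)     # a full batch leaves immediately
            waiting = []
    if waiting:
        departures.append(t)         # final, possibly partial batch
    return departures[-1] + trip_time, trip_cost * len(departures)

last_arrival, dist_cost = schedule_and_deliver(
    [(0, 2), (1, 3), (4, 1), (10, 2)], capacity=2, trip_time=5, trip_cost=7)
```

The tension the paper studies is visible even here: dispatching smaller batches lowers arrival times but raises the number of (costly) trips, and the two objective variants weigh that trade-off differently.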

  19. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Management

  20. Distributed XQuery-based integration and visualization of multimodality brain mapping data

    Directory of Open Access Journals (Sweden)

    Landon T Detwiler

    2009-01-01

    Full Text Available This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too “heavyweight” for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a “lightweight” distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts.
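The fan-out-and-merge idea behind DXQP can be sketched with Python's standard XML tooling: ship a subquery to each source, collect the XML fragments that come back, and integrate them under a single result element. The plain callables below stand in for remote web services (an assumption for the sketch; the real system speaks HTTP and full XQuery):

```python
import xml.etree.ElementTree as ET

def run_distributed_query(sources, subquery):
    """Ship a subquery to each source, collect the XML fragments returned,
    and integrate them under a single result element (the DXQP role)."""
    merged = ET.Element("results")
    for name, query_fn in sources.items():
        fragment = ET.fromstring(query_fn(subquery))
        fragment.set("source", name)   # record provenance on each fragment
        merged.append(fragment)
    return ET.tostring(merged, encoding="unicode")

# Mock data sources returning XML fragments (hypothetical site data).
sources = {
    "lab_a": lambda q: "<sites><site id='17' effect='language'/></sites>",
    "lab_b": lambda q: "<sites><site id='42' effect='language'/></sites>",
}
xml_out = run_distributed_query(sources, "//site[@effect='language']")
```

As in DXBrain, the merged XML can then be handed to downstream renderers (tables, 3-D visualization) without those renderers needing to know where each fragment originated.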

  1. Flaw distributions and use of ISI data in RPV integrity evaluations

    International Nuclear Information System (INIS)

    Dimitrijevic, V.; Ammirato, F.

    1993-01-01

    A probabilistic method for developing post-inspection flaw distributions has been developed that explicitly accounts for the capability of the inspection procedure to detect and size flaws. This methodology has been used to develop flaw distributions for calculating reactor vessel failure probability under postulated pressurized thermal shock (PTS) conditions. Realistic flaw distributions are important because plant-specific PTS safety assessments are very sensitive to assumptions made about major flaw parameters such as density, size, shape, and location. PTS analyses performed in the past did not consider ISI, for two main reasons: (1) the lack of a general, approved methodology providing directions for incorporating ISI results when developing new flaw parameters, and (2) a lack of confidence in the capability of ISI procedures to detect critical flaws that may be present near the clad-to-base metal interface of the vessel, the location of most concern under PTS conditions. Recent developments in ISI practice, however, have led to substantial improvement in ISI capability and provide a basis for using ISI data to develop plant-specific post-inspection flaw distributions for vessel integrity evaluations. The key components of this evaluation are (1) the generic (pre-inspection) flaw distribution, (2) a probabilistic flaw detection model, and (3) Bayesian updating of the prior flaw distribution with the detection model to develop a post-inspection flaw distribution. Destructive analysis of RPV weld material was performed to develop data to support the pre-inspection flaw distributions. Since the probability of detection (POD) plays such an important role in the analysis, and a high POD is needed to make significant reductions in the probability of failure, a procedure was developed to achieve and demonstrate a POD greater than 0.9 by using a combination of independent inspection techniques.
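The Bayesian-updating step can be sketched for the simplest case, in which an inspection detects nothing: the posterior flaw-depth distribution is the prior reweighted by the non-detection probability 1 − POD(d) and renormalized. The logistic POD curve and the prior below are illustrative assumptions, not measured UT performance:

```python
import math

def pod(depth_mm, a50=3.0, slope=1.5):
    """Illustrative probability-of-detection curve, logistic in flaw depth:
    50% detection at a50 mm, rising toward 1 for deep flaws."""
    return 1.0 / (1.0 + math.exp(-slope * (depth_mm - a50)))

def posterior_given_no_detection(prior):
    """Bayesian update of a discrete flaw-depth distribution when an
    inspection finds nothing: posterior(d) is proportional to
    prior(d) * (1 - POD(d)), then renormalized."""
    unnorm = {d: p * (1.0 - pod(d)) for d, p in prior.items()}
    z = sum(unnorm.values())
    return {d: u / z for d, u in unnorm.items()}

# Prior over flaw-depth bins (mm): deep flaws are rare a priori.
prior = {1.0: 0.55, 2.0: 0.25, 4.0: 0.12, 6.0: 0.06, 8.0: 0.02}
post = posterior_given_no_detection(prior)
```

The effect is intuitive: a clean inspection with a high POD sharply suppresses the posterior probability of large flaws, which is exactly why a demonstrated POD above 0.9 yields significant reductions in the calculated failure probability.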

  2. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme for comparing different alternative realizations. In this contribution, an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
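The weighted goal-tree aggregation can be sketched as a normative weighted sum over integration-quality dimensions. The dimension names, weights, and ratings below are purely illustrative, not from the method's actual goal tree:

```python
def score_architecture(weights, ratings):
    """Weighted-sum aggregation of per-dimension ratings (0..1) into a single
    integration-capability score, as in normative decision theory."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[d] * ratings[d] for d in weights)

# Hypothetical dimensions and two competing system designs.
weights = {"communication": 0.40, "interfaces": 0.35, "software": 0.25}
design_a = {"communication": 0.8, "interfaces": 0.6, "software": 0.7}
design_b = {"communication": 0.6, "interfaces": 0.9, "software": 0.5}
score_a = score_architecture(weights, design_a)
score_b = score_architecture(weights, design_b)
best = "A" if score_a >= score_b else "B"
```

In the actual method the ratings would be derived from model-based indicators over the NAF system models, and the weights would encode the priorities of the particular enterprise integration effort.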

  3. The Integration of Renewable Energy Sources into Electric Power Distribution Systems, Vol. II Utility Case Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Zaininger, H.W.

    1994-01-01

    Electric utility distribution system impacts associated with the integration of renewable energy sources such as photovoltaics (PV) and wind turbines (WT) are considered in this project. The impacts are expected to vary from site to site according to the following characteristics: the local solar insolation and/or wind characteristics, the renewable energy source penetration level, whether battery or other energy storage systems are applied, and local utility distribution design standards and planning practices. Small, distributed renewable energy sources are connected to the utility distribution system like other, similar kW- and MW-scale equipment and loads. Residential applications are expected to be connected to single-phase 120/240-V secondaries. Larger kW-scale applications may be connected to three-phase secondaries, and larger hundred-kW and MW-scale applications, such as MW-scale windfarms or PV plants, may be connected to electric utility primary systems via customer-owned primary and secondary collection systems. In any case, the installation of small, distributed renewable energy sources is expected to have a significant impact on local utility distribution primary and secondary system economics. Small, distributed renewable energy sources installed on utility distribution systems will also produce non-site-specific utility generation system benefits, such as energy and capacity displacement benefits, in addition to the local site-specific distribution system benefits. Although generation system benefits are not site-specific, they are utility-specific, and they vary significantly among utilities in different regions. In addition, transmission system benefits, environmental benefits and other benefits may apply. These benefits also vary significantly among utilities and regions. Seven utility case studies considering PV, WT, and battery storage were conducted to identify a range of potential renewable energy source distribution system applications.

  4. D-MSR: A Distributed Network Management Scheme for Real-Time Monitoring and Process Control Applications in Wireless Industrial Automation

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. According to our knowledge, this is the first distributed management scheme based on IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead. PMID:23807687

  5. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-06-27

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. According to our knowledge, this is the first distributed management scheme based on IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  6. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major changes in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about the customers is gathered ...

  7. Development of an automated chip culture system with integrated on-line monitoring for maturation culture of retinal pigment epithelial cells

    Directory of Open Access Journals (Sweden)

    Mee-Hae Kim

    2017-10-01

    Full Text Available In cell manufacturing, the establishment of a fully automated, microfluidic cell culture system that can be used for long-term cell cultures, as well as for process optimization, is highly desirable. This study reports the development of a novel chip bioreactor system that can be used for automated long-term maturation cultures of retinal pigment epithelial (RPE) cells. The system consists of an incubation unit, a medium supply unit, a culture observation unit, and a control unit. In the incubation unit, the chip contains a closed culture vessel (2.5 mm diameter, working volume 9.1 μL) that can be set to 37 °C and 5% CO2 and uses a gas-permeable resin (polydimethylsiloxane) as the vessel wall. RPE cells were seeded at 5.0 × 10^4 cells/cm^2 and the medium was changed every day by introducing fresh medium using the medium supply unit. Culture solutions were stored either in the refrigerator or the freezer, and fresh medium was prepared before any medium change by warming to 37 °C and mixing. Automated culture was allowed to continue for 30 days to allow maturation of the RPE cells. This chip culture system allows for the long-term, bubble-free culture of RPE cells, while also making it possible to observe the cells in order to elucidate their morphology or show the presence of tight junctions. This culture system, along with an integrated on-line monitoring system, can therefore be applied to long-term cultures of RPE cells, and should contribute to process control in RPE cell manufacturing.

  8. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Wu Binbin, E-mail: binbin.wu@gunet.georgetown.edu [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); McNutt, Todd [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Zahurak, Marianna [Department of Oncology Biostatistics, Johns Hopkins University, Baltimore, Maryland (United States); Simari, Patricio [Autodesk Research, Toronto, ON (Canada); Pang, Dalong [Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); Taylor, Russell [Department of Computer Science, Johns Hopkins University, Baltimore, Maryland (United States); Sanguineti, Giuseppe [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States)

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 Gy (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  9. Fully Automated Simultaneous Integrated Boosted–Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-01-01

Purpose: To prospectively determine whether overlap volume histogram (OVH)–driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing, with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, the overall average dose to the secondary organs was reduced by 1.16 Gy (95% CI = 0.09-2.33) with P=.04, the overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02, and the overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.
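
    The statistical comparison can be illustrated with a much-simplified paired analysis. The sketch below computes the mean AP-versus-CP dose reduction with a normal-approximation confidence interval; it is not the paper's GEE model (which accounts for correlated measurements within patients), and the dose values are hypothetical:

```python
import math

def paired_mean_ci(ap, cp, z=1.96):
    """Mean paired dose difference (CP - AP) with a normal-approximation
    95% confidence interval. A positive mean means the automated plan
    lowered the dose relative to the clinical plan. This is a simplified
    stand-in for the paper's GEE regression model.
    """
    diffs = [c - a for a, c in zip(ap, cp)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean, (mean - z * se, mean + z * se)

# Hypothetical per-patient mean doses (Gy) to one secondary organ
cp_dose = [26.0, 31.5, 24.2, 28.9, 30.1]
ap_dose = [24.8, 30.6, 23.5, 27.7, 29.4]
reduction, ci = paired_mean_ci(ap_dose, cp_dose)
```

    A GEE model generalizes this by pooling such paired contrasts across organs and patients while modelling the within-patient correlation.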

  10. Integrating Data Distribution and Data Assimilation Between the OOI CI and the NOAA DIF

    Science.gov (United States)

    Meisinger, M.; Arrott, M.; Clemesha, A.; Farcas, C.; Farcas, E.; Im, T.; Schofield, O.; Krueger, I.; Klacansky, I.; Orcutt, J.; Peach, C.; Chave, A.; Raymer, D.; Vernon, F.

    2008-12-01

    The Ocean Observatories Initiative (OOI) is an NSF funded program to establish the ocean observing infrastructure of the 21st century benefiting research and education. It is currently approaching final design and promises to deliver cyber and physical observatory infrastructure components as well as substantial core instrumentation to study environmental processes of the ocean at various scales, from coastal shelf-slope exchange processes to the deep ocean. The OOI's data distribution network lies at the heart of its cyber- infrastructure, which enables a multitude of science and education applications, ranging from data analysis, to processing, visualization and ontology supported query and mediation. In addition, it fundamentally supports a class of applications exploiting the knowledge gained from analyzing observational data for objective-driven ocean observing applications, such as automatically triggered response to episodic environmental events and interactive instrument tasking and control. The U.S. Department of Commerce through NOAA operates the Integrated Ocean Observing System (IOOS) providing continuous data in various formats, rates and scales on open oceans and coastal waters to scientists, managers, businesses, governments, and the public to support research and inform decision-making. The NOAA IOOS program initiated development of the Data Integration Framework (DIF) to improve management and delivery of an initial subset of ocean observations with the expectation of achieving improvements in a select set of NOAA's decision-support tools. Both OOI and NOAA through DIF collaborate on an effort to integrate the data distribution, access and analysis needs of both programs. 
We present details and early findings from this collaboration; one part of it is the development of a demonstrator combining web-based user access to oceanographic data through ERDDAP, efficient science data distribution, and scalable, self-healing deployment in a cloud computing environment.

  11. INTEGRATION OF COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE PRODUCTION CERTIFICATION PROCEDURE AND FORMING OF SHIPPING DOCUMENTS

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2009-01-01

    Full Text Available Integration of information and computer technologies made it possible to reorganize and optimize a number of processes by reducing document circulation, unifying documentation forms, and other measures.

  12. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project, designed and implemented to automate a house's electricity and to provide a security system that detects unexpected behavior.

  13. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    Science.gov (United States)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains unclear. The integration of geographic information systems (GIS) and geostatistics promises a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were taken on a 10 by 10 m grid from tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that the variance associated with soil moisture was higher for vegetated plots than non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.
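
    The geostatistical step, estimating how soil-moisture similarity decays with distance, typically rests on the empirical semivariogram. A minimal sketch of the classical (Matheron) estimator on gridded samples; the grid coordinates and moisture values below are invented, and this is an illustration of the general technique rather than the study's exact workflow:

```python
from itertools import combinations

def empirical_semivariogram(points, values, bin_width, n_bins):
    """Classical semivariance, binned by pair separation distance.

    points: (x, y) grid coordinates; values: soil moisture at each
    point. Returns one semivariance estimate per distance bin, or
    None for empty bins.
    """
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i, j in combinations(range(len(points)), 2):
        dx = points[i][0] - points[j][0]
        dy = points[i][1] - points[j][1]
        h = (dx * dx + dy * dy) ** 0.5          # lag distance
        b = int(h // bin_width)
        if b < n_bins:
            sums[b] += 0.5 * (values[i] - values[j]) ** 2
            counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# 1-D transect example: moisture varies smoothly along a 10 m grid
pts = [(i * 10.0, 0.0) for i in range(5)]
vals = [0.20, 0.22, 0.25, 0.29, 0.34]
gamma = empirical_semivariogram(pts, vals, bin_width=10.0, n_bins=4)
```

    For smoothly varying data the semivariance rises with lag, which is the structure a fitted variogram model then exploits for kriging interpolation.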

  14. Spatial distribution of dust in galaxies from the Integral field unit data

    Science.gov (United States)

    Zafar, Tayyaba; Dubber, Sophie; Hopkins, Andrew

    2018-01-01

    An important characteristic of dust is that it can be used as a tracer of stars (and gas) and can tell us about the composition of galaxies. Sub-mm and infrared studies can accurately determine the total dust mass and its spatial distribution in massive, bright galaxies. However, studies of the spatial dust distribution in faint and distant galaxies are hampered by limited resolution. In the era of integral-field spectrographs (IFS), the Balmer decrement is a useful quantity for inferring the spatial extent of dust in distant and low-mass galaxies. We conducted a study to estimate the spatial distribution of dust using the Sydney-Australian Astronomical Observatory (AAO) Multi-object Integral field spectrograph (SAMI) galaxies. Our methodology is unique in exploiting the potential of IFS, using the spatial and spectral information together to study dust in galaxies of various morphological types. The spatial extent and content of dust are compared with the star-formation rate, reddening, and inclination of galaxies. We find a direct correlation of dust spatial extent with the star-formation rate. The results also indicate a decrease in dust extent radius from Late Spirals to Early Spirals.
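
    The Balmer-decrement method referenced here converts the observed Hα/Hβ flux ratio into a nebular colour excess, which can be mapped per spaxel across an IFS datacube. A minimal sketch, assuming Case B recombination (intrinsic ratio 2.86) and approximate extinction-curve coefficients at the two line wavelengths (treat the exact k values as assumptions, not the paper's adopted curve):

```python
import math

# Approximate extinction-curve values at Halpha and Hbeta (assumed)
K_HALPHA, K_HBETA = 2.53, 3.61
INTRINSIC_RATIO = 2.86   # Case B recombination Halpha/Hbeta

def ebv_from_balmer(f_halpha, f_hbeta):
    """Nebular colour excess E(B-V) from the observed Balmer decrement."""
    ratio = f_halpha / f_hbeta
    return 2.5 / (K_HBETA - K_HALPHA) * math.log10(ratio / INTRINSIC_RATIO)

ebv_zero = ebv_from_balmer(2.86, 1.0)   # intrinsic ratio -> no reddening
ebv_dusty = ebv_from_balmer(4.0, 1.0)   # steeper decrement -> more dust
```

    Applying this spaxel by spaxel yields the reddening map whose radial extent can then be compared against star-formation rate and inclination.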

  15. Multi-agent system for energy resource scheduling of integrated microgrids in a distributed system

    International Nuclear Information System (INIS)

    Logenthiran, T.; Srinivasan, Dipti; Khambadkone, Ashwin M.

    2011-01-01

    This paper proposes a multi-agent system for energy resource scheduling of an islanded power system with distributed resources, which consists of integrated microgrids and lumped loads. Distributed intelligent multi-agent technology is applied to make the power system more reliable, efficient and capable of exploiting and integrating alternative sources of energy. The algorithm behind the proposed energy resource scheduling has three stages. The first stage is to schedule each microgrid individually to satisfy its internal demand. The next stage involves finding the best possible bids for exporting power to the network and competing in a wholesale energy market. The final stage is to reschedule each microgrid individually to satisfy the total demand, which is the sum of the internal demand and the demand resulting from the wholesale energy market simulation. The simulation results of a power system with distributed resources comprising three microgrids and five lumped loads show that the proposed multi-agent system allows efficient management of micro-sources with minimum operational cost. The case studies demonstrate that the system is successfully monitored, controlled and operated by means of the developed multi-agent system. (author)

  16. Multi-agent system for energy resource scheduling of integrated microgrids in a distributed system

    Energy Technology Data Exchange (ETDEWEB)

    Logenthiran, T.; Srinivasan, Dipti; Khambadkone, Ashwin M. [Department of Electrical and Computer Engineering, National University of Singapore, 4 Engineering Drive 3, Singapore 117576 (Singapore)

    2011-01-15

    This paper proposes a multi-agent system for energy resource scheduling of an islanded power system with distributed resources, which consists of integrated microgrids and lumped loads. Distributed intelligent multi-agent technology is applied to make the power system more reliable, efficient and capable of exploiting and integrating alternative sources of energy. The algorithm behind the proposed energy resource scheduling has three stages. The first stage is to schedule each microgrid individually to satisfy its internal demand. The next stage involves finding the best possible bids for exporting power to the network and competing in a wholesale energy market. The final stage is to reschedule each microgrid individually to satisfy the total demand, which is the sum of the internal demand and the demand resulting from the wholesale energy market simulation. The simulation results of a power system with distributed resources comprising three microgrids and five lumped loads show that the proposed multi-agent system allows efficient management of micro-sources with minimum operational cost. The case studies demonstrate that the system is successfully monitored, controlled and operated by means of the developed multi-agent system. (author)
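
    Stage one, each microgrid scheduling itself to meet internal demand, can be caricatured as merit-order dispatch: serve load from the cheapest available units first and report any shortfall for stage two to bid on. The sketch below is a toy stand-in for the agents' internal scheduling; the unit names, capacities and costs are invented, and the paper's algorithm is richer:

```python
def dispatch(units, demand):
    """Greedy merit-order dispatch for one microgrid (stage one).

    units: (name, capacity_kW, cost_per_kWh) tuples; cheapest units
    are loaded first. Returns per-unit set-points, running cost, and
    any unserved demand (which stage two would bid for on the
    wholesale market).
    """
    plan, cost, remaining = {}, 0.0, float(demand)
    for name, cap, unit_cost in sorted(units, key=lambda u: u[2]):
        p = min(cap, remaining)          # load this unit up to capacity
        plan[name] = p
        cost += p * unit_cost
        remaining -= p
        if remaining <= 0.0:
            break
    return plan, cost, remaining

# Hypothetical micro-sources in one microgrid
units = [("diesel", 60.0, 0.30), ("pv", 40.0, 0.05), ("wind", 30.0, 0.08)]
plan, cost, shortfall = dispatch(units, 100.0)
```

    In the multi-agent setting each microgrid agent would run such a schedule locally and then negotiate the residual import/export with the market agent.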

  17. Integration of distributed energy resources into low voltage grid: A market-based multiperiod optimization model

    Energy Technology Data Exchange (ETDEWEB)

    Mashhour, Elahe; Moghaddas-Tafreshi, S.M. [Faculty of Electrical Engineering, K.N. Toosi University of Technology, Seyd Khandan, P.O. Box 16315-1355, Shariati, Tehran (Iran)

    2010-04-15

    This paper develops a multiperiod optimization model for an interconnected microgrid with hierarchical control that participates in the wholesale energy market to maximize its benefit (i.e., revenues minus costs). In addition to the operational constraints of distributed energy resources (DER), including both inter-temporal and non-inter-temporal types, the adequacy and steady-state security constraints of the microgrid and its power losses are incorporated in the optimization model. In the presented model, DER are integrated into the low voltage grid considering both technical and economical aspects. This integration as a microgrid can participate in the wholesale energy market as an entity with a dual role, producer and consumer, depending on the direction of the exchanged power. The developed model is evaluated by testing on a microgrid considering different cases, and the results are analyzed. (author)

  18. Integrating a dynamic data federation into the ATLAS distributed data management system

    CERN Document Server

    Berghaus, Frank; The ATLAS collaboration

    2018-01-01

    Input data for applications that run in cloud computing centres can be stored at remote repositories, typically with multiple copies of the most popular data stored at many sites. Locating and retrieving the remote data can be challenging, and we believe that federating the storage can address this problem. In this approach, the closest copy of the data is used based on geographical or other information. Currently, we are using the dynamic data federation, Dynafed, a software solution developed by CERN IT. Dynafed supports several industry standards for connection protocols, such as Amazon S3, Microsoft Azure and HTTP with WebDAV extensions. Dynafed functions as an abstraction layer under which protocol-dependent authentication details are hidden from the user, requiring the user to only provide an X509 certificate. We have set up an instance of Dynafed and integrated it into the ATLAS distributed data management system, Rucio. We report on the challenges faced during the installation and integration.

  19. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    Directory of Open Access Journals (Sweden)

    Xiang Gao

    2016-07-01

    Full Text Available This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning-system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To solve both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and combines sensor data from different positions. Initially, a multi-sensor integration method, together with the path of airflow, is used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor-source localization simulation and a real experiment are presented.
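
    The idea of fusing odor readings with airflow direction can be sketched as grid-based evidence accumulation: a strong reading raises the score of cells lying upwind of the sensor, and readings from sensors at different positions reinforce the cells near the true source. The weighting scheme and grid below are invented for illustration; the paper's algorithm differs:

```python
def update_source_map(grid, sensor, wind, reading, step=1.0):
    """Accumulate evidence for the odor-source location on a coarse grid.

    grid: dict (i, j) -> score. A high `reading` at `sensor` raises the
    score of cells upwind (against the `wind` vector), with nearer
    cells weighted more strongly.
    """
    wx, wy = wind
    norm = (wx * wx + wy * wy) ** 0.5 or 1.0
    ux, uy = -wx / norm, -wy / norm          # upwind unit vector
    x, y = sensor
    for k in range(1, 6):                    # walk five cells upwind
        cell = (round(x + ux * k * step), round(y + uy * k * step))
        if cell in grid:
            grid[cell] += reading / k        # distance-weighted evidence
    return grid

grid = {(i, j): 0.0 for i in range(10) for j in range(10)}
update_source_map(grid, sensor=(5, 5), wind=(1.0, 0.0), reading=1.0)
```

    Repeating the update for several sensor positions concentrates the score around the most probable source cell, without requiring a vehicle to trace the whole plume.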

  20. Highly localized distributed Brillouin scattering response in a photonic integrated circuit

    Science.gov (United States)

    Zarifi, Atiyeh; Stiller, Birgit; Merklein, Moritz; Li, Neuton; Vu, Khu; Choi, Duk-Yong; Ma, Pan; Madden, Stephen J.; Eggleton, Benjamin J.

    2018-03-01

    The interaction of optical and acoustic waves via stimulated Brillouin scattering (SBS) has recently reached on-chip platforms, which has opened new fields of applications ranging from integrated microwave photonics and on-chip narrow-linewidth lasers, to phonon-based optical delay and signal processing schemes. Since SBS is an effect that scales exponentially with interaction length, on-chip implementation on a short length scale is challenging, requiring carefully designed waveguides with optimized opto-acoustic overlap. In this work, we use the principle of Brillouin optical correlation domain analysis to locally measure the SBS spectrum with high spatial resolution of 800 μm and perform a distributed measurement of the Brillouin spectrum along a spiral waveguide in a photonic integrated circuit. This approach gives access to local opto-acoustic properties of the waveguides, including the Brillouin frequency shift and linewidth, essential information for the further development of high quality photonic-phononic waveguides for SBS applications.

  1. Calculations of Neutron Flux Distributions by Means of Integral Transport Methods

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Flux distributions have been calculated, mainly in one energy group, for a number of systems representing geometries of interest for reactor calculations. Integral transport methods of two kinds were utilised: collision probabilities (CP) and the discrete method (DIT). The geometries considered comprise the three one-dimensional geometries, planar, spherical and annular, and further a square cell with a circular fuel rod and a rod cluster cell with a circular outer boundary. For the annular cells both methods (CP and DIT) were used and the results were compared. The purpose of the work is twofold: firstly, to demonstrate the versatility and efficacy of integral transport methods, and secondly, to serve as a guide for anybody who wants to use the methods.

  2. Two efficient heuristics to solve the integrated load distribution and production planning problem

    International Nuclear Information System (INIS)

    Gajpal, Yuvraj; Nourelfath, Mustapha

    2015-01-01

    This paper considers a multi-period production system where a set of machines are arranged in parallel. The machines are unreliable, and the failure rate of each machine depends on the load assigned to it. The expected production rate of the system is considered to be a non-monotonic function of its load. Because of the machine failure rates, the total production output depends on the combination of loads assigned to the different machines. We consider the integration of load distribution decisions with production planning decisions. The product demands are considered to be known in advance. The objective is to minimize the sum of holding costs, backorder costs, production costs, setup costs, capacity change costs and unused capacity costs while satisfying the demand over a specified time horizon. The constraint is not to exceed the repair resources available for repairing machine breakdowns. The paper develops two heuristics to solve the integrated load distribution and production planning problem. The first heuristic consists of a three-phase approach, while the second one is based on the tabu search metaheuristic. The efficiency of the proposed heuristics is tested through randomly generated problem instances. - Highlights: • The expected performance of the system is a non-monotonic function of its load. • We consider the integration of load distribution and production planning decisions. • The paper proposes three-phase and tabu search based heuristics to solve the problem. • A lower bound has been developed for checking the effectiveness of the heuristics. • The efficiency of the heuristics is tested through randomly generated instances.
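
    The tabu search metaheuristic mentioned as the second heuristic follows a standard pattern: move to the best admissible neighbour while a short-term memory forbids recently visited solutions, with an aspiration criterion that admits tabu moves when they improve on the best solution found. A generic sketch on a toy load-balancing instance; the instance, neighbourhood and parameters are invented, and the paper's neighbourhoods and cost terms are richer:

```python
import random

def tabu_search(initial, neighbours, objective, iters=200, tenure=7, seed=0):
    """Minimal tabu-search loop (generic, not the authors' implementation).

    neighbours(s, rng) yields candidate solutions; a visited solution
    stays tabu for `tenure` iterations unless it beats the best
    objective so far (aspiration criterion).
    """
    rng = random.Random(seed)
    best = current = initial
    tabu = {}
    for it in range(iters):
        cands = [n for n in neighbours(current, rng)
                 if tabu.get(n, -1) < it or objective(n) < objective(best)]
        if not cands:
            continue
        current = min(cands, key=objective)   # best admissible move
        tabu[current] = it + tenure
        if objective(current) < objective(best):
            best = current
    return best

# Toy instance: spread 10 load units over 3 parallel machines whose
# cost penalises unbalanced (extreme) loadings.
def neighbours(s, rng):
    for _ in range(10):
        a, b = rng.sample(range(3), 2)
        if s[a] > 0:                          # shift one unit a -> b
            shifted = list(s)
            shifted[a] -= 1
            shifted[b] += 1
            yield tuple(shifted)

cost = lambda s: sum((x - 3) ** 2 for x in s)
best = tabu_search((10, 0, 0), neighbours, cost)
```

    The tabu list prevents the search from cycling back through recent loadings, which is what lets it escape the local structure a pure greedy descent would get stuck in.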

  3. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    Science.gov (United States)

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  4. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    Energy Technology Data Exchange (ETDEWEB)

    Spaggiari, M; Herrero, V; Lerche, C W; Aliaga, R; Monzo, J M; Gadea, R, E-mail: michele.spaggiari@gmail.com [Instituto de Instrumentacion para Imagen Molecular (I3M), Universidad Politecnica de Valencia, Camino de Vera, 46022, Valencia (Spain)

    2011-01-15

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about the light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a copy of each input current to several processing blocks. The current preamplifier is designed in order to achieve unconditional stability under high input capacitance, thus allowing the use of both Photo-Multiplier Tubes (PMT) and Silicon Photo-Multipliers (SiPM). Each processing block implements an analog current filtering by multiplying each input current by a programmable 8-bit coefficient. The latter is implemented through a highly linear MOS current divider ladder, whose high sensitivity to variations in output voltages requires the integration of an extremely stable fully differential current collector. Output currents are then summed and sent to the output stage, which provides both a buffered output current and a linear rail-to-rail voltage for further digitization. Since computation is purely additive, the 64 input channels of AMIC do not represent a limitation in the number of the detector's outputs. Current outputs of various AMIC structures can be combined as inputs of a final AMIC, thus providing a fully expandable structure. In this version of AMIC, 8 programmable blocks for moments calculation are integrated, as well as an I2C interface in order to program every coefficient. Extracted layout simulation results demonstrate that the information provided by moment calculation in AMIC helps to improve tridimensional positioning of the detected event. A two-detector test-bench is now being used for AMIC prototype characterization and preliminary results are presented.
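
    The moment analysis that AMIC realises in analog circuitry, one weighted sum of the detector currents per programmable coefficient set, can be illustrated digitally. A sketch of the first few moments of a 1-D light distribution; the signal values and detector positions are hypothetical:

```python
def light_moments(signals, xs):
    """Energy, centroid and width of a sampled 1-D light distribution.

    signals: charge per photodetector; xs: detector x-positions (mm).
    AMIC computes such weighted sums in analog hardware with
    programmable 8-bit coefficients; this is the digital equivalent.
    """
    m0 = sum(signals)                                    # energy
    m1 = sum(s * x for s, x in zip(signals, xs)) / m0    # position (centroid)
    m2 = sum(s * (x - m1) ** 2 for s, x in zip(signals, xs)) / m0
    return m0, m1, m2    # m2 (spread) relates to depth of interaction

sig = [1.0, 4.0, 9.0, 4.0, 1.0]            # symmetric light spot
xs = [-10.0, -5.0, 0.0, 5.0, 10.0]
energy, position, width = light_moments(sig, xs)
```

    Higher moments such as skewness follow the same additive pattern, which is why the architecture scales by simply summing the current outputs of multiple AMIC stages.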

  5. Distributed Energy Systems Integration and Demand Optimization for Autonomous Operations and Electric Grid Transactions

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Greenlots, San Francisco, CA (United States); Mashayekh, Salman [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stadler, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Center for Energy and Innovation Technologies (Austria); Yin, Rongxin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Liu, Zhenhua [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-11-29

    Distributed power systems in the U.S. and globally are evolving to provide reliable and clean energy to consumers. In California, existing regulations require significant increases in renewable generation, as well as identification of customer-side distributed energy resources (DER) controls, communication technologies, and standards for interconnection with the electric grid systems. As DER deployment expands, customer-side DER control and optimization will be critical for system flexibility and demand response (DR) participation, which improves the economic viability of DER systems. Current DER systems integration and communication challenges include leveraging the existing DER and DR technology and systems infrastructure, and enabling optimized cost, energy and carbon choices for customers to deploy interoperable grid transactions and renewable energy systems at scale. Our paper presents a cost-effective solution to these challenges by exploring communication technologies and information models for DER system integration and interoperability. This system uses open standards and optimization models for resource planning based on dynamic-pricing notifications and autonomous operations within various domains of the smart grid energy system. It identifies architectures and customer engagement strategies in dynamic DR pricing transactions to generate feedback information models for load flexibility, load profiles, and participation schedules. The models are tested at a real site in California—Fort Hunter Liggett (FHL). Furthermore, our results for FHL show that the model fits within the existing and new DR business models and networked systems for transactive energy concepts. Integrated energy systems, communication networks, and modeling tools that coordinate supply-side networks and DER will enable electric grid system operators to use DER for grid transactions in an integrated system.

  6. A Distributed Model Predictive Control approach for the integration of flexible loads, storage and renewables

    DEFF Research Database (Denmark)

    Ferrarini, Luca; Mantovani, Giancarlo; Costanzo, Giuseppe Tommaso

    2014-01-01

    This paper presents an innovative solution based on distributed model predictive controllers to integrate the control and management of energy consumption, energy storage, PV and wind generation at the customer side. The overall goal is to enable an advanced prosumer to autoproduce part of the energy he needs with renewable sources and, at the same time, to optimally exploit the thermal and electrical storages, to trade off its comfort requirements with different pricing schemes (including real-time pricing), and apply optimal control techniques rather than sub-optimal heuristics.

  7. Integrated Spectral Energy Distributions and Absorption Feature Indices of Single Stellar Populations

    OpenAIRE

    Zhang, Fenghui; Han, Zhanwen; Li, Lifang; Hurley, Jarrod R.

    2004-01-01

    Using evolutionary population synthesis, we present integrated spectral energy distributions and absorption-line indices defined by the Lick Observatory image dissector scanner (referred to as Lick/IDS) system, for an extensive set of instantaneous burst single stellar populations (SSPs). The ages of the SSPs are in the range 1-19 Gyr and the metallicities [Fe/H] are in the range -2.3 - 0.2. Our models use the rapid single stellar evolution algorithm of Hurley, Pols and Tout for the stellar e...

  8. Analysis of thevenin equivalent network of a distribution system for solar integration studies

    DEFF Research Database (Denmark)

    Yang, Guangya; Mattesen, Majken; Kjaer, Søren Bækhøj

    2012-01-01

    generations and expected to play a significant role in the future sustainable energy system. Currently one of the main issues for solar integration is the voltage regulation problem in the LV grid, owing to the small X/R ratios. Hence, the voltage control techniques developed for the MV and HV networks may need to be further evaluated before being applied to the LV grid. For the inverter voltage control design, it is useful to develop a realistic Thevenin equivalent model for the grid to ease the analysis. In this paper, case studies are performed based on the analysis of a realistic distribution network for the design...

  9. Experimental integration of quantum key distribution and gigabit-capable passive optical network

    Science.gov (United States)

    Sun, Wei; Wang, Liu-Jun; Sun, Xiang-Xiang; Mao, Yingqiu; Yin, Hua-Lei; Wang, Bi-Xiao; Chen, Teng-Yun; Pan, Jian-Wei

    2018-01-01

    Quantum key distribution (QKD) ensures information-theoretic security for the distribution of random bits between two remote parties. To extend QKD applications to fiber-to-the-home optical communications, such as gigabit-capable passive optical networks (GPONs), an effective method is the use of wavelength-division multiplexing. However, the Raman scattering noise from intense classical traffic and the huge loss introduced by the beam splitter in a GPON severely limit the performance of QKD. Here, we demonstrate the integration of QKD and a commercial GPON system with fiber lengths up to 14 km, in which the maximum splitting ratio of the beam splitter reaches 1:64. By placing the QKD transmitter on the optical line terminal side, we reduce the Raman noise collected at the QKD receiver. Using a bypass structure, the loss of the beam splitter is circumvented effectively. Our results pave the way to extending the applications of QKD to last-mile communications.

  10. Integrating Renewable Energy into the Transmission and Distribution System of the U. S. Virgin Islands

    Energy Technology Data Exchange (ETDEWEB)

    Burman, K.; Olis, D.; Gevorgian, V.; Warren, A.; Butt, R.; Lilienthal, P.; Glassmire, J.

    2011-09-01

    This report focuses on the economic and technical feasibility of integrating renewable energy technologies into the U.S. Virgin Islands transmission and distribution systems. The report includes three main areas of analysis: 1) the economics of deploying utility-scale renewable energy technologies on St. Thomas/St. John and St. Croix; 2) potential sites for installing roof- and ground-mounted PV systems and wind turbines, and the impact renewable generation will have on the electrical subtransmission and distribution infrastructure; and 3) the feasibility of a 100- to 200-megawatt power interconnection of the Puerto Rico Electric Power Authority (PREPA), Virgin Islands Water and Power Authority (WAPA), and British Virgin Islands (BVI) grids via a submarine cable system.

  11. Automated sub-5 nm image registration in integrated correlative fluorescence and electron microscopy using cathodoluminescence pointers

    Science.gov (United States)

    Haring, Martijn T.; Liv, Nalan; Zonnevylle, A. Christiaan; Narvaez, Angela C.; Voortman, Lenard M.; Kruit, Pieter; Hoogenboom, Jacob P.

    2017-03-01

    In the biological sciences, data from fluorescence and electron microscopy is correlated to allow fluorescence biomolecule identification within the cellular ultrastructure and/or ultrastructural analysis following live-cell imaging. High-accuracy (sub-100 nm) image overlay requires the addition of fiducial markers, which makes overlay accuracy dependent on the number of fiducials present in the region of interest. Here, we report an automated method for light-electron image overlay at high accuracy, i.e. below 5 nm. Our method relies on direct visualization of the electron beam position in the fluorescence detection channel using cathodoluminescence pointers. We show that image overlay using cathodoluminescence pointers corrects for image distortions, is independent of user interpretation, and does not require fiducials, allowing image correlation with molecular precision anywhere on a sample.

  12. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  13. Integrating multiple distribution models to guide conservation efforts of an endangered toad

    Science.gov (United States)

    Treglia, Michael L.; Fisher, Robert N.; Fitzgerald, Lee A.

    2015-01-01

    Species distribution models are used for numerous purposes such as predicting changes in species’ ranges and identifying biodiversity hotspots. Although implications of distribution models for conservation are often implicit, few studies use these tools explicitly to inform conservation efforts. Herein, we illustrate how multiple distribution models developed using distinct sets of environmental variables can be integrated to aid in identifying sites for use in conservation. We focus on the endangered arroyo toad (Anaxyrus californicus), which relies on open, sandy streams and surrounding floodplains in southern California, USA, and northern Baja California, Mexico. Declines of the species are largely attributed to habitat degradation associated with vegetation encroachment, invasive predators, and altered hydrologic regimes. We had three main goals: 1) develop a model of potential habitat for arroyo toads, based on long-term environmental variables and all available locality data; 2) develop a model of the species’ current habitat by incorporating recent remotely-sensed variables and only using recent locality data; and 3) integrate results of both models to identify sites that may be employed in conservation efforts. We used a machine learning technique, Random Forests, to develop the models, focused on riparian zones in southern California. We identified 14.37% and 10.50% of our study area as potential and current habitat for the arroyo toad, respectively. Generally, inclusion of remotely-sensed variables reduced modeled suitability of sites, thus many areas modeled as potential habitat were not modeled as current habitat. We propose such sites could be made suitable for arroyo toads through active management, increasing current habitat by up to 67.02%. Our general approach can be employed to guide conservation efforts of virtually any species with sufficient data necessary to develop appropriate distribution models.
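    The two-model workflow above can be sketched with scikit-learn's RandomForestClassifier on synthetic data: one model fit on long-term variables (potential habitat), a second adding recent remotely-sensed variables (current habitat), and their disagreement highlighting candidate sites for management. All variable names, the synthetic data, and the 0.5 suitability threshold are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins: long-term climate variables vs. recent remotely-sensed ones.
X_longterm = rng.normal(size=(200, 4))
X_recent = rng.normal(size=(200, 3))
presence = (X_longterm[:, 0] + X_recent[:, 0] > 0).astype(int)

# Model 1: potential habitat from long-term variables only.
potential_model = RandomForestClassifier(n_estimators=100, random_state=0)
potential_model.fit(X_longterm, presence)

# Model 2: current habitat, adding the remotely-sensed variables.
current_model = RandomForestClassifier(n_estimators=100, random_state=0)
current_model.fit(np.hstack([X_longterm, X_recent]), presence)

# Suitability per site (probability of presence) from each model.
p_potential = potential_model.predict_proba(X_longterm)[:, 1]
p_current = current_model.predict_proba(np.hstack([X_longterm, X_recent]))[:, 1]

threshold = 0.5
potential_habitat = p_potential >= threshold
current_habitat = p_current >= threshold

# Sites modeled as potential but not current habitat are the candidates
# for active management to expand current habitat.
restoration_candidates = potential_habitat & ~current_habitat
```

    The integration step is deliberately simple: thresholding each model and taking the set difference mirrors the paper's idea that potential-but-not-current sites are where management could act.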

  14. Integrating Smart Resources in ROS-based systems to distribute services

    Directory of Open Access Journals (Sweden)

    Eduardo MUNERA

    2017-03-01

    Full Text Available Mobile robots need to manage many sensors and actuators using micro-controllers. To perform complex tasks, a central unit with high computational capacity is also needed. In many cases, a robot is an intelligent distributed system formed by a central unit, which manages and distributes several specific tasks to micro-controller embedded systems on board. These embedded systems are now also evolving into more complex systems that are developed not only for executing simple tasks but for offering advanced capabilities such as complex data processing, adaptive execution, or fault-tolerance and alarm-raising mechanisms. To manage these types of embedded systems, a paradigm called Smart Resource has been developed. The Smart Resource topology was conceived to manage resources whose execution relies on physical embedded hardware. These Smart Resources are defined as a list of distributed services that can configure their execution in order to satisfy context and quality requirements. In order to provide a more general implementation, Smart Resources are integrated into the Robot Operating System (ROS). The paper presents a solution based on the Turtlebot platform running ROS. The solution shows how robots can make use of all the functions and mechanisms provided by ROS together with the distribution, reliability and adaptability of the Smart Resources. In addition, the flexibility and scalability of the implementation are addressed by combining real and simulated devices on the same platform.

  15. On the relevancy of efficient, integrated computer and network monitoring in HEP distributed online environment

    International Nuclear Information System (INIS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Javello, J.; Miere, Y.; Ruffinoni, D.; Albert, J.N.; Bellas, N.; Smith, G.

    1996-01-01

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, tasks are more and more often implemented as client-server applications. In this framework, the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments in view, such as those at the LHC, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system. (author)

  16. Maximized integration of photovoltaics into distribution grids using smart charging strategies for electric vehicles; Maximierte PV-Integration in Niederspannungsnetzen durch intelligente Nutzung von Elektrofahrzeugen

    Energy Technology Data Exchange (ETDEWEB)

    Troeschel, Martin; Scherfke, Stefan; Schuette, Steffen; Appelrath, H Juergen; Sonnenschein, Michael [OFFIS - Institut fuer Informatik, Oldenburg (Germany). Bereich Energie

    2011-07-01

    ICT-based integration of electric vehicles (EV) offers promising potential at the distribution grid level, especially regarding synergies with renewable and distributed energy systems. Of special interest are (a) the charging of EV with electric power from renewable energy sources and (b) a preferentially local usage of feed-in from distributed energy systems. For a systemic analysis of electromobility, stable and reliable operation of the electric power grid is a major constraint on the integration of EV and the maximized usage of renewable energy. In the model project GridSurfer, we conducted simulation-based analyses of the integration of EV. In this contribution, we present results from an analysis of future smart-grid scenarios with special regard to rural areas and distribution grids in north-western Germany. (orig.)

  17. Advancing a Distributive-Bargaining and Integrative-Negotiation Integral System: A Values-Based Negotiation Model (VBM

    Directory of Open Access Journals (Sweden)

    Ivan Gan

    2017-09-01

    Full Text Available The proposed values-based negotiation model (VBM) agrees with and extends principled negotiation’s recognition of personal values and emotions as important negotiation elements. First, building upon Martin Buber’s existentialist treatment of religion and secularism, VBM centers on religion as one of many possible sources of personal values that informs respectful and mutually beneficial interactions without requiring one to be religious. Just as one need not be a Buddhist or a Hindu to practice yoga, negotiators of any theological outlook can profit from a model grounded in broad, common tenets drawn from a range of organized religions. Second, VBM distinguishes feelings from emotions because the long-lasting and intrinsically stimulated effects of feelings have greater implications for the perception of negotiated outcomes. VBM negotiators view negotiations as a constitutive prosocial process whereby parties consider the outcome important enough to invest time and energy. Negotiators who use VBM appeal to the goodness of their counterparts by doing good first so that both parties avoid a win-lose outcome. This counterintuitive move contradicts the self-centered but understandably normal human behavior of prioritizing one’s own interests before others’ interests. However, when one appeals to the goodness of one’s Buberian Thou counterparts, he or she stimulates positive emotions that promote understanding. Third, VBM provides a framework that draws upon an individual’s personal values (religious or otherwise) and reconfigures the distributive-bargaining-and-integrative-negotiation distinction so that negotiators can freely apply distributive tactics to claim maximum intangible and tangible outcomes without compromising on their personal values or valuable relationships.

  18. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
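    The envisaged "review as a computer program" can be sketched as a pipeline of task functions, one per systematic-review step (retrieval, appraisal, synthesis). All trial data, function names, and the fixed-effect inverse-variance pooling below are hypothetical illustrations, not any surveyed system's actual design.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Trial:
    title: str
    effect: float        # estimated treatment effect
    variance: float      # within-study variance of the estimate
    high_risk_of_bias: bool = False

def retrieve(query: str) -> List[Trial]:
    # Stand-in for automated search; a real system would query trial registries.
    corpus = [
        Trial("Drug A vs placebo", effect=0.30, variance=0.04),
        Trial("Drug A dose-ranging", effect=0.22, variance=0.09),
        Trial("Unblinded Drug A pilot", effect=0.55, variance=0.25,
              high_risk_of_bias=True),
    ]
    return [t for t in corpus if query.lower() in t.title.lower()]

def appraise(trials: List[Trial]) -> List[Trial]:
    # Risk-of-bias assessment as a filter step.
    return [t for t in trials if not t.high_risk_of_bias]

def meta_analyze(trials: List[Trial]) -> float:
    # Fixed-effect inverse-variance pooled estimate.
    weights = [1.0 / t.variance for t in trials]
    return sum(w * t.effect for w, t in zip(weights, trials)) / sum(weights)

# The review expressed as composed tasks: retrieval -> appraisal -> synthesis.
pooled = meta_analyze(appraise(retrieve("Drug A")))
```

    Composing the steps as plain functions is the point: swapping any stage for a more capable automated component (e.g. a machine-learned screening step) changes one function, not the workflow.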

  19. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Munoz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesus; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaro, Miquel; Perez-Neira, Ana; Casellas, Ramon; Martinez, Ricardo; Nunez-Martinez, Jose; Requena Esteso, Manuel; Pubill, David; Font-Bach, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  20. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.