WorldWideScience

Sample records for automated optimal coordination

  1. Optimal Control and Coordination of Connected and Automated Vehicles at Urban Traffic Intersections

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yue J. [Boston University]; Malikopoulos, Andreas [ORNL]; Cassandras, Christos G. [Boston University]

    2016-01-01

    We address the problem of coordinating online a continuous flow of connected and automated vehicles (CAVs) crossing two adjacent intersections in an urban area. We present a decentralized optimal control framework whose solution yields for each vehicle the optimal acceleration/deceleration at any time in the sense of minimizing fuel consumption. The solution, when it exists, allows the vehicles to cross the intersections without the use of traffic lights, without creating congestion on the connecting road, and under the hard safety constraint of collision avoidance. The effectiveness of the proposed solution is validated through simulation considering two intersections located in downtown Boston, and it is shown that coordination of CAVs can significantly reduce both fuel consumption and travel time.
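
    The record above does not reproduce the control formulation. As a hedged sketch only, a decentralized framework of this kind typically minimizes an acceleration-energy proxy for fuel for each vehicle over its time in the control zone; the exact cost and constraint set used by the authors may differ.

    ```latex
    % Hedged sketch of an energy-proxy (acceleration-squared) formulation for one
    % CAV i inside the control zone during [t_i^0, t_i^f]; illustrative only.
    \min_{u_i(\cdot)} \; \frac{1}{2}\int_{t_i^{0}}^{t_i^{f}} u_i^{2}(t)\,\mathrm{d}t
    \qquad \text{s.t.}\quad
    \dot p_i = v_i,\;\; \dot v_i = u_i,\;\;
    u_{\min} \le u_i(t) \le u_{\max},\;\;
    0 < v_{\min} \le v_i(t) \le v_{\max},
    % plus rear-end and lateral (crossing-conflict) safety constraints coupling
    % vehicle i to preceding vehicles and to vehicles from conflicting flows.
    ```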

  2. Optimized coordinates for anharmonic vibrational structure theories.

    Science.gov (United States)

    Yagi, Kiyoshi; Keçeli, Murat; Hirata, So

    2012-11-28

    A procedure to determine optimal vibrational coordinates is developed on the basis of an earlier idea of Thompson and Truhlar [J. Chem. Phys. 77, 3031 (1982)]. For a given molecule, these coordinates are defined as the unitary transform of the normal coordinates that minimizes the energy of the vibrational self-consistent-field (VSCF) method for the ground state. They are justified by the fact that VSCF in these coordinates becomes exact in two limiting cases: harmonic oscillators, where the optimized coordinates are normal, and noninteracting anharmonic oscillators, in which the optimized coordinates are localized on individual oscillators. A robust and general optimization algorithm is developed, which decomposes the transformation matrix into a product of Jacobi matrices, determines the rotation angle of each Jacobi matrix that minimizes the energy, and iterates the process until a minimum in the whole high dimension is reached. It is shown that the optimized coordinates are neither entirely localized nor entirely delocalized (or normal) in any of the molecules (the water, water dimer, and ethylene molecules) examined (apart from the aforementioned limiting cases). Rather, high-frequency stretching modes tend to be localized, whereas low-frequency skeletal vibrations remain normal. On the basis of these coordinates, we introduce two new vibrational structure methods: optimized-coordinate VSCF (oc-VSCF) and optimized-coordinate vibrational configuration interaction (oc-VCI). For the modes that become localized, oc-VSCF is found to outperform VSCF, whereas, for both classes of modes, oc-VCI exhibits much more rapid convergence than VCI with respect to the rank of excitations. We propose a rational configuration selection for oc-VCI when the optimized coordinates are localized. The use of the optimized coordinates in VCI with this configuration selection scheme reduces the mean absolute errors in the frequencies of the fundamentals and the first overtones
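
    A compact numerical sketch of the Jacobi-sweep idea described above: the orthogonal transform is decomposed into 2x2 rotations whose angles are minimized one at a time until the whole objective converges. The objective used here is a generic stand-in (an off-diagonal matrix norm), not an actual VSCF energy.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def jacobi_sweep_minimize(objective, dim, n_sweeps=20, tol=1e-10):
        """Minimize objective(U) over orthogonal U by successive 2x2 Jacobi rotations.

        `objective` is a placeholder for, e.g., a VSCF ground-state energy evaluated
        in the rotated coordinates; here it is just any smooth function of U."""
        U = np.eye(dim)
        prev = objective(U)
        for _ in range(n_sweeps):
            for i in range(dim):
                for j in range(i + 1, dim):
                    def rotated(theta):
                        G = np.eye(dim)
                        c, s = np.cos(theta), np.sin(theta)
                        G[i, i] = G[j, j] = c
                        G[i, j], G[j, i] = -s, s
                        return objective(U @ G)
                    res = minimize_scalar(rotated, bounds=(-np.pi / 4, np.pi / 4),
                                          method="bounded")
                    G = np.eye(dim)
                    c, s = np.cos(res.x), np.sin(res.x)
                    G[i, i] = G[j, j] = c
                    G[i, j], G[j, i] = -s, s
                    U = U @ G
            cur = objective(U)
            if prev - cur < tol:
                break
            prev = cur
        return U

    # Tiny demo: minimizing the squared off-diagonal weight of U^T A U recovers the
    # classic Jacobi diagonalization; it stands in for minimizing a VSCF energy.
    A = np.array([[2.0, 0.8, 0.1], [0.8, 1.0, 0.3], [0.1, 0.3, 3.0]])
    off = lambda U: np.sum((U.T @ A @ U - np.diag(np.diag(U.T @ A @ U))) ** 2)
    U = jacobi_sweep_minimize(off, dim=3)
    print(np.round(U.T @ A @ U, 4))
    ```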

  3. Hybrid Optimized and Localized Vibrational Coordinates.

    Science.gov (United States)

    Klinting, Emil Lund; König, Carolin; Christiansen, Ove

    2015-11-01

    We present a new type of vibrational coordinates denoted hybrid optimized and localized coordinates (HOLCs) aiming at a good set of rectilinear vibrational coordinates supporting fast convergence in vibrational structure calculations. The HOLCs are obtained as a compromise between the recently promoted optimized coordinates (OCs) and localized coordinates (LCs). The three sets of coordinates are generally different from each other and differ from standard normal coordinates (NCs) as well. In determining the HOLCs, we optimize the vibrational self-consistent field (VSCF) energy with respect to orthogonal transformation of the coordinates, which is similar to determining OCs but for HOLCs we additionally introduce a penalty for delocalization, by using a measure of localization similar to that employed in determining LCs. The same theory and implementation covers OCs, LCs, and HOLCs. It is shown that varying one penalty parameter allows for connecting OCs and LCs. The HOLCs are compared to NCs, OCs, and LCs in their nature and performance as basis for vibrational coupled cluster (VCC) response calculations of vibrational anharmonic energies for a small set of simple systems comprising water, formaldehyde, and ethylene. It is found that surprisingly good results can be obtained with HOLCs by using potential energy surfaces as simple as quadratic Taylor expansions. Quite similar coordinates are found for the already established OCs but obtaining these OCs requires much more elaborate and expensive potential energy surfaces and localization is generally not guaranteed. The ability to compute HOLCs for somewhat larger systems is demonstrated for coumarin and the alanine quadramer. The good agreement between HOLCs and OCs, together with the much easier applicability of HOLCs for larger systems, suggests that HOLCs may be a pragmatically very interesting option for anharmonic calculations on medium to large molecular systems.

  4. Optimal Coordination of Automatic Line Switches for Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jyh-Cherng Gu

    2012-04-01

    For the Taiwan Power Company (Taipower), the margins of coordination times between the lateral circuit breakers (LCB) of underground 4-way automatic line switches and the protection equipment of high voltage customers are often too small. This could lead to sympathy tripping by the feeder circuit breaker (FCB) of the distribution feeder and create difficulties in protection coordination between upstream and downstream protection equipment, identification of faults, and restoration operations. In order to solve the problem, it is necessary to reexamine the protection coordination between LCBs and high voltage customers’ protection equipment, and between LCBs and FCBs, in order to bring forth new proposals for settings and operations. This paper applies linear programming to optimize the protection coordination of protection devices, and proposes new time current curves (TCCs) for the overcurrent (CO) and low-energy overcurrent (LCO) relays used in normally open distribution systems by performing simulations in the Electrical Transient Analyzer Program (ETAP) environment. The simulation results show that the new TCCs solve the coordination problems among the high voltage customer, lateral, feeder, bus-interconnection, and distribution transformer protection devices. The new proposals also satisfy the requirements of Taipower on protection coordination of the distribution feeder automation system (DFAS). Finally, the authors believe that the system configuration, operation experience, and relevant criteria mentioned in this paper may serve as valuable references for other companies or utilities when building DFAS of their own.
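
    A minimal sketch of the linear-programming form such relay-coordination problems commonly take, with relay operating time modeled as linear in the time-dial setting; the coefficients, bounds, and primary/backup pairings below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Operating time of an inverse-time overcurrent relay is modeled as
    # t_i = a_i * TDS_i, where a_i lumps the curve constants at the relevant
    # fault current, so the problem is linear in the time-dial settings TDS_i.
    a = np.array([0.5, 0.6, 0.8])    # seconds per unit TDS for relays R1..R3 (illustrative)
    CTI = 0.3                        # required coordination time interval (s)

    # Minimize the sum of primary operating times.
    c = a.copy()

    # Coordination constraints: each backup must operate at least CTI later than
    # its primary, e.g. R2 backs up R1 and R3 backs up R2:
    #   a1*TDS1 - a2*TDS2 <= -CTI,   a2*TDS2 - a3*TDS3 <= -CTI
    A_ub = np.array([
        [a[0], -a[1],  0.0],
        [0.0,   a[1], -a[2]],
    ])
    b_ub = np.array([-CTI, -CTI])

    bounds = [(0.05, 1.0)] * 3       # admissible TDS range (illustrative)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print(np.round(res.x, 3))        # optimized time-dial settings
    ```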

  5. Automated selection of LEDs by luminance and chromaticity coordinate

    CERN Document Server

    Fischer, Ulrich H P; Reinboth, Christian

    2010-01-01

    The increased use of LEDs for lighting purposes has led to the development of numerous applications requiring a pre-selection of LEDs by their luminance and / or their chromaticity coordinate. This paper demonstrates how a manual pre-selection process can be realized using a relatively simple configuration. Since a manual selection service can only be commercially viable as long as only small quantities of LEDs need to be sorted, an automated solution suggests itself. This paper introduces such a solution, which has been developed by Harzoptics in close cooperation with Rundfunk Gernrode. The paper also discusses current challenges in measurement technology as well as market trends.

  6. Optimal coordinated voltage control of power systems

    Institute of Scientific and Technical Information of China (English)

    LI Yan-jun; HILL David J.; WU Tie-jun

    2006-01-01

    An immune algorithm solution is proposed in this paper to deal with the problem of optimal coordination of local physically based controllers in order to preserve mid- and long-term voltage stability. This problem is in fact a global coordination control problem which involves not only sequencing and timing different control devices but also tuning the parameters of controllers. A multi-stage coordinated control scheme is presented, aiming at retaining good voltage levels with minimal control efforts and costs after severe disturbances in power systems. A self-pattern-recognized vaccination procedure is developed to transfer effective heuristic information into the new generation of solution candidates to speed up the convergence of the search procedure to global optima. A four-bus power system case study is investigated to show the effectiveness and efficiency of the proposed algorithm, compared with several existing approaches such as differential dynamic programming and tree search.

  7. Optimizing Vibrational Coordinates To Modulate Intermode Coupling.

    Science.gov (United States)

    Zimmerman, Paul M; Smereka, Peter

    2016-04-12

    The choice of coordinate system strongly affects the convergence properties of vibrational structure computations. Two methods for efficient generation of improved vibrational coordinates are presented and justified by analysis of a model anharmonic two-mode Hessian and numerical computations on polyatomic molecules. To produce optimal coordinates, metrics which quantify off-diagonal couplings over a grid of Hessian matrices are minimized through unitary rotations of the vibrational basis. The first proposed metric minimizes the total squared off-diagonal coupling, and the second minimizes the total squared change in off-diagonal coupling. In this procedure certain anharmonic modes tend to localize, for example X-H stretches. The proposed methods do not rely on prior fitting of the potential energy, vibrational structure computations, or localization metrics, so they are distinct from previous vibrational coordinate generation algorithms and are generally applicable to polyatomic molecules. Fitting the potential to the approximate n-mode representation in the optimized bases for all-trans polyenes shows that off-diagonal anharmonic couplings are substantially reduced by the new choices of coordinate system. Convergence of vibrational energies is examined in detail for ethylene, and it is shown that coupling-optimized modes converge in vibrational configuration interaction computations to within 1 cm^-1 using only 3-mode couplings, where normal modes require 4-mode couplings for convergence. Comparison of the vibrational configuration interaction convergence with respect to excitation level for the two proposed metrics shows that minimization of the total off-diagonal coupling is most effective for low-cost vibrational structure computations.

  8. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  9. Modeling, Instrumentation, Automation, and Optimization of Water Resource Recovery Facilities.

    Science.gov (United States)

    Sweeney, Michael W; Kabouris, John C

    2016-10-01

    A review of the literature published in 2015 on topics relating to water resource recovery facilities (WRRF) in the areas of modeling, automation, measurement and sensors and optimization of wastewater treatment (or water resource reclamation) is presented.

  11. Optimizing a Drone Network to Deliver Automated External Defibrillators.

    Science.gov (United States)

    Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y

    2017-03-02

    Background: Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest (OHCA), but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an OHCA for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. Methods: We applied our model to 53,702 OHCAs that occurred in the eight regions of the Toronto Regional RescuNET between January 1st 2006 and December 31st 2014. Our primary analysis quantified the drone network size required to deliver an AED one, two, or three minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as one large coordinated region. Results: The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by three minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. Conclusions: An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an OHCA event.
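
    The authors' model combines optimization and queuing; as a loose, illustrative stand-in for the base-location side of the problem only, a greedy coverage heuristic over historical OHCA locations might look like this (all names and numbers are synthetic).

    ```python
    import numpy as np

    def greedy_base_selection(ohca_xy, candidate_xy, radius, n_bases):
        """Greedily pick drone base sites that cover the most historical OHCA
        locations within `radius`. Illustrative only; the paper's model also
        handles queuing and response-time targets, which are omitted here."""
        d = np.linalg.norm(ohca_xy[:, None, :] - candidate_xy[None, :, :], axis=2)
        covered = np.zeros(len(ohca_xy), dtype=bool)
        chosen = []
        for _ in range(n_bases):
            gains = ((d <= radius) & ~covered[:, None]).sum(axis=0)
            best = int(np.argmax(gains))
            if gains[best] == 0:
                break
            chosen.append(best)
            covered |= d[:, best] <= radius
        return chosen, covered.mean()

    # Example with synthetic points (coordinates in km)
    rng = np.random.default_rng(0)
    ohca = rng.uniform(0, 50, size=(500, 2))
    sites = rng.uniform(0, 50, size=(40, 2))
    bases, coverage = greedy_base_selection(ohca, sites, radius=5.0, n_bases=10)
    print(bases, round(coverage, 2))
    ```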

  12. PARAMETER COORDINATION AND ROBUST OPTIMIZATION FOR MULTIDISCIPLINARY DESIGN

    Institute of Scientific and Technical Information of China (English)

    HU Jie; PENG Yinghong; XIONG Guangleng

    2006-01-01

    A new parameter coordination and robust optimization approach for multidisciplinary design is presented. Firstly, the constraints network model is established to support engineering change, coordination and optimization. In this model, interval boxes are adopted to describe the uncertainty of design parameters quantitatively to enhance the design robustness. Secondly, the parameter coordination method is presented to solve the constraints network model, monitor the potential conflicts due to engineering changes, and obtain the consistency solution space corresponding to the given product specifications. Finally, the robust parameter optimization model is established, and a genetic algorithm is used to obtain the robust optimal parameters. A bogie design example is analyzed to show that the scheme is effective.

  13. Optimization based automated curation of metabolic reconstructions

    Directory of Open Access Journals (Sweden)

    Maranas Costas D

    2007-06-01

    Background: Currently, there exist tens of different microbial and eukaryotic metabolic reconstructions (e.g., Escherichia coli, Saccharomyces cerevisiae, Bacillus subtilis) with many more under development. All of these reconstructions are inherently incomplete, with some functionalities missing due to the lack of experimental and/or homology information. A key challenge in the automated generation of genome-scale reconstructions is the elucidation of these gaps and the subsequent generation of hypotheses to bridge them. Results: In this work, an optimization based procedure is proposed to identify and eliminate network gaps in these reconstructions. First we identify the metabolites in the metabolic network reconstruction which cannot be produced under any uptake conditions, and subsequently we identify the reactions from a customized multi-organism database that restore the connectivity of these metabolites to the parent network. This connectivity restoration is hypothesized to take place through four mechanisms: (a) reversing the directionality of one or more reactions in the existing model, (b) adding a reaction from another organism to provide functionality absent in the existing model, (c) adding external transport mechanisms to allow for importation of metabolites into the existing model, and (d) restoring flow by adding intracellular transport reactions in multi-compartment models. We demonstrate this procedure for the genome-scale reconstructions of Escherichia coli and Saccharomyces cerevisiae, wherein compartmentalization of intra-cellular reactions results in a more complex topology of the metabolic network. We determine that about 10% of metabolites in E. coli and 30% of metabolites in S. cerevisiae cannot carry any flux. Interestingly, the dominant flow restoration mechanism is directionality reversals of existing reactions in the respective models. Conclusion: We have proposed systematic methods to identify and
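
    A simplified, flux-based stand-in for the gap-finding step described above (the authors' actual formulation over a multi-organism database is richer): for each metabolite, check whether any feasible flux distribution can produce it.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def blocked_metabolites(S, lb, ub, eps=1e-6):
        """Flag metabolites that cannot be produced under any flux distribution.

        For each metabolite i, maximize its net production S[i,:] @ v subject to
        steady-state balance of all other metabolites (S v = 0 row-wise) and flux
        bounds lb <= v <= ub; if the maximum stays below eps, the metabolite is
        flagged. Simplified illustration, not the paper's exact formulation."""
        m, n = S.shape
        blocked = []
        for i in range(m):
            A_eq = np.delete(S, i, axis=0)          # balance every other metabolite
            b_eq = np.zeros(m - 1)
            res = linprog(-S[i, :], A_eq=A_eq, b_eq=b_eq,
                          bounds=list(zip(lb, ub)))  # minimize -production
            if not res.success or -res.fun < eps:
                blocked.append(i)
        return blocked

    # Tiny toy network: uptake -> A, A -> B, B -> C (columns are reactions).
    S = np.array([[ 1, -1,  0],    # A
                  [ 0,  1, -1],    # B
                  [ 0,  0,  1]])   # C
    lb = [0, 0, 0]; ub = [10, 10, 10]
    print(blocked_metabolites(S, lb, ub))   # [] -> every metabolite can carry flux
    ```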

  14. Agent Technology Application in Automating the Coordination and Decision-Making in Supply Chain

    Institute of Scientific and Technical Information of China (English)

    JIE Hui; JI Jian-hua

    2005-01-01

    Coordinating all the activities among all the parties involved in a supply chain can be a daunting task. This paper puts forth the viewpoint of applying agent technology to automate the coordination and decision-making tasks in a typical home PC industry supply chain. The main features of the proposed approach, which differentiate it from existing approaches, address the processes and issues faced by parties in the supply chain. A prototype and the overall process flow are also described.

  15. Kinematically optimal robot placement for minimum time coordinated motion

    Energy Technology Data Exchange (ETDEWEB)

    Feddema, J.T.

    1995-10-01

    This paper describes an algorithm for determining the optimal placement of a robotic manipulator within a workcell for minimum time coordinated motion. The algorithm uses a simple principle of coordinated motion to estimate the time of a joint interpolated motion. Specifically, the coordinated motion profile is limited by the slowest axis. Two and six degree of freedom (DOF) examples are presented. In experimental tests on a FANUC S-800 arm, the optimal placement of the robot can improve cycle time of a robotic operation by as much as 25%. In high volume processes where the robot motion is currently the limiting factor, this increased throughput can result in substantial cost savings.
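
    A toy sketch of the placement idea: the time of each joint-interpolated move is set by the slowest axis, the cycle time sums these over the task tour, and the base position is chosen to minimize it. The planar 2-DOF inverse kinematics, speed limits, and task points below are stand-ins, not the FANUC S-800 setup.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    VMAX = np.array([2.0, 3.0])          # joint speed limits (rad/s), illustrative
    L1, L2 = 0.5, 0.4                    # link lengths (m), illustrative

    def ik_planar(p, base):
        """Elbow-up inverse kinematics of a planar 2-DOF arm; the clip keeps the
        objective defined even when a point drifts out of reach during the search."""
        x, y = p - base
        r2 = x * x + y * y
        c2 = np.clip((r2 - L1**2 - L2**2) / (2 * L1 * L2), -1.0, 1.0)
        q2 = np.arccos(c2)
        q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
        return np.array([q1, q2])

    def cycle_time(base, task_points):
        """Sum over moves of the slowest-axis time for joint-interpolated motion."""
        qs = [ik_planar(p, base) for p in task_points]
        return sum(np.max(np.abs(qb - qa) / VMAX) for qa, qb in zip(qs, qs[1:]))

    task = [np.array(p) for p in [(0.6, 0.2), (0.4, 0.5), (0.7, 0.4), (0.6, 0.2)]]
    res = minimize(lambda b: cycle_time(b, task), x0=np.array([0.0, 0.0]),
                   method="Nelder-Mead")
    print(np.round(res.x, 3), round(res.fun, 3))   # optimized base position, cycle time
    ```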

  16. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

    This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterprise networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state-of-the-art of managing firewalls systematically in both research and application domains. Chapters explore set-theory, managing firewall configuration globally and consistently, access control list with encryption, and authentication such as IPSec policies. The author

  17. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    Science.gov (United States)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  18. Optimization of Aimpoints for Coordinate Seeking Weapons

    Science.gov (United States)

    2015-09-01

    A MATLAB program is used as the coding tool for the development of this algorithm and the optimization process. The program works by first taking in the number of weapons used and arranging them in a fixed uniform spacing on a circle centered on the assumed target location. Then, the weapon
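
    A minimal sketch of the initial aimpoint arrangement described above; the weapon count, circle radius, and target location are illustrative inputs.

    ```python
    import numpy as np

    def aimpoints_on_circle(n_weapons, radius, center=(0.0, 0.0)):
        """Place n aimpoints with fixed uniform angular spacing on a circle
        centered on the assumed target location. The radius would itself be a
        decision variable in the optimization; here it is just an input."""
        angles = 2 * np.pi * np.arange(n_weapons) / n_weapons
        cx, cy = center
        return np.column_stack((cx + radius * np.cos(angles),
                                cy + radius * np.sin(angles)))

    print(aimpoints_on_circle(4, radius=30.0))   # 4 aimpoints on a 30 m circle (illustrative)
    ```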

  19. Distributed optimal coordination for distributed energy resources in power systems

    DEFF Research Database (Denmark)

    Wu, Di; Yang, Tao; Stoorvogel, A.

    2017-01-01

    Driven by smart grid technologies, distributed energy resources (DERs) have been rapidly developing in recent years for improving reliability and efficiency of distribution systems. Emerging DERs require effective and efficient coordination in order to reap their potential benefits. In this paper, we consider an optimal DER coordination problem over multiple time periods subject to constraints at both system and device levels. Fully distributed algorithms are proposed to dynamically and automatically coordinate distributed generators with multiple/single storages. With the proposed algorithms, the coordination agent at each DER maintains only a set of variables and updates them through information exchange with a few neighbors. We show that the proposed algorithms with properly chosen parameters solve the DER coordination problem as long as the underlying communication network is connected.

  20. Coordination and Emergence in the Cellular Automated Fashion Game

    CERN Document Server

    Cao, Zhigang; Qu, Xinglong; Yang, Mingmin; Yang, Xiaoguang

    2012-01-01

    We investigate a heterogeneous cellular automaton, where there are two types of agents, conformists and rebels. Each agent has to choose between two actions, 0 and 1. A conformist likes to choose an action that most of her neighbors choose, while in contrast a rebel wants to be different from most of her neighbors. Theoretically, this model is equivalent to the matching pennies game on regular networks. We study the dynamical process by assuming that each agent takes a myopic updating rule. A uniform updating probability is also introduced for each agent to study the whole spectrum from synchronous updating to asynchronous updating. Our model characterizes the phenomenon of fashion very well and has great potential in the study of finance and stock markets. A large number of simulations show that in most cases agents can reach an extraordinarily high degree of coordination. This process is also quite fast and steady. Considering that these dynamics are really simple, agents are selfish, myopic, and have ve...

  1. Multi Objective Optimization of Coordinated Scheduling of Cranes and Vehicles at Container Terminals

    Directory of Open Access Journals (Sweden)

    Seyed Mahdi Homayouni

    2013-01-01

    According to previous research, automated guided vehicles and quay cranes in container terminals have a high potential synergy. In this paper, a mixed integer programming model is formulated to optimize the coordinated scheduling of cranes and vehicles in container terminals. Objectives of the model are to minimize total traveling time of the vehicles and delays in tasks of cranes. A genetic algorithm is developed to solve the problem in reasonable computational time. The most appropriate control parameters for the proposed genetic algorithm are investigated in a medium size numerical test case. It is shown that balanced crossover and mutation rates have the best performance in finding a near-optimal solution for the problem. Then, ten small size test cases are solved to evaluate the performance of the proposed optimization methods. The results show the applicability of the genetic algorithm, since it can find near-optimal solutions precisely and accurately.

  2. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  3. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  4. Weighted Constraint Satisfaction for Smart Home Automation and Optimization

    Directory of Open Access Journals (Sweden)

    Noel Nuo Wi Tay

    2016-01-01

    Automation of the smart home binds together services of hardware and software to provide support for its human inhabitants. The rise of web technologies offers applicable concepts and technologies for service composition that can be exploited for automated planning of the smart home, which can be further enhanced by implementation based on service oriented architecture (SOA). SOA supports loose coupling and late binding of devices, enabling a more declarative approach in defining services and simplifying home configurations. One such declarative approach is to represent and solve automated planning through the constraint satisfaction problem (CSP), which has the advantage of handling larger domains of home states. But CSP uses hard constraints and thus cannot perform optimization or handle contradictory goals and partial goal fulfillment, which are practical issues smart environments will face if humans are involved. This paper extends this approach to the Weighted Constraint Satisfaction Problem (WCSP). Branch and bound depth first search is used, where its lower bound is estimated by a bacterial memetic algorithm (BMA) on a relaxed version of the original optimization problem. Experiments up to 16-step planning of home services demonstrate the applicability and practicality of the approach, with the inclusion of local search for trivial service combinations in BMA producing performance enhancements. In addition, this work aims to lay the groundwork for further research in the field.

  5. Automated parameterization of intermolecular pair potentials using global optimization techniques

    Science.gov (United States)

    Krämer, Andreas; Hülsmann, Marco; Köddermann, Thorsten; Reith, Dirk

    2014-12-01

    In this work, different global optimization techniques are assessed for the automated development of molecular force fields, as used in molecular dynamics and Monte Carlo simulations. The quest of finding suitable force field parameters is treated as a mathematical minimization problem. Intricate problem characteristics such as extremely costly and even abortive simulations, noisy simulation results, and especially multiple local minima naturally lead to the use of sophisticated global optimization algorithms. Five diverse algorithms (pure random search, recursive random search, CMA-ES, differential evolution, and taboo search) are compared to our own tailor-made solution named CoSMoS. CoSMoS is an automated workflow. It models the parameters' influence on the simulation observables to detect a globally optimal set of parameters. It is shown how and why this approach is superior to other algorithms. Applied to suitable test functions and simulations for phosgene, CoSMoS effectively reduces the number of required simulations and real time for the optimization task.
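
    A schematic of the optimization loop being compared, assuming a weighted squared-deviation loss between simulated and target observables; the "simulator" below is a trivial algebraic stand-in for the expensive molecular simulations, so only the workflow, not the physics, is illustrated.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Illustrative targets and weights for two liquid-phase observables.
    target = {"density": 1370.0, "enthalpy_vap": 45.0}
    weights = {k: 1.0 / v**2 for k, v in target.items()}

    def simulate(params):
        """Cheap algebraic stand-in for a real simulation of the observables."""
        sigma, epsilon = params
        return {"density": 1000.0 + 1000.0 * epsilon,
                "enthalpy_vap": 60.0 * sigma + 60.0 * epsilon}

    def loss(params):
        sim = simulate(params)
        return sum(weights[k] * (sim[k] - target[k]) ** 2 for k in target)

    # Parameter box: sigma (nm) and epsilon (kJ/mol), values purely illustrative.
    bounds = [(0.30, 0.45), (0.20, 0.80)]
    result = differential_evolution(loss, bounds, seed=1, maxiter=50)
    print(np.round(result.x, 3), result.fun)
    ```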

  6. A coordinate-wise optimization algorithm for the Fused Lasso

    OpenAIRE

    Höfling, Holger; Binder, Harald; Schumacher, Martin

    2010-01-01

    L1-penalized regression methods such as the Lasso (Tibshirani 1996) that achieve both variable selection and shrinkage have been very popular. An extension of this method is the Fused Lasso (Tibshirani and Wang 2007), which allows for the incorporation of external information into the model. In this article, we develop new and fast algorithms for solving the Fused Lasso which are based on coordinate-wise optimization. This class of algorithms has recently been applied very successfully to so...
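
    For orientation, a sketch of coordinate-wise descent for the plain Lasso; the Fused Lasso adds a difference penalty that requires the additional machinery developed in the article.

    ```python
    import numpy as np

    def soft_threshold(z, lam):
        return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

    def lasso_cd(X, y, lam, n_iter=200):
        """Coordinate-wise descent minimizing (1/2n)||y - X b||^2 + lam * ||b||_1."""
        n, p = X.shape
        b = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0) / n
        r = y - X @ b
        for _ in range(n_iter):
            for j in range(p):
                r += X[:, j] * b[j]                 # remove j's contribution
                rho = X[:, j] @ r / n
                b[j] = soft_threshold(rho, lam) / col_sq[j]
                r -= X[:, j] * b[j]                 # add back the updated contribution
        return b

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    beta_true = np.array([2.0, -1.5] + [0.0] * 8)
    y = X @ beta_true + 0.1 * rng.normal(size=100)
    print(np.round(lasso_cd(X, y, lam=0.1), 2))
    ```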

  7. Optimal design of coordination control strategy for distributed generation system

    Institute of Scientific and Technical Information of China (English)

    WANG Ai-hua; Norapon Kanjanapadit

    2009-01-01

    This paper presents a novel design procedure for optimizing the power distribution strategy in a distributed generation system. A coordinating controller, responsible for distributing the total load power request among multiple DG units, is suggested based on the concept of a hierarchical control structure in the dynamic system. The optimal control problem was formulated as a nonlinear optimization problem subject to a set of constraints. The resulting problem was solved using the Kuhn-Tucker method. Computer simulation results demonstrate that the proposed method can provide better efficiency in terms of reducing total costs compared to existing methods. In addition, the proposed optimal load distribution strategy can be easily implemented in real time thanks to the simplicity of the closed-form solutions.
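
    The record does not state the authors' cost model; as a textbook illustration of the kind of Kuhn-Tucker (KKT) analysis involved, consider distributing a total demand among units with convex costs.

    ```latex
    % Textbook illustration only: distribute total demand P_D among n units with
    % convex costs C_i(P_i), subject to P_i^min <= P_i <= P_i^max.
    \begin{align}
      \mathcal{L} &= \sum_{i=1}^{n} C_i(P_i)
                     - \lambda\Big(\sum_{i=1}^{n} P_i - P_D\Big)
                     + \sum_{i=1}^{n}\underline{\mu}_i\,(P_i^{\min}-P_i)
                     + \sum_{i=1}^{n}\overline{\mu}_i\,(P_i-P_i^{\max}), \\
      \frac{\partial \mathcal{L}}{\partial P_i} &=
          \frac{dC_i}{dP_i} - \lambda - \underline{\mu}_i + \overline{\mu}_i = 0,
          \qquad \underline{\mu}_i,\ \overline{\mu}_i \ge 0 .
    \end{align}
    % For units strictly inside their limits the multipliers vanish, giving the
    % equal-incremental-cost rule dC_i/dP_i = lambda; with quadratic C_i this
    % yields closed-form load shares of the type the abstract alludes to.
    ```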

  8. Optimal Protection Coordination for Microgrid under Different Operating Modes

    Directory of Open Access Journals (Sweden)

    Ming-Ta Yang

    2013-01-01

    Significant consequences result when a microgrid is connected to a distribution system. This study discusses the impacts of bolted three-phase faults and bolted single line-to-ground faults on the protection coordination of a distribution system connected to a microgrid which operates in utility-only mode or in grid-connected mode. Power system simulation software is used to build the test system. The linear programming method is applied to optimize the coordination of relays, and relay coordination simulation software is used to verify whether the coordination time intervals (CTIs) of the primary/backup relay pairs are adequate. In addition, this study also proposes a relay protection coordination strategy for when the microgrid operates in islanding mode during a utility power outage. Because conventional CO/LCO relays are not capable of detecting high impedance faults, an intelligent electronic device (IED) combined with a wavelet transform and a neural network is proposed to accurately detect high impedance faults and identify the fault phase.

  9. Geometry optimization made simple with translation and rotation coordinates

    Science.gov (United States)

    Wang, Lee-Ping; Song, Chenchen

    2016-06-01

    The effective description of molecular geometry is important for theoretical studies of intermolecular interactions. Here we introduce a new translation-rotation-internal coordinate (TRIC) system which explicitly includes the collective translations and rotations of molecules, or parts of molecules such as monomers or ligands, as degrees of freedom. The translations are described as the centroid position and the orientations are represented with the exponential map parameterization of quaternions. When TRIC is incorporated into geometry optimization calculations, the performance is consistently superior to existing coordinate systems for a diverse set of systems including water clusters, organic semiconductor donor-acceptor complexes, and small proteins, all of which are characterized by nontrivial intermolecular interactions. The method also introduces a new way to scan the molecular orientations while allowing orthogonal degrees of freedom to relax. Our findings indicate that an explicit description of molecular translation and rotation is a natural way to traverse the many-dimensional potential energy surface.
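
    A small sketch of the two collective coordinates named above, with an assumed quaternion exponential-map convention; the actual TRIC implementation may differ in details such as weighting and normalization.

    ```python
    import numpy as np

    def quaternion_exp(w):
        """Exponential map from a 3-vector to a unit quaternion, assumed here as
        q = [cos(|w|), sin(|w|) * w/|w|], so a rotation by angle theta about unit
        axis n corresponds to w = (theta/2) * n; w = 0 gives the identity."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.array([1.0, 0.0, 0.0, 0.0])
        return np.concatenate(([np.cos(theta)], np.sin(theta) * np.asarray(w) / theta))

    def centroid(coords):
        """Collective translation coordinate of a molecular fragment: its centroid."""
        return np.asarray(coords).mean(axis=0)

    # Water-like fragment (angstroms, illustrative)
    frag = np.array([[0.000,  0.000,  0.117],
                     [0.000,  0.757, -0.469],
                     [0.000, -0.757, -0.469]])
    print(centroid(frag))
    print(quaternion_exp([0.0, 0.0, np.pi / 4]))   # 90-degree rotation about z
    ```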

  10. Automated assay optimization with integrated statistics and smart robotics.

    Science.gov (United States)

    Taylor, P B; Stewart, F P; Dunnington, D J; Quinn, S T; Schulz, C K; Vaidya, K S; Kurali, E; Lane, T R; Xiong, W C; Sherrill, T P; Snider, J S; Terpstra, N D; Hertzberg, R P

    2000-08-01

    The transition from manual to robotic high throughput screening (HTS) in the last few years has made it feasible to screen hundreds of thousands of chemical entities against a biological target in less than a month. This rate of HTS has increased the visibility of bottlenecks, one of which is assay optimization. In many organizations, experimental methods are generated by therapeutic teams associated with specific targets and passed on to the HTS group. The resulting assays frequently need to be further optimized to withstand the rigors and time frames inherent in robotic handling. Issues such as protein aggregation, ligand instability, and cellular viability are common variables in the optimization process. The availability of robotics capable of performing rapid random access tasks has made it possible to design optimization experiments that would be either very difficult or impossible for a person to carry out. Our approach to reducing the assay optimization bottleneck has been to unify the highly specific fields of statistics, biochemistry, and robotics. The product of these endeavors is a process we have named automated assay optimization (AAO). This has enabled us to determine final optimized assay conditions, which are often a composite of variables that we would not have arrived at by examining each variable independently. We have applied this approach to both radioligand binding and enzymatic assays and have realized benefits in both time and performance that we would not have predicted a priori. The fully developed AAO process encompasses the ability to download information to a robot and have liquid handling methods automatically created. This evolution in smart robotics has proven to be an invaluable tool for maintaining high-quality data in the context of increasing HTS demands.

  11. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  12. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    Science.gov (United States)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  13. Self-Organization and Self-Coordination in Welding Automation with Collaborating Teams of Industrial Robots

    Directory of Open Access Journals (Sweden)

    Günther Starke

    2016-11-01

    In welding automation, growing interest can be recognized in applying teams of industrial robots to perform manufacturing processes through collaboration. Although robot teamwork can increase profitability and cost-effectiveness in production, the programming of the robots is still a problem. It is extremely time consuming and requires special expertise in synchronizing the activities of the robots to avoid any collision. Therefore, a research project has been initiated to solve those problems. This paper will present strategies, concepts, and research results in applying the robot operating system (ROS) and ROS-based solutions to overcome existing technical deficits through the integration of self-organization capabilities, autonomous path planning, and self-coordination of the robots’ work. The new approach should contribute to improving the application of robot teamwork and collaboration in the manufacturing sector at a higher level of flexibility and reduced need for human intervention.

  14. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)]; Kim, Jong Hyun [KEPCO, Ulsan (Korea, Republic of)]

    2014-08-15

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects like greater efficiency and fewer human errors, as well as a negative effect known as out-of-the-loop (OOTL). Thus, before introducing automation in the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. This approach is expected to yield an appropriate proportion of automation that avoids the OOTL problem while having maximum efficacy.

  15. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    Science.gov (United States)

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.

  16. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    Science.gov (United States)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and applying that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  17. Optimizing Patient-centered Communication and Multidisciplinary Care Coordination in Emergency Diagnostic Imaging: A Research Agenda.

    Science.gov (United States)

    Sabbatini, Amber K; Merck, Lisa H; Froemming, Adam T; Vaughan, William; Brown, Michael D; Hess, Erik P; Applegate, Kimberly E; Comfere, Nneka I

    2015-12-01

    Patient-centered emergency diagnostic imaging relies on efficient communication and multispecialty care coordination to ensure optimal imaging utilization. The construct of the emergency diagnostic imaging care coordination cycle with three main phases (pretest, test, and posttest) provides a useful framework to evaluate care coordination in patient-centered emergency diagnostic imaging. This article summarizes findings reached during the patient-centered outcomes session of the 2015 Academic Emergency Medicine consensus conference "Diagnostic Imaging in the Emergency Department: A Research Agenda to Optimize Utilization." The primary objective was to develop a research agenda focused on 1) defining component parts of the emergency diagnostic imaging care coordination process, 2) identifying gaps in communication that affect emergency diagnostic imaging, and 3) defining optimal methods of communication and multidisciplinary care coordination that ensure patient-centered emergency diagnostic imaging. Prioritized research questions provided the framework to define a research agenda for multidisciplinary care coordination in emergency diagnostic imaging.

  18. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

    This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy by minimizing the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy can clearly save procurement costs; that the strategy appears to be more cooperative when the quality requirement is less strict; and that the coordination costs have a strong effect on the coordinated procurement strategy.

  19. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    In recent years MEMS have seen very rapid development. Although many advances have been made, due to the multiphysics nature of MEMS, their design is still a difficult task carried out mainly by hand calculation. In order to help overcome such difficulties, attempts to automate MEMS design were carried out. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and the Process Level. The state of the art of automated MEMS design in each of these levels is investigated.

  20. Application of Advanced Particle Swarm Optimization Techniques to Wind-thermal Coordination

    DEFF Research Database (Denmark)

    Singh, Sri Niwas; Østergaard, Jacob; Yadagiri, J.

    2009-01-01

    A wind-thermal coordination algorithm is necessary to determine the optimal proportion of wind and thermal generator capacity that can be integrated into the system. In this paper, four versions of Particle Swarm Optimization (PSO) techniques are proposed for solving the wind-thermal coordination problem. A pseudo code based algorithm is suggested to deal with the equality constraints of the problem, accelerating the optimization process. The simulation results show that the proposed PSO methods are capable of obtaining higher quality solutions efficiently in wind-thermal coordination problems.

  1. Toward an Integrated Framework for Automated Development and Optimization of Online Advertising Campaigns

    OpenAIRE

    Thomaidou, Stamatina; Vazirgiannis, Michalis; Liakopoulos, Kyriakos

    2012-01-01

    Creating and monitoring competitive and cost-effective pay-per-click advertisement campaigns through the web-search channel is a resource demanding task in terms of expertise and effort. Assisting or even automating the work of an advertising specialist will have an unrivaled commercial value. In this paper we propose a methodology, an architecture, and a fully functional framework for semi- and fully automated creation, monitoring, and optimization of cost-efficient pay-per-click campaigns ...

  2. Lyapunov-based Low-thrust Optimal Orbit Transfer: An approach in Cartesian coordinates

    CERN Document Server

    Zhang, Hantian; Cao, Qingjie

    2014-01-01

    This paper presents a simple approach to low-thrust optimal-fuel and optimal-time transfer problems between two elliptic orbits using the Cartesian coordinate system. In this case, an orbit is described by its specific angular momentum and Laplace vectors with a free injection point. Trajectory optimization with the pseudospectral method and nonlinear programming is supported by the initial guess generated from the Chang-Chichka-Marsden Lyapunov-based transfer controller. This approach successfully solves several low-thrust optimal transfer problems. Numerical results show that the Lyapunov-based initial guess overcomes the difficulty in optimization caused by the strong oscillation of variables in the Cartesian coordinate system. Furthermore, a comparison of the results shows that obtaining the optimal transfer solution through the polynomial approximation by utilizing Cartesian coordinates is easier than using orbital elements, which normally produce strongly nonlinear equations of motion. In this paper, the Eart...

  3. A PLM-based automated inspection planning system for coordinate measuring machine

    Science.gov (United States)

    Zhao, Haibin; Wang, Junying; Wang, Boxiong; Wang, Jianmei; Chen, Huacheng

    2006-11-01

    With the rapid progress of Product Lifecycle Management (PLM) in the manufacturing industry, automatic generation of product inspection plans and their integration with other activities in the product lifecycle play important roles in quality control. However, the techniques for these purposes lag behind CAD/CAM techniques. Therefore, an automatic inspection planning system for the Coordinate Measuring Machine (CMM) was developed to improve the automation of measurement, based on the integration of the inspection system in PLM. Feature information representation is achieved based on a PLM center database; the measuring strategy is optimized through the integration of multiple sensors; a reasonable number and distribution of inspection points are calculated and designed with the guidance of statistical theory and a synthesis distribution algorithm; and a collision avoidance method is proposed to generate collision-free inspection paths with high efficiency. Information mapping is performed between Neutral Interchange Files (NIFs), such as STEP, DML, DMIS, and XML, to realize information integration with other activities in the product lifecycle such as design, manufacturing, and inspection execution. Simulation was carried out to demonstrate the feasibility of the proposed system. As a result, the inspection process becomes simpler and good results can be obtained based on the integration in PLM.

  4. Simulation-Based Optimization for Storage Allocation Problem of Outbound Containers in Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    Ning Zhao

    2015-01-01

    Storage allocation of outbound containers is a key factor in the performance of the container handling system in automated container terminals. Improper storage plans for outbound containers make quay crane (QC) waiting inevitable; hence, the vessel handling time will be lengthened. A simulation-based optimization method is proposed in this paper for the storage allocation problem of outbound containers in automated container terminals (SAPOBA). A simulation model is built with a Timed Colored Petri Net (TCPN) and used to evaluate the QC waiting time of storage plans. Two optimization approaches, based on Particle Swarm Optimization (PSO) and a Genetic Algorithm (GA), are proposed to form the complete simulation-based optimization method. The effectiveness of this method is verified by experiments comparing the two optimization approaches.

  5. Advanced Coordinating Control System for Power Plant

    Institute of Scientific and Technical Information of China (English)

    WU Peng; WEI Shuangying

    2006-01-01

    The coordinating control system is widely used in power plants. This paper describes advanced coordinating control in terms of control methods and optimal operation, and introduces their principles and features using examples from power plant operation. This is valuable for automation applications in optimal power plant operation.

  6. A sensitivity-based coordination method for optimization of product families

    Science.gov (United States)

    Zou, Jun; Yao, Wei-Xing; Xia, Tian-Xiang

    2016-07-01

    This article provides an introduction to a decomposition-based method for the optimization of product families with predefined platforms. To improve the efficiency of the system coordinator, a new sensitivity-based coordination method (SCM) is proposed. The key idea in SCM is that the system level coordinates the shared variables by using sensitivity information to make trade-offs between the product subsystems. The coordinated shared variables are determined by minimizing the performance deviation with respect to the optimal design of subproblems and the constraint violation incurred by sharing. Each subproblem has a significant degree of independence and can be solved in a simultaneous way. The numerical performance of SCM is investigated, and the results suggest that the new approach is robust and leads to a substantial reduction in computational effort compared with the analytical target cascading method. Then, the proposed methodology is applied to the structural optimization of a family of automotive body side-frames.

  7. Optimal number of stimulation contacts for coordinated reset neuromodulation

    Directory of Open Access Journals (Sweden)

    Borys eLysyansky

    2013-07-01

    Full Text Available In this computational study we investigate coordinated reset (CR) neuromodulation designed for an effective control of synchronization by multi-site stimulation of neuronal target populations. This method was suggested to effectively counteract pathological neuronal synchrony characteristic of several neurological disorders. We study how many stimulation sites are required for optimal CR-induced desynchronization. We found that a moderate increase of the number of stimulation sites may significantly prolong the post-stimulation desynchronized transient after the stimulation is completely switched off. This can, in turn, reduce the amount of the administered stimulation current for the intermittent ON-OFF CR stimulation protocol, where time intervals with stimulation ON are recurrently followed by time intervals with stimulation OFF. In addition, we found that the optimal number of stimulation sites essentially depends on how strongly the administered current decays within the neuronal tissue with increasing distance from the stimulation site. In particular, for a broad spatial stimulation profile, i.e., for a weak spatial decay rate of the stimulation current, CR stimulation can optimally be delivered via a small number of stimulation sites. Our findings may contribute to an optimization of therapeutic applications of CR neuromodulation.

  8. Multi-objective intelligent coordinating optimization blending system based on qualitative and quantitative synthetic model

    Institute of Scientific and Technical Information of China (English)

    WANG Ya-lin; MA Jie; GUI Wei-hua; YANG Chun-hua; ZHANG Chuan-fu

    2006-01-01

    A multi-objective intelligent coordinating optimization strategy based on a qualitative and quantitative synthetic model for the Pb-Zn sintering blending process was proposed to obtain the optimal mixture ratio. The mechanism and neural network quantitative models for predicting compositions and the rule models for expert reasoning were constructed based on statistical data and empirical knowledge. An expert reasoning method based on these models was proposed to solve the blending optimization problem, including multi-objective optimization for the first blending process and area optimization for the second blending process, and to determine the optimal mixture ratio that meets the requirement of intelligent coordination. The results show that the qualified rates of agglomerate Pb, Zn and S compositions are increased by 7.1%, 6.5% and 6.9%, respectively, and the fluctuation of sintering permeability is reduced by 7.0%, which effectively stabilizes the agglomerate compositions and the permeability.

  9. Novel Particle Swarm Optimization and Its Application in Calibrating the Underwater Transponder Coordinates

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A novel improved particle swarm algorithm named competition particle swarm optimization (CPSO) is proposed to calibrate the underwater transponder coordinates. To improve the performance of the algorithm, the time-varying acceleration coefficient (TVAC) strategy is introduced into CPSO to produce an extended competition particle swarm optimization (ECPSO). The proposed method is tested on a set of 10 standard optimization benchmark problems and the results are compared with those obtained through existing PSO algorithms: basic particle swarm optimization (BPSO), linear decreasing inertia weight particle swarm optimization (LWPSO), exponential inertia weight particle swarm optimization (EPSO), and time-varying acceleration coefficient (TVAC) PSO. The results demonstrate that CPSO and ECPSO offer faster search speed, accuracy, and stability. The search performance of ECPSO on multimodal functions is superior to that of CPSO. Finally, calibration of the underwater transponder coordinates is presented using the particle swarm algorithms, and the novel improved particle swarm algorithm shows better performance than the other algorithms.

  10. Axon Membrane Skeleton Structure is Optimized for Coordinated Sodium Propagation

    CERN Document Server

    Zhang, Yihao; Li, He; Tzingounis, Anastasios V; Lykotrafitis, George

    2016-01-01

    Axons transmit action potentials with high fidelity and minimal jitter. This unique capability is likely the result of the spatiotemporal arrangement of sodium channels along the axon. Super-resolution microscopy recently revealed that the axon membrane skeleton is structured as a series of actin rings connected by spectrin filaments that are held under entropic tension. Sodium channels also exhibit a periodic distribution pattern, as they bind to ankyrin G, which associates with spectrin. Here, we elucidate the relationship between the axon membrane skeleton structure and the function of the axon. By combining cytoskeletal dynamics and continuum diffusion modeling, we show that spectrin filaments under tension minimize the thermal fluctuations of sodium channels and prevent overlap of neighboring channel trajectories. Importantly, this axon skeletal arrangement allows for a highly reproducible band-like activation of sodium channels leading to coordinated sodium propagation along the axon.

  11. Temporal mammogram image registration using optimized curvilinear coordinates.

    Science.gov (United States)

    Abdel-Nasser, Mohamed; Moreno, Antonio; Puig, Domenec

    2016-04-01

    Registration of mammograms plays an important role in breast cancer computer-aided diagnosis systems. Radiologists usually compare mammogram images in order to detect abnormalities, and this comparison requires a registration between them. A temporal mammogram registration method is proposed in this paper. It is based on curvilinear coordinates, which are utilized to cope with both global and local deformations in the breast area. Temporal mammogram pairs are used to validate the proposed method. After registration, the similarity between the mammograms is maximized, and the distance between manually defined landmarks is decreased. In addition, a thorough comparison with state-of-the-art mammogram registration methods is performed to show its effectiveness.

  12. Hybrid optimal online-overnight charging coordination of plug-in electric vehicles in smart grid

    Science.gov (United States)

    Masoum, Mohammad A. S.; Nabavi, Seyed M. H.

    2016-10-01

    Optimal coordinated charging of plug-in electric vehicles (PEVs) in a smart grid (SG) can be beneficial for both consumers and utilities. This paper proposes a hybrid optimal online followed by overnight charging coordination of high- and low-priority PEVs using discrete particle swarm optimization (DPSO) that considers the benefits of both consumers and electric utilities. The objective functions are online minimization of total cost (associated with grid losses and energy generation) and overnight valley filling through minimization of the total load levels. The constraints include substation transformer loading, node voltage regulation and the requested final battery state of charge levels (SOCreq). The main challenge is the optimal selection of the overnight starting time (t_optimal-overnight,start) to guarantee charging of all vehicle batteries to the SOCreq levels before the requested plug-out times (t_req), which is done by simultaneously solving the online and overnight objective functions. The online-overnight PEV coordination approach is implemented on a 449-node SG; results are compared for uncoordinated and coordinated battery charging as well as a modified strategy using cost minimization for both online and overnight coordination. The impact of t_optimal-overnight,start on the performance of the proposed PEV coordination is investigated.

  13. Path optimization by a variational reaction coordinate method. II. Improved computational efficiency through internal coordinates and surface interpolation.

    Science.gov (United States)

    Birkholz, Adam B; Schlegel, H Bernhard

    2016-05-14

    Reaction path optimization is being used more frequently as an alternative to the standard practice of locating a transition state and following the path downhill. The Variational Reaction Coordinate (VRC) method was proposed as an alternative to chain-of-states methods like nudged elastic band and string method. The VRC method represents the path using a linear expansion of continuous basis functions, allowing the path to be optimized variationally by updating the expansion coefficients to minimize the line integral of the potential energy gradient norm, referred to as the Variational Reaction Energy (VRE) of the path. When constraints are used to control the spacing of basis functions and to couple the minimization of the VRE with the optimization of one or more individual points along the path (representing transition states and intermediates), an approximate path as well as the converged geometries of transition states and intermediates along the path are determined in only a few iterations. This algorithmic efficiency comes at a high per-iteration cost due to numerical integration of the VRE derivatives. In the present work, methods for incorporating redundant internal coordinates and potential energy surface interpolation into the VRC method are described. With these methods, the per-iteration cost, in terms of the number of potential energy surface evaluations, of the VRC method is reduced while the high algorithmic efficiency is maintained.

  14. Coordinated Optimization of Aircraft Routes and Locations of Ground Sensors

    Science.gov (United States)

    2014-09-17

    Optimization approaches allowing minimization of the total cost subject to linear inequality constraints can be formulated in terms of the binary... the calculations are not trivial. Coverage matrix A enters the inequality coverage... of target (a person versus a car): 8 ± 2 px; identification of the target (a woman versus a man, a specific car): 13 ± 3 px. These criteria...

  15. Automated Optimization of Walking Parameters for the Nao Humanoid Robot

    NARCIS (Netherlands)

    Girardi, N.; Kooijman, C.; Wiggers, A.J.; Visser, A.

    2013-01-01

    This paper describes a framework for optimizing walking parameters for a Nao humanoid robot. In this case an omnidirectional walk is learned. The parameters are learned in simulation with an evolutionary approach. The best performance was obtained for a combination of a low mutation rate and a high

  16. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

    Full Text Available We propose an optimal electric energy management scheme for a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute the computational burden and yet make optimal 24-hour energy management of the multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operating conditions and illustrated with a physical interpretation of how to achieve optimal energy management in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by ancillary internal trading between the microgrids in the cooperative community, which reduces the extra cost of unnecessary external trading by adjusting the electric energy production of the combined heat and power (CHP) generators and the amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.

  17. A new comprehensive genetic algorithm method for optimal overcurrent relays coordination

    Energy Technology Data Exchange (ETDEWEB)

    Razavi, Farzad; Abyaneh, Hossein Askarian; Mohammadi, Reza [Department of Electrical Engineering, Amirkabir University of Technology (Iran); Al-Dabbagh, Majid [Hydro Tasmania Consulting (Australia); Torkaman, Hossein [Department of Electrical Engineering, Shahid Beheshti University (Iran)

    2008-04-15

    For optimal coordination of overcurrent relays, linear programming techniques such as simplex, two-phase simplex and dual simplex are used. Another approach to the optimal coordination problem is to use an artificial intelligence technique such as the genetic algorithm (GA). In this paper, a powerful optimal coordination method based on GA is introduced. The objective function (OF) is developed to solve the problems of miscoordination and of continuous or discrete time setting multiplier (TSM) or time dial setting (TDS). In other words, the novelty of the paper is the modification of the existing GA objective function, by introducing a new parameter and adding a new term to the OF, to handle miscoordination problems for both continuous and discrete TSM or TDS. The method is applied to two different power system networks and the obtained results reveal that the new method is efficient, accurate and flexible. (author)
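
    To make the idea of an objective function with a miscoordination penalty concrete, the following sketch evaluates a relay-coordination candidate using the standard IEC inverse-time characteristic and penalizes primary/backup pairs whose margin falls below the coordination time interval. This is a generic formulation for illustration only, not the modified OF introduced in the paper, and all numeric values are assumptions.

        def relay_time(tsm, m):
            """IEC standard-inverse characteristic: t = TSM * 0.14 / (M**0.02 - 1)."""
            return tsm * 0.14 / (m ** 0.02 - 1.0)

        def coordination_of(tsm_values, pairs, cti=0.3, alpha=100.0):
            """Total operating time for near-end faults plus a penalty for every
            primary/backup pair whose margin violates the coordination time interval.
            `pairs` holds (primary_idx, backup_idx, M_primary, M_backup) tuples."""
            total = sum(relay_time(tsm, 5.0) for tsm in tsm_values)  # assumed fault multiple M = 5
            penalty = 0.0
            for prim, back, m_p, m_b in pairs:
                margin = relay_time(tsm_values[back], m_b) - relay_time(tsm_values[prim], m_p) - cti
                if margin < 0:                      # miscoordination
                    penalty += alpha * (-margin)
            return total + penalty

        # Two relays; relay 1 backs up relay 0, seeing the same fault with M = 5 and M = 3:
        print(coordination_of([0.1, 0.2], [(0, 1, 5.0, 3.0)]))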

  18. Analytical study on coordinative optimization of convection in tubes with variable heat flux

    Institute of Scientific and Technical Information of China (English)

    YUAN Zhongxian; ZHANG Jianguo; JIANG Mingjian

    2004-01-01

    The laminar heat transfer in the thermal entrance region in round tubes, which has a variable surface heat flux boundary condition, is analytically studied. The results show that the heat transfer coefficient is closely related to the wall temperature gradient along the tube axis. The greater the gradient, the higher the heat transfer rate. Furthermore, the coordination of the velocity and the temperature gradient fields is also analysed under different surface heat fluxes. The validity of the field coordination principle is verified by checking the correlation of heat transfer coefficient and the coordination degree. The results also demonstrate that optimizing the thermal boundary condition is a way to enhance heat transfer.

  19. ORDER-PICKING OPTIMIZATION FOR AUTOMATED PICKING SYSTEM WITH PARALLEL DISPENSERS

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the characteristics of parallel dispensers in an automated picking system, an order-picking optimization problem is presented. Firstly, the working principle of parallel dispensers is introduced, which implies that the time cost of picking each order is influenced by the order-picking sequence. The order-picking optimization problem can therefore be classified as a dynamic traveling salesman problem (TSP). Then a mathematical model of the problem is established and an improved max-min ant system (MMAS) is adopted to solve the model. The improvement includes two aspects: the initial assignment of ants depends on a probabilistic formula instead of a random deployment, and the heuristic factor is expressed by the extra picking time of each order instead of the total. At last, an actual simulation is made on an automated picking system with parallel dispensers. The simulation results demonstrate the value of the optimization and the validity of the improvements to MMAS.
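
    A minimal sketch of the MMAS ingredients mentioned above (bounded pheromone trails, reinforcement of the best tour only, and a heuristic factor based on the extra picking time of each order) is given below. The function extra_picking_time is a hypothetical placeholder for the parallel-dispenser timing model, and the parameter values are illustrative rather than those used in the paper.

        import random

        def extra_picking_time(prev_order, next_order):
            """Hypothetical stand-in for the parallel-dispenser timing model:
            extra time incurred by picking `next_order` directly after `prev_order`."""
            return abs(hash((prev_order, next_order))) % 10 + 1

        def mmas_sequence(orders, ants=10, iters=50, alpha=1.0, beta=2.0,
                          rho=0.1, tau_min=0.1, tau_max=5.0):
            tau = {(i, j): tau_max for i in orders for j in orders if i != j}
            best_seq, best_cost = None, float("inf")
            for _ in range(iters):
                for _ in range(ants):
                    start = random.choice(orders)
                    seq, unvisited, cost = [start], set(orders) - {start}, 0.0
                    while unvisited:
                        cur = seq[-1]
                        # heuristic factor based on the *extra* picking time of each candidate
                        weights = [(j, tau[(cur, j)] ** alpha *
                                    (1.0 / extra_picking_time(cur, j)) ** beta)
                                   for j in unvisited]
                        r, acc, nxt = random.random() * sum(w for _, w in weights), 0.0, weights[-1][0]
                        for j, w in weights:
                            acc += w
                            if acc >= r:
                                nxt = j
                                break
                        cost += extra_picking_time(cur, nxt)
                        seq.append(nxt)
                        unvisited.remove(nxt)
                    if cost < best_cost:
                        best_seq, best_cost = seq, cost
                # MMAS: evaporate, reinforce only the best tour so far, clamp to [tau_min, tau_max]
                for key in tau:
                    tau[key] = min(max(tau[key] * (1.0 - rho), tau_min), tau_max)
                for a, b in zip(best_seq, best_seq[1:]):
                    tau[(a, b)] = min(tau[(a, b)] + 1.0 / best_cost, tau_max)
            return best_seq, best_cost

        print(mmas_sequence(list(range(6))))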

  20. Two-phase strategy of controlling motor coordination determined by task performance optimality.

    Science.gov (United States)

    Shimansky, Yury P; Rand, Miya K

    2013-02-01

    A quantitative model of optimal coordination between hand transport and grip aperture has been derived in our previous studies of reach-to-grasp movements without utilizing explicit knowledge of the optimality criterion or motor plant dynamics. The model's utility for experimental data analysis has been demonstrated. Here we show how to generalize this model for a broad class of reaching-type, goal-directed movements. The model allows for measuring the variability of motor coordination and studying its dependence on movement phase. The experimentally found characteristics of that dependence imply that execution noise is low and does not affect motor coordination significantly. From those characteristics it is inferred that the cost of neural computations required for information acquisition and processing is included in the criterion of task performance optimality as a function of precision demand for state estimation and decision making. The precision demand is an additional optimized control variable that regulates the amount of neurocomputational resources activated dynamically. It is shown that an optimal control strategy in this case comprises two different phases. During the initial phase, the cost of neural computations is significantly reduced at the expense of reducing the demand for their precision, which results in speed-accuracy tradeoff violation and significant inter-trial variability of motor coordination. During the final phase, neural computations and thus motor coordination are considerably more precise to reduce the cost of errors in making a contact with the target object. The generality of the optimal coordination model and the two-phase control strategy is illustrated on several diverse examples.

  1. Geometry Optimization of Crystals by the Quasi-Independent Curvilinear Coordinate Approximation

    CERN Document Server

    Németh, K

    2005-01-01

    The quasi-independent curvilinear coordinate approximation (QUICCA) method [K. Németh and M. Challacombe, J. Chem. Phys. 121, 2877 (2004)] is extended to the optimization of crystal structures. We demonstrate that QUICCA is valid under periodic boundary conditions, enabling simultaneous relaxation of the lattice and atomic coordinates, as illustrated by tight optimization of polyethylene, hexagonal boron-nitride, a (10,0) carbon-nanotube, hexagonal ice, quartz and sulfur at the Γ-point RPBE/STO-3G level of theory.

  2. Constraints Adjustment and Objectives Coordination of Satisfying Optimal Control Applied to Heavy Oil Fractionators

    Institute of Scientific and Technical Information of China (English)

    邹涛; 李少远

    2005-01-01

    In this paper, the feasibility and objectives coordination of real-time optimization (RTO) are systemically investigated under soft constraints. The reason for requiring soft constraints adjustment and objective relaxation simultaneously is that the result is not satisfactory when the feasible region is apart from the desired working point or the optimization problem is infeasible. The mixed logic method is introduced to describe the priority of the constraints and objectives, thereby the soft constraints adjustment and objectives coordination are solved together in RTO. A case study on the Shell heavy oil fractionators benchmark problem illustrating the method is finally presented.

  3. Automation for pattern library creation and in-design optimization

    Science.gov (United States)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.

  4. Dynamic Coordinated Shifting Control of Automated Mechanical Transmissions without a Clutch in a Plug-In Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Xinlei Liu

    2012-08-01

    Full Text Available On the basis of the shifting process of automated mechanical transmissions (AMTs) for traditional hybrid electric vehicles (HEVs), and by combining the features of electric machines with fast response speed, the dynamic model of the hybrid electric AMT vehicle powertrain is built up, the dynamic characteristics of each phase of the shifting process are analyzed, and a control strategy in which the torque and speed of the engine and electric machine are coordinatively controlled to achieve AMT shifting control for a plug-in hybrid electric vehicle (PHEV) without a clutch is proposed. In the shifting process, the engine and electric machine are well controlled, and the shift jerk and the power interruption and restoration time are reduced. Simulation and real car test results show that the proposed control strategy can more efficiently improve the shift quality for PHEVs equipped with AMTs.

  5. Growing string method with interpolation and optimization in internal coordinates: method and examples.

    Science.gov (United States)

    Zimmerman, Paul M

    2013-05-14

    The growing string method (GSM) has proven especially useful for locating chemical reaction paths at low computational cost. While many string methods use Cartesian coordinates, these methods can be substantially improved by changes in the coordinate system used for interpolation and optimization steps. The quality of the interpolation scheme is especially important because it determines how close the initial path is to the optimized reaction path, and this strongly affects the rate of convergence. In this article, a detailed description of the generation of internal coordinates (ICs) suitable for use in GSM as reactive tangents and in string optimization is given. Convergence of reaction paths is smooth because the IC tangent and orthogonal directions are better representations of chemical bonding compared to Cartesian coordinates. This is not only important quantitatively for reducing computational cost but also allows reaction paths to be described with smoothly varying chemically relevant coordinates. Benchmark computations with challenging reactions are compared to previous versions of GSM and show significant speedups. Finally, a climbing image scheme is included to improve the quality of the transition state approximation, ensuring high reliability of the method.

  6. Optimization of a filter-lysis protocol to purify rat testicular homogenates for automated spermatid counting.

    Science.gov (United States)

    Pacheco, Sara E; Anderson, Linnea M; Boekelheide, Kim

    2012-01-01

    Quantifying testicular homogenization-resistant spermatid heads (HRSH) is a powerful indicator of spermatogenesis. These counts have traditionally been performed manually using a hemocytometer, but this method can be time consuming and biased. We aimed to develop a protocol to reduce debris for the application of automated counting, which would allow for efficient and unbiased quantification of rat HRSH. We developed a filter-lysis protocol that effectively removes debris from rat testicular homogenates. After filtering and lysing the homogenates, we found no statistical differences between manual (classic and filter-lysis) and automated (filter-lysis) counts using 1-way analysis of variance with Bonferroni's multiple comparison test. In addition, Pearson's correlation coefficients were calculated to compare the counting methods, and there was a strong correlation between the classic manual counts and the filter-lysis manual (r = 0.85, P = .002) and the filter-lysis automated (r = 0.89, P = .0005) counts. We also tested the utility of the automated method in a low-dose exposure model known to decrease HRSH. Adult Fischer 344 rats exposed to 0.33% 2,5-hexanedione in the drinking water for 12 weeks demonstrated decreased body (P = .02) and testes (P = .002) weights. In addition, there was a significant reduction in the number of HRSH per testis (P = .002) when compared to controls. A filter-lysis protocol was optimized to purify rat testicular homogenates for automated HRSH counts. Automated counting systems yield unbiased data and can be applied to detect changes in the testis after low-dose toxicant exposure.

  7. Coordinated Optimization of Distributed Energy Resources and Smart Loads in Distribution Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui; Zhang, Yingchen

    2016-08-01

    Distributed energy resources (DERs) and smart loads have the potential to provide flexibility to the distribution system operation. A coordinated optimization approach is proposed in this paper to actively manage DERs and smart loads in distribution systems to achieve the optimal operation status. A three-phase unbalanced Optimal Power Flow (OPF) problem is developed to determine the output from DERs and smart loads with respect to the system operator's control objective. This paper focuses on coordinating PV systems and smart loads to improve the overall voltage profile in distribution systems. Simulations have been carried out in a 12-bus distribution feeder and results illustrate the superior control performance of the proposed approach.

  8. Coordinated Optimization of Distributed Energy Resources and Smart Loads in Distribution Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui; Zhang, Yingchen

    2016-11-14

    Distributed energy resources (DERs) and smart loads have the potential to provide flexibility to the distribution system operation. A coordinated optimization approach is proposed in this paper to actively manage DERs and smart loads in distribution systems to achieve the optimal operation status. A three-phase unbalanced Optimal Power Flow (OPF) problem is developed to determine the output from DERs and smart loads with respect to the system operator's control objective. This paper focuses on coordinating PV systems and smart loads to improve the overall voltage profile in distribution systems. Simulations have been carried out in a 12-bus distribution feeder and results illustrate the superior control performance of the proposed approach.

  9. A Novel Optimization Tool for Automated Design of Integrated Circuits based on MOGSA

    Directory of Open Access Journals (Sweden)

    Maryam Dehbashian

    2011-11-01

    Full Text Available In this paper a novel optimization method based on the Multi-Objective Gravitational Search Algorithm (MOGSA) is presented for the automated design of analog integrated circuits. The recommended method first simulates a selected circuit using a circuit simulator, then optimizes the simulated results with the MOGSA algorithm, and repeats this process until the optimum result is reached. The main programs of the proposed method have been implemented in MATLAB, while the analog circuits are simulated with HSPICE. To show the capability of this method, its proficiency is examined in the optimization of analog integrated circuit design. In this paper, an analog circuit sizing scheme, the optimum automated design of a temperature-independent differential op-amp using a Widlar current source, is illustrated as a case study. The computer results obtained from implementing this method indicate that the design specifications are closely met. Moreover, according to various design criteria, this tool can give designers more options to choose a desirable scheme by proposing a varied set of answers. MOGSA, the proposed algorithm, introduces a novel method for multi-objective optimization on the basis of the Gravitational Search Algorithm, in which the concept of Pareto optimality is used to determine non-dominated positions and an external repository is kept to store these positions. To ensure the accuracy of MOGSA's performance, the algorithm is validated using several standard test functions from the specialized literature. The final results indicate that our method is highly competitive with current multi-objective optimization algorithms.

  10. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Seosaeng (Korea, Republic of); Kim, Man Cheol [Chung-Ang University, Seoul (Korea, Republic of)

    2015-05-15

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should be considered together to determine the appropriate level of automation. Thus, in this paper, we suggest an estimation method that considers the positive and negative effects of automation at the same time. Because this concept alone does not account for the effects of automation on human operators, a new estimation method for the automation rate is suggested to overcome this limitation.

  11. Optimal Coordinated Control of Power Extraction in LES of a Wind Farm with Entrance Effects

    Directory of Open Access Journals (Sweden)

    Jay P. Goit

    2016-01-01

    Full Text Available We investigate the use of optimal coordinated control techniques in large eddy simulations of wind farm boundary layer interaction with the aim of increasing the total energy extraction in wind farms. The individual wind turbines are considered as flow actuators, and their energy extraction is dynamically regulated in time so as to optimally influence the flow field. We extend earlier work on wind farm optimal control in the fully-developed regime (Goit and Meyers 2015, J. Fluid Mech. 768, 5–50) to a ‘finite’ wind farm case, in which entrance effects play an important role. For the optimal control, a receding horizon framework is employed in which turbine thrust coefficients are optimized in time and per turbine. Optimization is performed with a conjugate gradient method, where gradients of the cost functional are obtained using adjoint large eddy simulations. Overall, the energy extraction is increased by 7% by the optimal control. This increase in energy extraction is related to faster wake recovery throughout the farm. For the first row of turbines, the optimal control increases turbulence levels and Reynolds stresses in the wake, leading to better wake mixing and an inflow velocity for the second row that is significantly higher than in the uncontrolled case. For downstream rows, the optimal control mainly enhances the sideways mean transport of momentum. This is different from earlier observations by Goit and Meyers (2015) in the fully-developed regime, where mainly vertical transport was enhanced.

  12. Optimal combined overcurrent and distance relays co-ordination using a new genetic algorithm method

    Energy Technology Data Exchange (ETDEWEB)

    Kamangar, S.S.H.; Abyaneh, H.A.; Chabanloo, R.M. [Amirkabir Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Electrical Engineering; Razavi, F. [Tafresh Univ. (Iran, Islamic Republic of). Dept. of Electrical Engineering

    2010-04-15

    This paper introduced a new method to optimize the coordination of overcurrent (OC) relays using genetic algorithm (GA). GA is an intelligent optimization technique that can adjust the setting of relays without being based on an initial guess or trapped in the local minimum values, which is the disadvantage of linear programming techniques, such as simplex, 2-phase simplex, and dual simplex techniques. The objective function (OF) of GA is modified by adding a new term to OF to fulfill the coordination of both OC and distance relays. Two power network systems were analyzed using the new computer program, and the results that were obtained show that the method is both efficient and accurate. Transmission and subtransmission protection systems commonly use OC and distance relays. 12 refs., 6 tabs., 5 figs.

  13. Automated Software Testing Using Metahurestic Technique Based on An Ant Colony Optimization

    CERN Document Server

    Srivastava, Praveen Ranjan

    2011-01-01

    Software testing is an important and valuable part of the software development life cycle. Owing to time, cost and other circumstances, exhaustive testing is not feasible, so there is a need to automate the software testing process. Testing effectiveness can be achieved by State Transition Testing (STT), which is commonly used in real-time, embedded and web-based software systems. The aim of the current paper is to present an algorithm, based on an ant colony optimization technique, for the generation of optimal and minimal test sequences for the behavior specification of software. The approach generates test sequences in order to obtain complete software coverage. This paper also discusses the comparison between two metaheuristic techniques (Genetic Algorithm and Ant Colony Optimization) for transition-based testing.

  14. The optimization of total laboratory automation by simulation of a pull-strategy.

    Science.gov (United States)

    Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo

    2015-01-01

    Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
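
    The following time-stepped sketch illustrates the CONWIP pull idea referred to above: the number of specimens circulating in the line is capped, and a queued specimen is released only when a circulating one finishes. The arrival pattern and service times are invented for the illustration and do not reflect the case-study data; a simulation-optimization wrapper would sweep the WIP cap and pick the best trade-off.

        import collections, random

        def conwip_line(arrivals, wip_cap, base_service=3, horizon=500):
            """Time-stepped sketch of a CONWIP pull mechanism: at most `wip_cap`
            specimens circulate; a queued specimen is released only when a
            circulating one completes and frees its token."""
            queue = collections.deque()      # specimens waiting for a CONWIP token
            in_line = []                     # [arrival_time, remaining_service] per released specimen
            cycle_times = []
            for t in range(horizon):
                for _ in range(arrivals(t)):             # tubes arriving at the inlet module
                    queue.append(t)
                while queue and len(in_line) < wip_cap:  # pull release
                    in_line.append([queue.popleft(), base_service + random.randint(0, 2)])
                for job in in_line:
                    job[1] -= 1                          # one time unit of processing
                for job in [j for j in in_line if j[1] <= 0]:
                    in_line.remove(job)
                    cycle_times.append(t - job[0])       # waiting + processing time
            return len(cycle_times), sum(cycle_times) / max(len(cycle_times), 1)

        for cap in (3, 5, 8):
            print(cap, conwip_line(lambda t: 1 if t % 2 == 0 else 0, wip_cap=cap))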

  15. A Grid Model for the Design, Coordination and Dimensional Optimization in Architecture.

    OpenAIRE

    Léonard, Daniel; Malcurat, Olivier

    2008-01-01

    Our article treats layout grids in architecture and their use by architects for the purposes not only of design but also of dimensional coordination and optimization. It initially proposes to define an architectural grid model as well as a set of operations to construct them. Then, it discusses this model and its capacity to assist designers in their everyday work of (re)dimensioning. The architectural grid, as an instrument of design, is omnipresent in the work of architects whatever ...

  16. Optimal use of visual information in adolescents and young adults with developmental coordination disorder

    OpenAIRE

    2014-01-01

    Recent reports offer contrasting views on whether or not the use of online visual control is impaired in individuals with developmental coordination disorder (DCD). This study explored the optimal temporal basis for processing and using visual information in adolescents and young adults with DCD. Participants were 22 adolescents and young adults (12 males and 10 females; M = 19 years, SD = 3). Half had been diagnosed with DCD as children and still performed poorly on the movement assessment battery for children...

  17. OPTIMAL SUBSTRUCTURE OF SET-VALUED SOLUTIONS OF NORMAL-FORM GAMES AND COORDINATION

    Institute of Scientific and Technical Information of China (English)

    Norimasa KOBAYASHI; Kyoichi KIJIMA

    2009-01-01

    A number of solution concepts for normal-form games have been proposed in the literature on subspaces of action profiles that have Nash-type stability. While the literature mainly focuses on the minimal such stable subspaces, this paper argues that non-minimal stable subspaces represent well the multi-agent situations to which neither Nash equilibrium nor rationalizability may be applied with satisfaction. As theoretical support, the authors prove the optimal substructure of stable subspaces with respect to the restriction of a game. It is further argued that the optimal substructure characterizes hierarchical diversity of coordination and interim phases in learning.

  18. Energy Coordinative Optimization of Wind-Storage-Load Microgrids Based on Short-Term Prediction

    Directory of Open Access Journals (Sweden)

    Changbin Hu

    2015-02-01

    Full Text Available According to the topological structure of wind-storage-load complementary microgrids, this paper proposes a method for energy coordinative optimization which focuses on improving the economic benefits of microgrids within a prediction framework. First of all, the external-characteristic mathematical models of the distributed generation (DG) units, including wind turbines and storage batteries, are established according to the actual constraints. Meanwhile, using the minimum consumption cost from the external grid as the objective function, a grey prediction model with residual modification is introduced to output the predicted wind turbine power and load for specific periods. Second, based on the basic framework of receding horizon optimization, an intelligent genetic algorithm (GA) is applied to find the optimum solution over the prediction horizon for the complex non-linear coordination control model of the microgrid. The optimum results of the GA are compared with the receding solution of mixed integer linear programming (MILP). The obtained results show that the method is a viable approach for energy coordinative optimization of microgrid systems, yielding reasonable energy flow and scheduling. The effectiveness and feasibility of the proposed method are verified by examples.
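
    A bare-bones view of the receding-horizon loop used in this kind of scheme (forecast over the horizon, optimize, apply only the first decision, roll forward) is sketched below. The functions toy_forecast and toy_optimizer are hypothetical stand-ins for the grey prediction model and the GA, respectively; none of the numbers come from the paper.

        def receding_horizon_dispatch(forecast, optimize_horizon, horizon, steps):
            """Generic receding-horizon loop: at every step, forecast over the next
            `horizon` periods, solve the horizon problem, apply only the first decision."""
            applied = []
            for t in range(steps):
                wind_hat, load_hat = forecast(t, horizon)    # e.g. grey prediction GM(1,1)
                plan = optimize_horizon(wind_hat, load_hat)  # e.g. GA over storage/grid power
                applied.append(plan[0])                      # implement the first period only
            return applied

        # Hypothetical stand-ins, only to make the loop runnable:
        def toy_forecast(t, h):
            wind = [50 + 5 * ((t + k) % 3) for k in range(h)]
            load = [60 + 10 * ((t + k) % 4) for k in range(h)]
            return wind, load

        def toy_optimizer(wind, load):
            # trivial rule in place of the GA: buy from the grid whatever wind cannot cover
            return [max(l - w, 0.0) for w, l in zip(wind, load)]

        print(receding_horizon_dispatch(toy_forecast, toy_optimizer, horizon=4, steps=6))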

  19. Automated in-situ optimization of bimorph mirrors at Diamond Light Source

    Science.gov (United States)

    Sutter, John P.; Alcock, Simon G.; Sawhney, Kawal J. S.

    2011-09-01

    Bimorph mirrors are used on many synchrotron beamlines to focus or collimate light. They are highly adaptable because not only their overall figure but also their local slope errors can be corrected. However, the optimization procedure is complex. At Diamond Light Source, highly repeatable and accurate pencil beam measurements are used to determine a mirror's slope errors. These data are then used by automated scripts to calculate the necessary corrections. This procedure may be applied to any type of active mirror, but for hard X-ray mirrors, diffraction from the slits must be considered.

  20. RootGraph: a graphic optimization tool for automated image analysis of plant roots.

    Science.gov (United States)

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J

    2015-11-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions.

  1. Automation of Optimized Gabor Filter Parameter Selection for Road Cracks Detection

    Directory of Open Access Journals (Sweden)

    Haris Ahmad Khan

    2016-03-01

    Full Text Available Automated systems for road crack detection are extremely important in road maintenance for vehicle safety and traveler comfort. Emerging cracks in roads need to be detected and repaired as early as possible to avoid further damage, thus reducing rehabilitation cost. In this paper, a robust method for Gabor filter parameter optimization for automatic road crack detection is discussed. The Gabor filter has been used in previous literature for similar applications. However, there is a need for automatic selection of optimized Gabor filter parameters due to variation in the texture of roads and cracks. The problem of the changing background, which in fact is the road texture, is addressed through a learning process that uses synthetic road crack generation for Gabor filter parameter tuning. The tuned parameters are then tested on real cracks and a thorough quantitative analysis is performed for performance evaluation.

  2. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-08

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation double- and single-ended transition-state optimization algorithms--the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in the previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  3. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    CERN Document Server

    Suleimanov, Yury V

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation single- and double-ended transition-state optimization algorithms - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in the previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the possibility of discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  4. Automated ARGET ATRP Accelerates Catalyst Optimization for the Synthesis of Thiol-Functionalized Polymers.

    Science.gov (United States)

    Siegwart, Daniel J; Leiendecker, Matthias; Langer, Robert; Anderson, Daniel G

    2012-02-14

    Conventional synthesis of polymers by ATRP is relatively low throughput, involving iterative optimization of conditions in an inert atmosphere. Automated, high-throughput controlled radical polymerization was developed to accelerate catalyst optimization and production of disulfide-functionalized polymers without the need of an inert gas. Using ARGET ATRP, polymerization conditions were rapidly identified for eight different monomers, including the first ARGET ATRP of 2-(diethylamino)ethyl methacrylate and di(ethylene glycol) methyl ether methacrylate. In addition, butyl acrylate, oligo(ethylene glycol) methacrylate 300 and 475, 2-(dimethylamino)ethyl methacrylate, styrene, and methyl methacrylate were polymerized using bis(2-hydroxyethyl) disulfide bis(2-bromo-2-methylpropionate) as the initiator, tris(2-pyridylmethyl)amine as the ligand, and tin(II) 2-ethylhexanoate as the reducing agent. The catalyst and reducing agent concentration was optimized specifically for each monomer, and then a library of polymers was synthesized systematically using the optimized conditions. The disulfide-functionalized chains could be cleaved to two thiol-terminated chains upon exposure to dithiothreitol, which may have utility for the synthesis of polymer bioconjugates. Finally, we demonstrated that these new conditions translated perfectly to conventional batch polymerization. We believe the methods developed here may prove generally useful to accelerate the systematic optimization of a variety of chemical reactions and polymerizations.

  5. A coordinated dispatch model for electricity and heat in a Microgrid via particle swarm optimization

    DEFF Research Database (Denmark)

    Xu, Lizhong; Yang, Guangya; Xu, Zhao

    2013-01-01

    This paper develops a coordinated electricity and heat dispatching model for a Microgrid in a day-ahead environment. In addition to operational constraints, network loss and physical limits are addressed in this model, which are often ignored in previous work. As an important component of the Microgrid, a detailed combined heat and power (CHP) model is developed. The part-load performance of the CHP is modeled by a curve fitting method. Furthermore, an electric heater is introduced into the model to improve the economy of Microgrid operation and enhance the flexibility of the Microgrid through electricity-heat conversion. Particle swarm optimization (PSO) is employed to solve this model for the operation schedule, minimizing the total operational cost of the Microgrid by coordinating the CHP, electric heater, boiler and heat storage. The efficacy of the model and methodology is verified with different operation scenarios.
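
    As an illustration of modelling CHP part-load performance by curve fitting, the sketch below fits a quadratic efficiency curve to a few load points. The data points and the quadratic form are assumptions made for the example and are not taken from the paper.

        import numpy as np

        # Illustrative part-load points (fraction of rated electrical output vs. electrical
        # efficiency); the numbers are invented for the sketch, not taken from the paper.
        load_fraction = np.array([0.3, 0.5, 0.7, 0.9, 1.0])
        efficiency    = np.array([0.24, 0.29, 0.32, 0.34, 0.35])

        part_load_eff = np.poly1d(np.polyfit(load_fraction, efficiency, deg=2))  # quadratic fit

        def chp_fuel_power(p_elec, p_rated=100.0):
            """Fuel input (kW) required for electrical output p_elec, via the fitted curve."""
            return p_elec / float(part_load_eff(p_elec / p_rated))

        print(chp_fuel_power(60.0))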

  6. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  7. Multi-objective Genetic Algorithm for System Identification and Controller Optimization of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Xing Wu

    2011-07-01

    Full Text Available This paper presents a multi-objective genetic algorithm (MOGA) with Pareto optimality and elitist tactics for the control system design of an automated guided vehicle (AGV). The MOGA is used to identify the AGV driving system model and then to optimize its servo control system. In system identification, the model identified by the least squares method is adopted as an evolution tutor, which selects the individuals having balanced performance in all objectives as elitists. In controller optimization, the velocity regulating capability required by AGV path tracking is employed as the decision-making preference, which selects Pareto optimal solutions as elitists. According to the different objectives and elitist tactics, several sub-populations are constructed and evolve concurrently using independent reproduction, neighborhood mutation and heuristic crossover. The lossless finite precision method and the multi-objective normalized increment distance are proposed to keep the population diversity with low computational complexity. Experimental results show that the cascaded MOGA has the capability to make the system model consistent with the AGV driving system in both amplitude and phase, and to make its servo control system satisfy the requirements on dynamic performance and steady-state accuracy in AGV path tracking.

  8. Optimal Solution for VLSI Physical Design Automation Using Hybrid Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    I. Hameem Shanavas

    2014-01-01

    Full Text Available In the optimization of VLSI physical design, area minimization and interconnect length minimization are important objectives in the physical design automation of very large scale integration chips. Minimizing the area and interconnect length scales down the size of integrated chips. To meet this objective, it is necessary to find optimal solutions for physical design components such as partitioning, floorplanning, placement, and routing. This work performs the optimization of benchmark circuits with the above components of physical design using a hierarchical approach of evolutionary algorithms. The goals of minimizing the delay in partitioning, the silicon area in floorplanning, the layout area in placement, and the wirelength in routing have an influence on other criteria such as power, clock, speed, and cost. A hybrid evolutionary algorithm is applied to each of these phases to achieve the objective: the evolutionary algorithm includes one or more local search steps within its evolutionary cycles to obtain the minimization of area and interconnect length. This approach combines a hierarchical design using a genetic algorithm and simulated annealing to attain the objective. The hybrid approach can quickly produce optimal solutions for the popular benchmarks.
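
    The hybrid scheme described above pairs a genetic algorithm with a local-search step. A generic skeleton of such a memetic GA, with a short simulated-annealing refinement of each offspring, is sketched below on a permutation encoding; the cost function is made up and merely stands in for the real area and wirelength objectives.

        import math, random

        def hybrid_ga_sa(cost, n_items, pop_size=20, generations=50):
            """Skeleton of a memetic GA: selection/crossover/mutation on permutations,
            plus a short simulated-annealing local search applied to each offspring."""
            def anneal(perm, t0=1.0, cooling=0.9, moves=30):
                cur, cur_cost, t = perm[:], cost(perm), t0
                for _ in range(moves):
                    i, j = random.sample(range(n_items), 2)
                    cand = cur[:]
                    cand[i], cand[j] = cand[j], cand[i]
                    delta = cost(cand) - cur_cost
                    if delta < 0 or random.random() < math.exp(-delta / max(t, 1e-9)):
                        cur, cur_cost = cand, cur_cost + delta
                    t *= cooling
                return cur

            pop = [random.sample(range(n_items), n_items) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randint(1, n_items - 1)          # simplified order crossover
                    child = a[:cut] + [g for g in b if g not in a[:cut]]
                    if random.random() < 0.2:                     # swap mutation
                        i, j = random.sample(range(n_items), 2)
                        child[i], child[j] = child[j], child[i]
                    children.append(anneal(child))                # SA refinement (memetic step)
                pop = parents + children
            return min(pop, key=cost)

        # Made-up permutation cost standing in for area/wirelength objectives:
        print(hybrid_ga_sa(lambda p: sum(abs(p[i] - i) for i in range(len(p))), n_items=8))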

  9. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the developments of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents PnP Process Monitoring and Control Architecture Real-Time Configuration Techniques for PnP Process Monitoring Real-Time Configuration Techniques for PnP Performance Optimization Benchmark Study and Real-Time Implementation Target Groups Researchers and students of Automation and Control Engineering Practitioners in the area of Industrial and Production Engineering The Author Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  10. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    Directory of Open Access Journals (Sweden)

    Zhi-song CHEN

    2012-12-01

    Full Text Available The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with the market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project, considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordination mechanism via a revenue-sharing contract. The results suggest the following: (1) owing to water shortage supplements and the excess water sale policy provided by the FPM, the optimal ordering quantity of water resources is less than that without the FPM, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without the FPM; (2) wholesale pricing and supplementary wholesale pricing with SCB are higher than those without SCB, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without SCB; and (3) considering SCB and introducing the FPM help increase the optimal profits of the whole supply chain, supplier, and external distributor, and improve the efficiency of water resources usage.

  11. Optimal coordination of maximal-effort horizontal and vertical jump motions – a computer simulation study

    Directory of Open Access Journals (Sweden)

    Komura Taku

    2007-06-01

    Full Text Available Abstract. Background: The purpose of this study was to investigate the coordination strategy of maximal-effort horizontal jumping in comparison with vertical jumping, using the methodology of computer simulation. Methods: A skeletal model that has nine rigid body segments and twenty degrees of freedom was developed. Thirty-two Hill-type lower limb muscles were attached to the model. The excitation-contraction dynamics of the contractile element, the tissues around the joints to limit the joint range of motion, as well as the foot-ground interaction were implemented. Simulations were initiated from an identical standing posture for both motions. The optimal pattern of the activation input signal was searched through numerical optimization. For the horizontal jumping, the goal was to maximize the horizontal distance traveled by the body's center of mass. For the vertical jumping, the goal was to maximize the height reached by the body's center of mass. Results: It was found that the hip joint was utilized more vigorously in the horizontal jumping than in the vertical jumping. The muscles that have a function of joint flexion such as the m. iliopsoas, m. rectus femoris and m. tibialis anterior were activated to a greater level during the countermovement in the horizontal jumping, with an effect of moving the body's center of mass in the forward direction. Muscular work was transferred to the mechanical energy of the body's center of mass more effectively in the horizontal jump, which resulted in a greater energy gain of the body's center of mass throughout the motion. Conclusion: These differences in the optimal coordination strategy seem to be caused by the requirement that the body's center of mass needs to be located above the feet in a vertical jumping, whereas this requirement is not so strict in a horizontal jumping.

  12. Economic Load Dispatch - A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, KY

    2009-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever...... and the Hellenic bulk power system, and is compared with other state-of-the-art heuristic optimization techniques (HOTs), demonstrating improved performance over them....
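
    For readers unfamiliar with the baseline that ICA-PSO improves upon, the sketch below is a plain particle swarm with the personal-best memory mentioned in the abstract, applied to a toy three-unit economic load dispatch. The cost coefficients, limits, demand, and penalty weight are invented for illustration; nothing here reproduces the ICA-PSO variant itself.

```python
import numpy as np

# Toy economic load dispatch: minimize total fuel cost of 3 units subject to
# meeting a fixed demand (handled here with a quadratic penalty). The quadratic
# cost coefficients, limits, and demand are illustrative, not from the paper.
a = np.array([0.008, 0.010, 0.012])   # $/MW^2
b = np.array([7.0, 6.5, 8.0])         # $/MW
c = np.array([200.0, 180.0, 140.0])   # $
p_min, p_max = np.array([50.0, 40.0, 30.0]), np.array([300.0, 250.0, 150.0])
demand = 450.0

def cost(p):
    fuel = np.sum(a * p**2 + b * p + c, axis=-1)
    balance = (np.sum(p, axis=-1) - demand) ** 2
    return fuel + 1e3 * balance       # penalize power-balance violation

rng = np.random.default_rng(0)
n_particles, n_iter = 30, 200
x = rng.uniform(p_min, p_max, size=(n_particles, 3))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), cost(x)  # each particle remembers its best position so far
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, p_min, p_max)
    val = cost(x)
    better = val < pbest_val
    pbest[better], pbest_val[better] = x[better], val[better]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest, cost(gbest))
```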

  13. Optimal use of visual information in adolescents and young adults with developmental coordination disorder.

    Science.gov (United States)

    de Oliveira, Rita F; Billington, Jac; Wann, John P

    2014-09-01

    Recent reports offer contrasting views on whether or not the use of online visual control is impaired in individuals with developmental coordination disorder (DCD). This study explored the optimal temporal basis for processing and using visual information in adolescents and young adults with DCD. Participants were 22 adolescents and young adults (12 males and 10 females; M = 19 years, SD = 3). Half had been diagnosed with DCD as children and still performed poorly on the movement assessment battery for children (DCD group; n = 11), and half reported typical development (TD group; n = 11) and were age- and gender-matched with the DCD group. We used performance on a steering task as a measure of information processing and examined the use of advance visual information. The conditions varied the duration of advance visual information: 125, 250, 500, 750, and 1,000 ms. With increased duration of advance visual information, the TD group showed a pattern of linear improvement. For the DCD group, however, the pattern was best described by a U-curve where optimal performance occurred with about 750 ms of advance information. The results suggest that the DCD group has an underlying preference for immediate online processing of visual information. The exact timing for optimal online control may depend crucially on the task, but too much advance information is detrimental to performance.

  14. Novel Handover Optimization with a Coordinated Contiguous Carrier Aggregation Deployment Scenario in LTE-Advanced Systems

    Directory of Open Access Journals (Sweden)

    Ibraheem Shayea

    2016-01-01

    Full Text Available The carrier aggregation (CA) technique and Handover Parameters Optimization (HPO) function have been introduced in LTE-Advanced systems to enhance system performance in terms of throughput, coverage area, and connection stability and to reduce management complexity. Although LTE-Advanced has benefited from the CA technique, the low spectral efficiency and high ping-pong effect with high outage probabilities in conventional Carrier Aggregation Deployment Scenarios (CADSs) have become major challenges for cell-edge User Equipment (UE). Also, the existing HPO algorithms are not optimal for selecting the appropriate handover control parameters (HCPs). This paper proposes two solutions by deploying a Coordinated Contiguous-CADS (CC-CADS) and a Novel Handover Parameters Optimization algorithm that is based on the Weight Performance Function (NHPO-WPF). The CC-CADS uses two contiguous component carriers (CCs) that have two different beam directions. The NHPO-WPF automatically adjusts the HCPs based on the Weight Performance Function (WPF), which is evaluated as a function of the Signal-to-Interference Noise Ratio (SINR), cell load, and UE’s velocity. Simulation results show that the CC-CADS and the NHPO-WPF algorithm provide significant enhancements in system performance over that of conventional CADSs and HPO algorithms from the literature, respectively. The integration of both solutions achieves even better performance than scenarios in which each solution is considered independently.

  15. Path optimization by a variational reaction coordinate method. I. Development of formalism and algorithms.

    Science.gov (United States)

    Birkholz, Adam B; Schlegel, H Bernhard

    2015-12-28

    The development of algorithms to optimize reaction pathways between reactants and products is an active area of study. Existing algorithms typically describe the path as a discrete series of images (chain of states) which are moved downhill toward the path, using various reparameterization schemes, constraints, or fictitious forces to maintain a uniform description of the reaction path. The Variational Reaction Coordinate (VRC) method is a novel approach that finds the reaction path by minimizing the variational reaction energy (VRE) of Quapp and Bofill. The VRE is the line integral of the gradient norm along a path between reactants and products and minimization of VRE has been shown to yield the steepest descent reaction path. In the VRC method, we represent the reaction path by a linear expansion in a set of continuous basis functions and find the optimized path by minimizing the VRE with respect to the linear expansion coefficients. Improved convergence is obtained by applying constraints to the spacing of the basis functions and coupling the minimization of the VRE to the minimization of one or more points along the path that correspond to intermediates and transition states. The VRC method is demonstrated by optimizing the reaction path for the Müller-Brown surface and by finding a reaction path passing through 5 transition states and 4 intermediates for a 10 atom Lennard-Jones cluster.
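
    In compact form, the objective sketched in the abstract can be written as follows (notation mine; the basis functions are taken to vanish at the endpoints so that the reactant and product geometries stay fixed):

```latex
% Variational reaction energy (VRE) of a path x(s) from reactant x_R (s=0) to
% product x_P (s=1): the line integral of the gradient norm along the path,
% with the path expanded linearly in a set of continuous basis functions.
\begin{equation}
  \mathrm{VRE}[\mathbf{x}] \;=\; \int_{0}^{1}
      \bigl\lVert \nabla V\!\bigl(\mathbf{x}(s)\bigr) \bigr\rVert \,
      \bigl\lVert \mathbf{x}'(s) \bigr\rVert \,\mathrm{d}s ,
  \qquad
  \mathbf{x}(s) \;=\; \mathbf{x}_R + (\mathbf{x}_P - \mathbf{x}_R)\,s
      + \sum_{k} \mathbf{c}_k\,\phi_k(s) .
\end{equation}
```

    The VRC method then minimizes VRE with respect to the expansion coefficients c_k, with the constraints on basis-function spacing and the coupled minimization of intermediate and transition-state points described above used to stabilize convergence.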

  16. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    Full Text Available Abstract The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world implement manual methods to measure power consumption for further assessment of voltage violations. Such a process has proved to be time-consuming, costly, and inaccurate. Also, demand response is a grid management technique whereby retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would occur when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. The DPL program therefore considers all the IEEE 30-bus internal network data, and a load flow simulation is then executed to add the new load to the first bus in the network. The developed model subsequently simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under voltage and the loading of elements across the grid.

  17. Optimized Multi Agent Coordination using Evolutionary Algorithm: Special Impact in Online Education

    Directory of Open Access Journals (Sweden)

    Subrat P Pattanayak

    2012-08-01

    Full Text Available Intelligent multi-agent systems are a contemporary direction of artificial intelligence that is being built up as a result of research in information processing, distributed systems, and network technologies for problem solving. Multi-agent coordination is a vital area where agents coordinate among themselves to achieve a particular goal, which either cannot be solved by a single agent or is not time-effective for a single agent. The agent's role in the education field is rapidly increasing. Information retrieval, student information processing systems, learning information systems, and pedagogical agents are varied work done by different agent technologies. Novice users specifically are the most useful learners of an e-tutoring system. A multi-agent system plays a vital role in this type of e-tutoring system. Online education is an emerging field in the education system. To improve the interaction between learners and tutors with personalized communication, we propose an Optimized Multi Agent System (OMAS) by which a learner can get sufficient information to achieve their objective. This conceptual framework is based on the idea that adaptiveness is the best match between a particular learner's profile and its course contents. We also try to optimize the procedure using an evolutionary process so that the style of the learner and the learning methods with respect to the learner are matched with a high fitness value. Agent technology has been applied in varied types of applications for education, but this system may work as a user-friendly conceptual system which can be integrated with any e-learning software. Use of a GUI user interface can make the system more enriched. When a particular request comes from the learner, the agents coordinate themselves to get the best possible solution. The solution can be presented in an animated way in front of the learner, so that novice users who are new to the system can adopt it very easily and with ease.

  18. Understanding Innovation Engines: Automated Creativity and Improved Stochastic Optimization via Deep Learning.

    Science.gov (United States)

    Nguyen, A; Yosinski, J; Clune, J

    2016-01-01

    The Achilles Heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search mitigates this problem by encouraging exploration in all interesting directions by replacing the performance objective with a reward for novel behaviors. This reward for novel behaviors has traditionally required a human-crafted, behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and heroes instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, wherein novelty means interesting novelty. For example, a DNN-based novelty search in the image space does not explore in the low-level pixel space, but instead creates a pressure to create new types of images (e.g., churches, mosques, obelisks, etc.). Here, we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm's key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: for example, producing intelligent software, robot controllers, optimized physical components, and art.
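
    As a concrete anchor for the idea of rewarding "interesting novelty", the sketch below shows the standard novelty score used in Novelty Search style algorithms: the mean distance to the k nearest behavior descriptors in an archive, with the descriptor assumed to come from a DNN feature extractor as the Innovation Engine proposes. The demo data are random placeholders, not the authors' implementation.

```python
import numpy as np

def novelty_score(descriptor, archive, k=15):
    """Mean distance to the k nearest archived behavior descriptors.

    `descriptor` is a 1-D feature vector (for the Innovation Engine idea this
    would be, e.g., hidden-layer activations of a DNN applied to a candidate
    image); `archive` is an (N, D) array of descriptors seen so far.
    Larger scores mean more novel behavior.
    """
    if len(archive) == 0:
        return float("inf")
    d = np.linalg.norm(archive - descriptor, axis=1)
    k = min(k, d.size)
    return float(np.sort(d)[:k].mean())

# Tiny demo with random 8-D descriptors standing in for DNN features.
rng = np.random.default_rng(0)
archive = rng.normal(size=(100, 8))
candidate = rng.normal(size=8)
print(novelty_score(candidate, archive))
```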

  19. Dynamic strategy based fast decomposed GA coordinated with FACTS devices to enhance the optimal power flow

    Energy Technology Data Exchange (ETDEWEB)

    Mahdad, Belkacem, E-mail: bemahdad@yahoo.f [University of Biskra, Department of Electrical Engineering, Biskra 07000 (Algeria); Bouktir, T. [Oum El Bouaghi, Department of Electrical Engineering, Oum El Bouaghi 04000 (Algeria); Srairi, K. [University of Biskra, Department of Electrical Engineering, Biskra 07000 (Algeria); EL Benbouzid, M. [Laboratoire Brestois de Mecanique et des Systemes, University of Brest (France)

    2010-07-15

    Under critical situations the main preoccupation of expert engineers is to assure power system security and to deliver power to the consumer within the desired power quality index; the total generation cost is taken as a secondary objective. This paper presents an efficient decomposed GA to enhance the solution of the optimal power flow (OPF) with non-smooth cost functions and under severe loading conditions. At the decomposition stage the length of the original chromosome is reduced successively and adapted to the topology of the new partition. Two sub-problems are proposed to coordinate the OPF problem under different loading conditions: the first sub-problem relates to active power planning under different loading factors to minimize the total fuel cost, and the second sub-problem is a reactive power planning problem designed on the basis of practical rules to make fine corrections to voltage deviations and reactive power violations using a specified number of dynamic shunt compensators named Static Var Compensators (SVCs). To validate the robustness of the proposed approach, the algorithm was tested on the IEEE 30-bus, 26-bus, and IEEE 118-bus systems under different loading conditions and compared with global optimization methods (GA, EGA, FGA, PSO, MTS, MDE and ACO) and with two robust simulation packages: PSAT and MATPOWER. The results show that the proposed approach can converge to near-optimal solutions and obtain competitive solutions in critical situations within a reasonable time.

  20. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  1. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources

    Directory of Open Access Journals (Sweden)

    Luis eMarenco

    2014-05-01

    Full Text Available This paper describes how DISCO, the data-aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  2. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.

  3. Paramfit: automated optimization of force field parameters for molecular dynamics simulations.

    Science.gov (United States)

    Betz, Robin M; Walker, Ross C

    2015-01-15

    The generation of bond, angle, and torsion parameters for classical molecular dynamics force fields typically requires fitting parameters such that classical properties such as energies and gradients match precalculated quantum data for structures that scan the value of interest. We present a program, Paramfit, distributed as part of the AmberTools software package that automates and extends this fitting process, allowing for simplified parameter generation for applications ranging from single molecules to entire force fields. Paramfit implements a novel combination of a genetic and simplex algorithm to find the optimal set of parameters that replicate either quantum energy or force data. The program allows for the derivation of multiple parameters simultaneously using significantly fewer quantum calculations than previous methods, and can also fit parameters across multiple molecules with applications to force field development. Paramfit has been applied successfully to systems with a sparse number of structures, and has already proven crucial in the development of the Assisted Model Building with Energy Refinement Lipid14 force field.
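
    Paramfit itself combines a genetic algorithm with a simplex search; as a minimal illustration of the underlying fitting problem, the sketch below matches an AMBER-style torsion term to a synthetic "quantum" dihedral scan with a Nelder-Mead (simplex) minimizer. The functional form, data, and parameter names are illustrative assumptions, not Paramfit's actual interface.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative dihedral scan: angles (radians) and synthetic "quantum" reference
# energies (kcal/mol). Real data would come from QM single points on the scanned
# structures, as described in the abstract.
phi = np.linspace(-np.pi, np.pi, 24)
e_qm = 1.8 * (1 + np.cos(3 * phi)) + 0.4 * (1 + np.cos(phi - 0.2)) \
       + np.random.default_rng(1).normal(0.0, 0.05, phi.size)

def torsion_energy(params, phi):
    # Classical AMBER-style torsion term: sum_n V_n/2 * (1 + cos(n*phi - gamma_n))
    v1, g1, v3, g3 = params
    return 0.5 * v1 * (1 + np.cos(phi - g1)) + 0.5 * v3 * (1 + np.cos(3 * phi - g3))

def objective(params):
    e_mm = torsion_energy(params, phi)
    # Fit relative energies: shift both profiles to a common reference.
    return np.sum(((e_mm - e_mm.min()) - (e_qm - e_qm.min())) ** 2)

result = minimize(objective, x0=[1.0, 0.0, 1.0, 0.0], method="Nelder-Mead")
print(result.x, result.fun)
```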

  4. On the implementation of an automated acoustic output optimization algorithm for subharmonic aided pressure estimation.

    Science.gov (United States)

    Dave, J K; Halldorsdottir, V G; Eisenbrey, J R; Merton, D A; Liu, J B; Machado, P; Zhao, H; Park, S; Dianis, S; Chalek, C L; Thomenius, K E; Brown, D B; Forsberg, F

    2013-04-01

    Incident acoustic output (IAO) dependent subharmonic signal amplitudes from ultrasound contrast agents can be categorized into occurrence, growth or saturation stages. Subharmonic aided pressure estimation (SHAPE) is a technique that utilizes growth stage subharmonic signal amplitudes for hydrostatic pressure estimation. In this study, we developed an automated IAO optimization algorithm to identify the IAO level eliciting growth stage subharmonic signals and also studied the effect of pulse length on SHAPE. This approach may help eliminate the problems of acquiring and analyzing the data offline at all IAO levels as was done in previous studies and thus, pave the way for real-time clinical pressure monitoring applications. The IAO optimization algorithm was implemented on a Logiq 9 (GE Healthcare, Milwaukee, WI) scanner interfaced with a computer. The optimization algorithm stepped the ultrasound scanner from 0% to 100% IAO. A logistic equation fitting function was applied with the criterion of minimum least squared error between the fitted subharmonic amplitudes and the measured subharmonic amplitudes as a function of the IAO levels and the optimum IAO level was chosen corresponding to the inflection point calculated from the fitted data. The efficacy of the optimum IAO level was investigated for in vivo SHAPE to monitor portal vein (PV) pressures in 5 canines and was compared with the performance of IAO levels, below and above the optimum IAO level, for 4, 8 and 16 transmit cycles. The canines received a continuous infusion of Sonazoid microbubbles (1.5 μl/kg/min; GE Healthcare, Oslo, Norway). PV pressures were obtained using a surgically introduced pressure catheter (Millar Instruments, Inc., Houston, TX) and were recorded before and after increasing PV pressures. The experiments showed that optimum IAO levels for SHAPE in the canines ranged from 6% to 40%. The best correlation between changes in PV pressures and in subharmonic amplitudes (r=-0.76; p=0
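
    A minimal sketch of the curve-fitting step described above, assuming a four-parameter symmetric logistic (the paper's exact logistic form is not given in the abstract): the fitted midpoint x0 is the inflection point and is taken as the optimum IAO.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative data: subharmonic amplitude (dB) vs incident acoustic output (% IAO).
iao = np.arange(0, 101, 5, dtype=float)
amp = 28.0 / (1 + np.exp(-0.12 * (iao - 35.0))) + 2.0 \
      + np.random.default_rng(2).normal(0.0, 0.6, iao.size)

def logistic(x, a, k, x0, y0):
    return a / (1.0 + np.exp(-k * (x - x0))) + y0

popt, _ = curve_fit(logistic, iao, amp, p0=[np.ptp(amp), 0.1, 50.0, amp.min()])
a, k, x0, y0 = popt
optimum_iao = x0  # inflection point of the symmetric logistic lies at x = x0
print(f"optimum IAO ~ {optimum_iao:.1f} % (growth-stage midpoint)")
```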

  5. Optimal voltage control in distribution systems with coordination of distribution installations

    Energy Technology Data Exchange (ETDEWEB)

    Oshiro, Masato; Tanaka, Kenichi; Uehara, Akie; Senjyu, Tomonobu; Miyazato, Yoshitaka; Yona, Atsushi [Faculty of Engineering, University of the Ryukyus, 1 Senbaru, Nishihara-cho, Nakagami, Okinawa 903-0213 (Japan); Funabashi, Toshihisa [Meidensha Corporation, 2-1-1, Osaki, Shinagawa-ku, Tokyo 141-6029 (Japan)

    2010-12-15

    In recent years, distributed generations based on natural energy or co-generation systems are increasing due to global warming and reduction of fossil fuels. Many of the distributed generations are set up in the vicinity of customers, with the advantage that this decreases transmission losses and transmission capacity. However, output power generated from renewable energy such as wind power and photovoltaics, is influenced by weather conditions. Therefore if the distributed generation increases with conventional control schemes, the voltage variation in a distribution system becomes a serious problem. In this paper, an optimal control method of distribution voltage with coordination of distributed installations, such as On Load Tap Changer (OLTC), Step Voltage Regulator (SVR), Shunt Capacitor (SC), Shunt Reactor (ShR), and Static Var Compensator (SVC), is proposed. In this research, the communication infrastructure is assumed to be widespread in the distribution network. The proposed technique combines genetic algorithm (GA) and Tabu search (TS) to determine the control operation. In order to confirm the validity of the proposed method, simulation results are presented for a distribution network model with distributed (photovoltaic) generation. (author)

  6. A new module for constrained multi-fragment geometry optimization in internal coordinates implemented in the MOLCAS package.

    Science.gov (United States)

    Vysotskiy, Victor P; Boström, Jonas; Veryazov, Valera

    2013-11-15

    A parallel procedure for the effective optimization of the relative position and orientation between two or more fragments has been implemented in the MOLCAS program package. By design, the procedure does not perturb the electronic structure of the system under study. The original composite system is divided into frozen fragments, and the internal coordinates linking those fragments are the only optimized parameters. The procedure is capable of handling fully independent fragments (no border atoms) as well as fragments connected by covalent bonds. In the framework of the procedure, the optimization of the relative position and orientation of the fragments is carried out in internal "Z-matrix" coordinates using numerical derivatives. The total number of required single-point energy evaluations scales with the number of fragments rather than with the total number of atoms in the system. The accuracy and the performance of the procedure have been studied by test calculations for a representative set of two- and three-fragment molecules with artificially distorted structures. The developed approach exhibits robust and smooth convergence to the reference optimal structures. As only a few internal coordinates are varied during the procedure, the proposed constrained fragment geometry optimization can be afforded even for high-level ab initio methods like CCSD(T) and CASPT2. This capability has been demonstrated by applying the method to two larger cases: CCSD(T) and CASPT2 calculations on a positively charged benzene-lithium complex and on an oxygen molecule interacting with an iron porphyrin molecule, respectively.

  7. Electric Vehicle Charging and Discharging Coordination on Distribution Network Using Multi-Objective Particle Swarm Optimization and Fuzzy Decision Making

    Directory of Open Access Journals (Sweden)

    Dongqi Liu

    2016-03-01

    Full Text Available This paper proposes an optimal strategy for the coordinated operation of electric vehicle (EV) charging and discharging with a wind-thermal system. By aggregating a large number of EVs, the huge total battery capacity is sufficient to stabilize disturbances of the transmission grid. Hence, a dynamic environmental dispatch model which coordinates a cluster of charging- and discharging-controllable EV units with wind farms and thermal plants is proposed. A multi-objective particle swarm optimization (MOPSO) algorithm and a fuzzy decision maker are put forward for the simultaneous optimization of grid operating cost, CO2 emissions, wind curtailment, and EV users’ cost. Simulations are done in a 30-node system containing three traditional thermal plants, two carbon capture and storage (CCS) thermal plants, two wind farms, and six EV aggregations. A contrast of strategies under different EV charging/discharging prices is also discussed. The results are presented to prove the effectiveness of the proposed strategy.
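
    The abstract does not spell out the fuzzy decision maker; a membership function commonly used to pick a best-compromise solution from a MOPSO Pareto set (a reasonable guess at what is meant here, not a formulation taken from the paper) is:

```latex
% Common fuzzy best-compromise selection over a Pareto set (illustrative).
% f_i^k: value of objective i (cost, CO2, wind curtailment, EV users' cost)
% for Pareto solution k; f_i^min, f_i^max: extremes over the Pareto set.
\begin{equation}
  \mu_i^{k} =
  \begin{cases}
    1, & f_i^{k} \le f_i^{\min},\\[2pt]
    \dfrac{f_i^{\max} - f_i^{k}}{f_i^{\max} - f_i^{\min}}, & f_i^{\min} < f_i^{k} < f_i^{\max},\\[6pt]
    0, & f_i^{k} \ge f_i^{\max},
  \end{cases}
  \qquad
  \mu^{k} = \frac{\sum_{i} \mu_i^{k}}{\sum_{j} \sum_{i} \mu_i^{j}} .
\end{equation}
```

    The Pareto solution with the largest normalized membership is then reported as the compromise dispatch.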

  8. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People' s Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People' s Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and the widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  9. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.
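
    As a small illustration of the candidate-network step described above (not the SimCCS code itself, and without the cost-weighted routing or the extra triangles for non-unique configurations), a Delaunay triangulation over hypothetical source/sink coordinates can be turned into candidate pipeline corridors like this:

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical CO2 source and sink locations (projected x, y in km).
points = np.array([
    [0.0, 0.0], [42.0, 7.0], [18.0, 30.0], [60.0, 25.0],
    [35.0, 55.0], [80.0, 48.0], [10.0, 70.0],
])

tri = Delaunay(points)

# Collect the unique edges of the triangulation as candidate pipeline corridors.
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))

for a, b in sorted(edges):
    length = np.linalg.norm(points[a] - points[b])
    print(f"candidate corridor {a}-{b}: {length:5.1f} km")
```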

  10. Automation of sample preparation for mass cytometry barcoding in support of clinical research: protocol optimization.

    Science.gov (United States)

    Nassar, Ala F; Wisnewski, Adam V; Raddassi, Khadir

    2017-03-01

    Analysis of multiplexed assays is highly important for clinical diagnostics and other analytical applications. Mass cytometry enables multi-dimensional, single-cell analysis of cell type and state. In mass cytometry, the rare earth metals used as reporters on antibodies allow determination of marker expression in individual cells. Barcode-based bioassays for CyTOF are able to encode and decode for different experimental conditions or samples within the same experiment, facilitating progress in producing straightforward and consistent results. Herein, an integrated protocol for automated sample preparation for barcoding used in conjunction with mass cytometry for clinical bioanalysis samples is described; we offer results of our work with barcoding protocol optimization. In addition, we present some points to be considered in order to minimize the variability of quantitative mass cytometry measurements. For example, we discuss the importance of having multiple populations during titration of the antibodies and effect of storage and shipping of labelled samples on the stability of staining for purposes of CyTOF analysis. Data quality is not affected when labelled samples are stored either frozen or at 4 °C and used within 10 days; we observed that cell loss is greater if cells are washed with deionized water prior to shipment or are shipped in lower concentration. Once the labelled samples for CyTOF are suspended in deionized water, the analysis should be performed expeditiously, preferably within the first hour. Damage can be minimized if the cells are resuspended in phosphate-buffered saline (PBS) rather than deionized water while waiting for data acquisition.

  11. An automated system for quantitative analysis of newborns' oral-motor behavior and coordination during bottle feeding.

    Science.gov (United States)

    Tamilia, Eleonora; Formica, Domenico; Visco, Anna Maria; Scaini, Alberto; Taffoni, Fabrizio

    2015-01-01

    In this work a novel unobtrusive technology-aided system is presented and tested for the assessment of newborns' oral-motor behavior and coordination during bottle feeding. A low-cost monitoring device was designed and developed in order to record Suction (S) and Expression (E) pressures from a typical feeding bottle. A software system was developed to automatically treat the data and analyze them. A set of measures of motor control and coordination has been implemented for the specific application to the analysis of sucking behavior. Experimental data were collected with the developed system on two groups of newborns (Healthy vs. Low Birth Weight) in a clinical setting. We identified the most sensitive S features to group differences, and analyzed their correlation with S/E coordination measures. Then, Principal Component Analysis (PCA) was used to explore the system suitability to automatically identify peculiar oral behaviors. Results suggest the suitability of the proposed system to perform an objective technology-aided assessment of the newborn's oral-motor behavior and coordination during the first days of life.
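
    A minimal sketch of the PCA step on a feature matrix of sucking measures (rows are feeding sessions, columns are hypothetical suction/expression features; the actual feature set is the one extracted by the authors' software, not the random placeholders used here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per feeding session, columns are summary
# measures of the suction (S) and expression (E) pressure signals (e.g., S
# amplitude, S frequency, S/E lag). Values here are random placeholders.
rng = np.random.default_rng(3)
features = rng.normal(size=(20, 6))

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(features))
print(scores[:5])  # low-dimensional coordinates used to look for atypical oral behaviors
```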

  12. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    Science.gov (United States)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
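
    For reference, a standard central-difference finite-field (FF) estimate of the static first hyperpolarizability from field-dependent energies is shown below; the exact stencil used inside ELG-FF may differ, and the trade-off between small field strengths and numerical noise in such formulas is the FF limitation the abstract refers to for the second hyperpolarizability.

```latex
% Standard central-difference finite-field estimate of beta_zzz from energies
% computed at a few uniform field strengths F along z (illustrative stencil).
% Expansion: E(F) = E_0 - mu_z F - (1/2) alpha_zz F^2 - (1/6) beta_zzz F^3 - ...
\begin{equation}
  \beta_{zzz} \;\approx\; \frac{2E(F) - 2E(-F) + E(-2F) - E(2F)}{2F^{3}} .
\end{equation}
```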

  13. Optimal Attitude Estimation and Filtering Without Using Local Coordinates Part I: Uncontrolled and Deterministic Attitude Dynamics

    OpenAIRE

    Sanyal, Amit K.

    2005-01-01

    There are several attitude estimation algorithms in existence, all of which use local coordinate representations for the group of rigid body orientations. All local coordinate representations of the group of orientations have associated problems. While minimal coordinate representations exhibit kinematic singularities for large rotations, the quaternion representation requires satisfaction of an extra constraint. This paper treats the attitude estimation and filtering problem as an optimizati...

  14. Vibrational quasi-degenerate perturbation theory with optimized coordinates: applications to ethylene and trans-1,3-butadiene.

    Science.gov (United States)

    Yagi, Kiyoshi; Otaki, Hiroki

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O-H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P space configuration (p) and the complementary Q space configuration (q) based on a difference in their quantum numbers (λpq = ∑s|ps - qs|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix from others that may remain unchanged based on the magnitude of harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields the coordinates similar in character to the conventional ones such that the final vibrational energy is affected very little while gaining an order of magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and coupled-cluster with singles

  15. Optimal Ordering Policy and Coordination Mechanism of a Supply Chain with Controllable Lead-Time-Dependent Demand Forecast

    Directory of Open Access Journals (Sweden)

    Hua-Ming Song

    2011-01-01

    Full Text Available This paper investigates the ordering decisions and coordination mechanism for a distributed short-life-cycle supply chain. The objective is to maximize the whole supply chain's expected profit and, at the same time, make the supply chain participants achieve a Pareto improvement. We treat lead time as a controllable variable, so the demand forecast depends on lead time: the shorter the lead time, the better the forecast. Moreover, optimal decision-making models for lead time and order quantity are formulated and compared in the decentralized and centralized cases. Besides, a three-parameter contract is proposed to coordinate the supply chain and alleviate double marginalization in the decentralized scenario. In addition, based on the analysis of the models, we develop an algorithmic procedure to find the optimal ordering decisions. Finally, a numerical example is also presented to illustrate the results.

  16. Optimization of RNA Purification and Analysis for Automated, Pre-Symptomatic Disease Diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, A; Nasarabadi, S; Milanovich, F

    2005-06-28

    When diagnosing disease, time is often a more formidable enemy than the pathogen itself. Current detection methods rely primarily on post-symptomatic protein production (i.e. antibodies), which does not occur in noticeable levels until several weeks after infection. As such, a major goal among researchers today is to expedite pre-symptomatic disease recognition and treatment. Since most pathogens are known to leave a unique signature on the genetic expression of the host, one potential diagnostic tool is host mRNA. In my experiments, I examined several methods of isolating RNA and reading its genetic sequence. I first used two types of reverse transcriptase polymerase chain reactions (using commercial RNA) and examined the resultant complementary DNA through gel electrophoresis. I then proceeded to isolate and purify whole RNA from actual human monocytes and THP-1 cells using several published methods, and examined gene expression on the RNA itself. I compared the two RT-PCR methods and concluded that a double step RT-PCR is superior to the single step method. I also compared the various techniques of RNA isolation by examining the yield and purity of the resultant RNA. Finally, I studied the level of cellular IL-8 and IL-1 gene expression, two genes involved in the human immune response, which can serve as a baseline for future genetic comparison with LPS-exposed cells. Based on the results, I have determined which conditions and procedures are optimal for RNA isolation, RT-PCR, and RNA yield assessment. The overall goal of my research is to develop a flow-through system of RNA analysis, whereby blood samples can be collected and analyzed for disease prior to the onset of symptoms. The Pathomics group hopes to automate this process by removing the human labor factor, thereby decreasing the procedure's cost and increasing its availability to the general population. Eventually, our aim is to have an autonomous diagnostic system based on RNA analysis that would

  17. Distributed Learning, Extremum Seeking, and Model-Free Optimization for the Resilient Coordination of Multi-Agent Adversarial Groups

    Science.gov (United States)

    2016-09-07

    AFRL-AFOSR-VA-TR-2016-0314: Distributed Learning, Extremum Seeking, and Model-Free Optimization for the Resilient Coordination of Multi-Agent Adversarial Groups.

  18. Optimizing lighting, thermal performance, and energy production of building facades by using automated blinds and PV cells

    Science.gov (United States)

    Alzoubi, Hussain Hendi

    Energy consumption in buildings has recently become a major concern for environmental designers. Within this field, daylighting and solar energy design are attractive strategies for saving energy. This study seeks the integrity and the optimality of building envelopes' performance. It focuses on the transparent parts of building facades, specifically, the windows and their shading devices. It suggests a new automated method of utilizing solar energy while keeping optimal solutions for indoor daylighting. The method utilizes a statistical approach to produce mathematical equations based on physical experimentation. A full-scale mock-up representing an actual office was built. Heat gain and lighting levels were measured empirically and correlated with blind angles. Computational methods were used to estimate the power production from photovoltaic cells. Mathematical formulas were derived from the results of the experiments; these formulas were utilized to construct curves as well as mathematical equations for the purpose of optimization. The mathematical equations resulting from the optimization process were coded using Java programming language to enable future users to deal with generic locations of buildings with a broader context of various climatic conditions. For the purpose of optimization by automation under different climatic conditions, a blind control system was developed based on the findings of this study. This system calibrates the blind angles instantaneously based upon the sun position, the indoor daylight, and the power production from the photovoltaic cells. The functions of this system guarantee full control of the projected solar energy on buildings' facades for indoor lighting and heat gain. In winter, the system automatically blows heat into the space, whereas it expels heat from the space during the summer season. The study showed that the optimality of building facades' performance is achievable for integrated thermal, energy, and lighting

  19. Optimal Coordinated Management of a Plug-In Electric Vehicle Charging Station under a Flexible Penalty Contract for Voltage Security

    Directory of Open Access Journals (Sweden)

    Jip Kim

    2016-07-01

    Full Text Available The increasing penetration of plug-in electric vehicles (PEVs) may cause a low-voltage problem in the distribution network. In particular, the introduction of charging stations where multiple PEVs are simultaneously charged at the same bus can aggravate the low-voltage problem. Unlike a distribution network operator (DNO), who has the overall responsibility for stable and reliable network operation, a charging station operator (CSO) may schedule PEV charging without consideration for the resulting severe voltage drop. Therefore, there is a need for the DNO to impose a coordination measure to induce the CSO to adjust its charging schedule to help mitigate the voltage problem. Although the current time-of-use (TOU) tariff is an indirect coordination measure that can motivate the CSO to shift its charging demand to off-peak time by imposing a high rate at the peak time, it is limited by its rigidity in that the network voltage condition cannot be flexibly reflected in the tariff. Therefore, a flexible penalty contract (FPC) for voltage security to be used as a direct coordination measure is proposed. In addition, the optimal coordinated management is formulated. Using the Pacific Gas and Electric Company (PG&E) 69-bus test distribution network, the effectiveness of the coordination was verified by comparison with the current TOU tariff.

  20. Bottom friction optimization for barotropic tide modelling using the HYbrid Coordinate Ocean Model

    Science.gov (United States)

    Boutet, Martial; Lathuilière, Cyril; Baraille, Rémy; Son Hoang, Hong; Morel, Yves

    2014-05-01

    We can list several ways to improve tide modelling at a regional or coastal scale: a more precise and refined bathymetry, better boundary conditions (the way they are implemented and the precision of the global tide atlases used) and the representation of the dissipation linked to bottom friction. Nevertheless, the most promising improvement is the bottom friction representation. Indeed, bathymetric databases, especially in coastal areas, are more and more precise, and global tide models perform better than ever (the mean discrepancy between models and tide gauges is about 1 cm for the M2 tide). Bottom friction is often parameterized with a quadratic term and a constant coefficient generally taken between 2.5 × 10⁻³ and 3.0 × 10⁻³. Consequently, we need a more physically consistent approach to improve bottom friction in coastal areas. The first improvement is to enable the computation of a time- and space-dependent friction coefficient. It is obtained by vertical integration of a turbulent horizontal velocity profile. The new parameter to be prescribed for the computation is the bottom roughness, z0, which depends on a wide range of physical properties and processes (sediment properties, existence of ripples and dunes, wave-current interactions, ...). The context of increasing computing resources and data availability makes it possible to use new methods of data assimilation and optimization. The method used for this study is the simultaneous perturbation stochastic approximation (SPSA), which consists of approximating the gradient from a fixed number of cost function measurements, regardless of the dimension of the vector to be estimated. Indeed, each cost function measurement is obtained by randomly perturbing every component of the parameter vector. An important feature of SPSA is its relative ease of implementation. In particular, the method does not require the development of linear and adjoint versions of the circulation model. The algorithm is
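
    A generic SPSA loop of the kind described above is sketched below; in the actual application the cost function would run the HYCOM configuration and measure the misfit between modelled and observed tidal harmonics as a function of the bottom-roughness field z0, whereas here a toy quadratic misfit stands in.

```python
import numpy as np

def spsa_minimize(cost, theta0, n_iter=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Simultaneous perturbation stochastic approximation (SPSA).

    Each iteration uses only two cost evaluations, whatever the dimension of
    `theta`: all components are perturbed at once with random +/-1 signs.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(1, n_iter + 1):
        ak, ck = a / k**alpha, c / k**gamma
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # simultaneous perturbation
        g_hat = (cost(theta + ck * delta) - cost(theta - ck * delta)) / (2.0 * ck * delta)
        theta -= ak * g_hat
    return theta

# Toy misfit standing in for "modelled minus observed tide" as a function of a
# spatially varying log-roughness field (the real cost would run the ocean model).
target = np.log(np.array([1e-3, 3e-4, 5e-4, 2e-3]))
cost = lambda z: float(np.sum((z - target) ** 2))
print(np.exp(spsa_minimize(cost, np.log(np.full(4, 1e-3)))))
```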

  1. Coordination Strategy for One to Many Automated Negotiation Based on Fuzzy Theory

    Institute of Scientific and Technical Information of China (English)

    武玉英; 李赟

    2011-01-01

    Because negotiators have to wait for each other in the course of negotiation, one-to-many automated negotiation is inefficient. In order to reach the best agreement in the least time for the negotiators, this paper proposes a coordination strategy based on fuzzy theory which makes the negotiation process continuous. Taking advantage of a coordinating Agent, this strategy can create and withdraw negotiation threads flexibly and, in the course of negotiation, it continually updates the beliefs of the negotiators; the strategy therefore supports continuous negotiation in open and dynamic environments and substantially improves negotiation efficiency. Experimental results show that the strategy optimizes both negotiation utility and negotiation time, demonstrating its effectiveness and practicality.

  2. Optimized and Automated Radiosynthesis of [18F]DHMT for Translational Imaging of Reactive Oxygen Species with Positron Emission Tomography

    Directory of Open Access Journals (Sweden)

    Wenjie Zhang

    2016-12-01

    Full Text Available Reactive oxygen species (ROS) play important roles in cell signaling and homeostasis. However, an abnormally high level of ROS is toxic, and is implicated in a number of diseases. Positron emission tomography (PET) imaging of ROS can assist in the detection of these diseases. For the purpose of clinical translation of [18F]6-(4-((1-(2-fluoroethyl)-1H-1,2,3-triazol-4-yl)methoxy)phenyl)-5-methyl-5,6-dihydrophenanthridine-3,8-diamine ([18F]DHMT), a promising ROS PET radiotracer, we first manually optimized the large-scale radiosynthesis conditions and then implemented them in an automated synthesis module. Our manual synthesis procedure afforded [18F]DHMT in 120 min with an overall radiochemical yield (RCY) of 31.6% ± 9.3% (n = 2, decay-uncorrected) and a specific activity of 426 ± 272 GBq/µmol (n = 2). Fully automated radiosynthesis of [18F]DHMT was achieved within 77 min with an overall isolated RCY of 6.9% ± 2.8% (n = 7, decay-uncorrected) and a specific activity of 155 ± 153 GBq/µmol (n = 7) at the end of synthesis. This study is the first demonstration of producing 2-[18F]fluoroethyl azide by an automated module, which can be used for a variety of PET tracers through click chemistry. It is also the first time that [18F]DHMT was successfully tested for PET imaging in a healthy beagle dog.

  3. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases ¹⁵N–¹H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  4. Optimal spectral coordination of subantennae in natural antennae as an efficient strategy for light harvesting in photosynthesis.

    Science.gov (United States)

    Novikov, A A; Taisova, A S; Fetisova, Z G

    2006-08-01

    This work continues a series of our investigations on efficient strategies of functioning of natural light-harvesting antennae, initiated by a concept of rigorous optimization of the photosynthetic apparatus by a functional criterion, and deals with the problem of the optimal spectral coordination of subantennae in the photosynthetic superantenna of the green bacterium Oscillochloris trichoides, from the new family of green bacteria Oscillochloridaceae established in 2000. At present, two subantennae have been reliably identified: the chlorosomal BChl c subantenna B750 and the membrane BChl a subantenna B805-860. Some indirect experiments indicated the presence of minor amounts of BChl a in isolated chlorosomes, which allowed us to propose the existence of an intermediate-energy subantenna that can connect the chlorosomal BChl c and the membrane BChl a ones. However, in the absorption spectra of isolated chlorosomes, this BChl a subantenna was not visually identified. This prompted us to perform a theoretical analysis of the optimality of the spectral coordination of the Oscillochloris trichoides subantennae. Using mathematical modeling of the functioning of the natural superantenna, we showed that an intermediate-energy subantenna, connecting the B750 and B805-860 ones, allows one to control superantenna efficiency, i.e. to optimize the excitation energy transfer from B750 to B805 by the functional criterion, and hence the existence of such an intermediate-energy subantenna is biologically expedient.

  5. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
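
    The abstract names several optimization strategies (genetic algorithm, multiple hill climbing, simulated annealing) only at a high level. As a rough illustration of the kind of loop such a control unit might run against a live sensor readout, the Python sketch below implements a minimal simulated-annealing search; the objective function, parameter bounds and cooling schedule are invented placeholders, not the authors' implementation.

      import math
      import random

      def sensor_response(params):
          # Hypothetical stand-in for an optical/electrochemical sensor readout;
          # a real controller would set the perfusion parameters and measure here.
          x, y = params
          return -((x - 0.3) ** 2 + (y - 0.7) ** 2)

      def simulated_annealing(objective, bounds, t0=1.0, cooling=0.95, steps=500):
          current = [random.uniform(lo, hi) for lo, hi in bounds]
          cur_val = objective(current)
          best, best_val = current[:], cur_val
          temp = t0
          for _ in range(steps):
              # Propose a small random perturbation, clipped to the bounds.
              candidate = [min(max(v + random.gauss(0, 0.05), lo), hi)
                           for v, (lo, hi) in zip(current, bounds)]
              cand_val = objective(candidate)
              # Accept improvements always, worse moves with Boltzmann probability.
              if cand_val > cur_val or random.random() < math.exp((cand_val - cur_val) / temp):
                  current, cur_val = candidate, cand_val
                  if cur_val > best_val:
                      best, best_val = current[:], cur_val
              temp *= cooling  # geometric cooling schedule
          return best, best_val

      if __name__ == "__main__":
          print(simulated_annealing(sensor_response, bounds=[(0.0, 1.0), (0.0, 1.0)]))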

  6. Civil Engineering and Building Service Topographic Permanent Landmarks Network. Spatial Coordinate Optimization

    Directory of Open Access Journals (Sweden)

    Lepadatu Daniel

    2016-06-01

    Full Text Available Sustainable development is a modern concept of adapting conditions to achieve objectives that respond simultaneously to at least three major requirements: economic, social and environmental. Sustainable development cannot be achieved without a change in people's mentality and without communities able to use resources rationally and efficiently. To carry out the practical surveying applications of the topography course efficiently, the students designed and built a network of local permanent topographic landmarks needed for referencing the rectangular coordinates of those applications. In order to obtain more accurate values for these coordinates, we performed several types of measurements that are presented in detail in this work.

  7. Towards an Automated Pipeline for the Translation and Optimization of Geospatial Data for Virtual Environments

    Science.gov (United States)

    2008-12-01

    clearly observed in the game industry (Introversion, 2008). Currently there are many tools available to assist in automating the production of large...Maya, there is the option to embed in it more abstract-level information that can be used by the artificial intelligence (AI) or human user within...Graphics and Interactive Techniques, Melbourne, Australia, February 11–14. Introversion Software, 2008: Procedural Content Generation. http

  8. Digital Piracy: An Assessment of Consumer Piracy Risk and Optimal Supply Chain Coordination Strategies

    Science.gov (United States)

    Jeong, Bong-Keun

    2010-01-01

    Digital piracy and the emergence of new distribution channels have changed the dynamics of supply chain coordination and created many interesting problems. There has been increased attention to understanding the phenomenon of consumer piracy behavior and its impact on supply chain profitability. The purpose of this dissertation is to better…

  9. Statistical Learning in Automated Troubleshooting: Application to LTE Interference Mitigation

    CERN Document Server

    Tiwana, Moazzam Islam; Altman, Zwi

    2010-01-01

    This paper presents a method for automated healing as part of off-line automated troubleshooting. The method combines statistical learning with constraint optimization. The automated healing aims at locally optimizing radio resource management (RRM) or system parameters of cells with poor performance in an iterative manner. The statistical learning processes the data using Logistic Regression (LR) to extract closed form (functional) relations between Key Performance Indicators (KPIs) and Radio Resource Management (RRM) parameters. These functional relations are then processed by an optimization engine which proposes new parameter values. The advantage of the proposed formulation is the small number of iterations required by the automated healing method to converge, making it suitable for off-line implementation. The proposed method is applied to heal an Inter-Cell Interference Coordination (ICIC) process in a 3G Long Term Evolution (LTE) network which is based on soft-frequency reuse scheme. Numerical simulat...
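
    As a toy illustration of the two-step idea described above (statistical learning followed by optimization), the sketch below fits a logistic regression linking a single RRM parameter to a binary KPI outcome and then proposes the parameter value with the highest predicted success probability. The parameter name, ranges and data are invented for illustration and are not taken from the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical history: values of one RRM parameter (e.g. an ICIC power offset, dB)
      # and whether the KPI target (e.g. a cell-edge throughput threshold) was met.
      rng = np.random.default_rng(0)
      offsets = rng.uniform(-6, 6, size=200).reshape(-1, 1)
      kpi_ok = (offsets.ravel() + rng.normal(0, 1.5, size=200) > 1.0).astype(int)

      # Statistical-learning step: closed-form (logistic) relation KPI = f(parameter).
      model = LogisticRegression().fit(offsets, kpi_ok)

      # Optimization step: propose the parameter value with the best predicted KPI,
      # restricted to the allowed parameter range.
      candidates = np.linspace(-6, 6, 121).reshape(-1, 1)
      p_success = model.predict_proba(candidates)[:, 1]
      best = candidates[np.argmax(p_success), 0]
      print(f"proposed offset: {best:.1f} dB, predicted success probability: {p_success.max():.2f}")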

  10. Optimization of explicit time-stepping algorithms and Stream-Function-Coordinate (SFC) concept for fluid dynamics problems

    Science.gov (United States)

    Huang, Chung-Yuan

    A new formulation of the stream function based on a stream function coordinate (SFC) concept for inviscid flow field calculations is presented. In addition, a new method is developed not only to accelerate, but also to stabilize the iterative schemes for steady and unsteady, linear and non-linear, scalar and system of coupled, partial differential equations. With this theory, the limitation on the time step size of an explicit scheme for solving unsteady problems and the limitation on the relaxation factors of an iterative scheme for solving steady state problems could be analytically determined. Moreover, this theory allows the determination of the optimal time steps for explicit time-stepping schemes and the optimal values of the acceleration factors for iterative schemes, if the transient behavior is immaterial.
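
    The abstract states that the theory determines the stability limit, and hence the optimal time step, of an explicit scheme analytically. As a concrete textbook illustration of such a limit (not the SFC formulation itself), the sketch below uses the classical von Neumann bound for an explicit FTCS discretization of the 1-D diffusion equation and advances the solution with the largest stable step.

      import numpy as np

      def ftcs_step(u, dt, dx, nu):
          # One explicit FTCS update of u_t = nu * u_xx with fixed end values.
          un = u.copy()
          u[1:-1] = un[1:-1] + nu * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2])
          return u

      nx, nu = 101, 0.05
      x = np.linspace(0.0, 1.0, nx)
      dx = x[1] - x[0]
      # Von Neumann stability limit for FTCS diffusion: dt <= dx**2 / (2 * nu);
      # the largest stable dt is the "optimal" explicit step in the sense above.
      dt = 0.5 * dx**2 / nu
      u = np.exp(-200.0 * (x - 0.5) ** 2)   # initial Gaussian pulse
      for _ in range(500):
          u = ftcs_step(u, dt, dx, nu)
      print("dt_max =", dt, " max(u) after 500 steps =", float(u.max()))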

  11. Optimizing object-based image analysis for semi-automated geomorphological mapping

    NARCIS (Netherlands)

    Anders, N.; Smith, M.; Seijmonsbergen, H.; Bouten, W.; Hengl, T.; Evans, I.S.; Wilson, J.P.; Gould, M.

    2011-01-01

    Object-Based Image Analysis (OBIA) is considered a useful tool for analyzing high-resolution digital terrain data. In the past, both segmentation and classification parameters were optimized manually by trial and error. We propose a method to automatically optimize classification parameters for incr

  12. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Vasanthan Maruthapillai

    Full Text Available In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (the distance from each marker to the center of the face) and change in marker distance (the change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.
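
    The feature definitions below follow the abstract (mean, variance and root mean square of the marker distances and of their changes), but the marker trajectories, emotion labels and classifier settings are synthetic placeholders used only to sketch the classification pipeline.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      def marker_features(markers, face_center):
          # markers: (n_frames, 8, 2) tracked virtual-marker positions for one clip.
          d = np.linalg.norm(markers - face_center, axis=2)           # distance to face centre
          delta = np.abs(d - d[0])                                    # change from initial positions
          feats = []
          for m in (d, delta):
              feats += [m.mean(), m.var(), np.sqrt((m ** 2).mean())]  # mean, variance, RMS
          return np.array(feats)

      rng = np.random.default_rng(1)
      emotions = ["happiness", "sadness", "anger", "fear", "disgust", "surprise"]
      X, y = [], []
      for label, _ in enumerate(emotions):
          for _ in range(20):                                    # synthetic clips per emotion
              base = rng.normal(0, 1, (1, 8, 2)) + label         # label-dependent marker layout
              clip = base + rng.normal(0, 0.1, (30, 8, 2))       # 30 frames of jittered markers
              X.append(marker_features(clip, face_center=np.zeros(2)))
              y.append(label)
      clf = KNeighborsClassifier(n_neighbors=5).fit(X[::2], y[::2])   # train on half the clips
      print("held-out accuracy:", clf.score(X[1::2], y[1::2]))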

  13. Automated Gravimetric Calibration to Optimize the Accuracy and Precision of TECAN Freedom EVO Liquid Handler.

    Science.gov (United States)

    Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique

    2016-10-01

    High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution was to split the process in three steps: (1) screening of predefined liquid class, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The run of appropriate pipetting scripts, data acquisition, and reports until the creation of a new liquid class in EVOware was fully automated. The calibration and confirmation of the robotic system was simple, efficient, and precise and could accelerate data acquisition for a wide range of biopharmaceutical applications.
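
    As a minimal sketch of the second step described above (adjusting accuracy from a gravimetric calibration curve), the code below converts balance readings to volumes via the liquid density, fits a linear calibration of dispensed versus commanded volume, and inverts the fit to obtain corrected commands. The volumes, masses and density are illustrative values, not data from the study.

      import numpy as np

      # Hypothetical gravimetric data for one liquid class: commanded volumes (µL)
      # and balance readings (mg) averaged over replicate dispenses.
      target_ul = np.array([3.0, 10.0, 50.0, 100.0, 300.0, 900.0])
      mass_mg = np.array([2.8, 9.6, 48.5, 97.0, 292.0, 880.0])
      density_mg_per_ul = 0.998                      # water at room temperature

      dispensed_ul = mass_mg / density_mg_per_ul     # gravimetric volumes
      # Calibration curve: dispensed = a * commanded + b (ordinary linear fit).
      a, b = np.polyfit(target_ul, dispensed_ul, 1)

      def corrected_command(requested_ul):
          # Invert the fit so the instrument is asked for the volume that should
          # actually deliver the requested amount.
          return (requested_ul - b) / a

      for v in (5.0, 25.0, 200.0):
          print(f"request {v} µL -> command {corrected_command(v):.2f} µL")
      # A precision (CV) check would use the replicate masses per volume, omitted here.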

  14. Automated Identification of the Heart Wall Throughout the Entire Cardiac Cycle Using Optimal Cardiac Phase for Extracted Features

    Science.gov (United States)

    Takahashi, Hiroki; Hasegawa, Hideyuki; Kanai, Hiroshi

    2011-07-01

    In most methods for evaluation of cardiac function based on echocardiography, the heart wall is currently identified manually by an operator. However, this task is very time-consuming and suffers from inter- and intraobserver variability. The present paper proposes a method that uses multiple features of ultrasonic echo signals for automated identification of the heart wall region throughout an entire cardiac cycle. In addition, the optimal cardiac phase to select a frame of interest, i.e., the frame for the initiation of tracking, was determined. The heart wall region at the frame of interest in this cardiac phase was identified by the expectation-maximization (EM) algorithm, and heart wall regions in the following frames were identified by tracking each point classified in the initial frame as the heart wall region using the phased tracking method. The results for two subjects indicate the feasibility of the proposed method in the longitudinal axis view of the heart.

  15. Kriging-Based Parameter Estimation Algorithm for Metabolic Networks Combined with Single-Dimensional Optimization and Dynamic Coordinate Perturbation.

    Science.gov (United States)

    Wang, Hong; Wang, Xicheng; Li, Zheng; Li, Keqiu

    2016-01-01

    The metabolic network model allows for an in-depth insight into the molecular mechanisms of a particular organism. Because most parameters of the metabolic network cannot be directly measured, they must be estimated using optimization algorithms. However, three characteristics of the metabolic network model, i.e., high nonlinearity, a large number of parameters, and wide variation ranges of the parameters, restrict the application of many traditional optimization algorithms. As a result, there is a growing demand for efficient optimization approaches to address this complex problem. In this paper, a Kriging-based algorithm for parameter estimation is presented for constructing metabolic networks. In the algorithm, a new infill sampling criterion, named expected improvement and mutual information (EI&MI), is adopted to improve the modeling accuracy by selecting multiple new sample points at each cycle, and a domain decomposition strategy based on principal component analysis is introduced to save computing time. Meanwhile, the convergence speed is accelerated by combining a single-dimensional optimization method with a dynamic coordinate perturbation strategy when determining the new sample points. Finally, the algorithm is applied to the arachidonic acid metabolic network to estimate its parameters. The obtained results demonstrate the effectiveness of the proposed algorithm in obtaining precise parameter values within a limited number of iterations.
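
    The EI&MI infill criterion, domain decomposition and coordinate perturbation of the paper are not reproduced here; the sketch below shows only the basic Kriging-plus-expected-improvement loop that such methods build on, with a Gaussian process surrogate from scikit-learn and a toy quadratic stand-in for the misfit between simulated and measured metabolite profiles.

      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      def misfit(theta):
          # Toy stand-in for the error between simulated and measured concentration curves.
          return np.sum((theta - np.array([0.2, 0.8])) ** 2, axis=-1)

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, (8, 2))          # initial samples of two "kinetic parameters"
      y = misfit(X)
      gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

      for _ in range(15):
          gp.fit(X, y)
          cand = rng.uniform(0, 1, (2000, 2))
          mu, sigma = gp.predict(cand, return_std=True)
          best = y.min()
          # Expected improvement for minimization; the paper augments this criterion
          # with mutual information and selects several points per cycle.
          with np.errstate(divide="ignore", invalid="ignore"):
              z = (best - mu) / sigma
              ei = np.where(sigma > 0, (best - mu) * norm.cdf(z) + sigma * norm.pdf(z), 0.0)
          x_new = cand[np.argmax(ei)]
          X = np.vstack([X, x_new])
          y = np.append(y, misfit(x_new))
      print("best parameters:", X[np.argmin(y)], "misfit:", float(y.min()))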

  16. A Wolf Pack Algorithm for Active and Reactive Power Coordinated Optimization in Active Distribution Network

    Science.gov (United States)

    Zhuang, H. M.; Jiang, X. J.

    2016-08-01

    This paper presents an active and reactive power dynamic optimization model for an active distribution network (ADN), whose control variables include the output of distributed generations (DGs), the charge or discharge power of the energy storage system (ESS), and the reactive power from capacitor banks. To solve this high-dimensional nonlinear optimization model, a new heuristic swarm intelligence method, the wolf pack algorithm (WPA), which offers better global convergence and computational robustness, is adopted so that network loss minimization can be achieved. The IEEE 33-bus system is used to show the effectiveness of the WPA technique compared with other techniques. Numerical tests on the modified IEEE 33-bus system show that WPA is accurate and effective for the multi-period active and reactive power optimization of the ADN.

  17. GPRS and Bluetooth Based Devices/Mobile Connectivity Shifting From Manual To Automation For Performance Optimization

    Directory of Open Access Journals (Sweden)

    Nazia Bibi

    2011-09-01

    Full Text Available Many companies and organizations are trying to move towards automation and to provide their workers with internet access on their mobile phones so that they can carry out routine tasks while saving time and resources. The proposed system, based on GPRS technology, aims to solve the problems faced in carrying out routine tasks while mobile. The system is designed so that workers and field staff receive updates on their mobile phones regarding the tasks at hand. The system is beneficial in that it saves resources in terms of time and human effort and cuts down on paperwork. It was developed in view of a research study conducted in the software development and telecom industries and provides a high-end solution for customers and field workers who use GPRS technology for transaction updates of databases.

  18. Optimizing Automated Classification of Periodic Variable Stars in New Synoptic Surveys

    CERN Document Server

    Long, James P; Rice, John A; Richards, Joseph W; Bloom, Joshua S

    2012-01-01

    Efficient and automated classification of periodic variable stars is becoming increasingly important as the scale of astronomical surveys grows. Several recent papers have used methods from machine learning and statistics to construct classifiers on databases of labeled, multi-epoch sources with the intention of using these classifiers to automatically infer the classes of unlabeled sources from new surveys. However, the same source observed with two different synoptic surveys will generally yield different derived metrics (features) from the light curve. Since such features are used in classifiers, this survey-dependent mismatch in feature space will typically lead to degraded classifier performance. In this paper we show how and why feature distributions change using OGLE and Hipparcos light curves. To overcome survey systematics, we apply a method, noisification, which attempts to empirically match distributions of features between the labeled sources used to construct the classifier and...
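
    The noisification procedure in the paper is considerably more careful than this, but the sketch below conveys the basic idea: degrade well-sampled training light curves to the cadence and photometric noise expected from the target survey before deriving features and fitting the classifier. The light-curve simulation, features and classes here are all invented for illustration.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(3)

      def light_curve_features(t, m):
          # Very simple features: amplitude, scatter, median absolute slope.
          return [m.max() - m.min(), m.std(), np.median(np.abs(np.diff(m) / np.diff(t)))]

      def noisify(t, m, n_epochs=40, extra_noise=0.05):
          # Degrade a well-sampled training light curve to the sparser cadence and
          # larger photometric noise of the target survey before computing features.
          idx = np.sort(rng.choice(len(t), size=n_epochs, replace=False))
          return t[idx], m[idx] + rng.normal(0, extra_noise, n_epochs)

      def simulate(period, n=300):
          t = np.sort(rng.uniform(0, 100, n))
          return t, np.sin(2 * np.pi * t / period) + rng.normal(0, 0.02, n)

      X, y = [], []
      for label, period in enumerate([0.6, 5.0]):        # two toy variability classes
          for _ in range(100):
              t, m = simulate(period * rng.uniform(0.9, 1.1))
              X.append(light_curve_features(*noisify(t, m)))
              y.append(label)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print("training accuracy on noisified features:", clf.score(X, y))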

  19. New strategies for medical data mining, part 3: automated workflow analysis and optimization.

    Science.gov (United States)

    Reiner, Bruce

    2011-02-01

    The practice of evidence-based medicine calls for the creation of "best practice" guidelines, leading to improved clinical outcomes. One of the primary factors limiting evidence-based medicine in radiology today is the relative paucity of standardized databases. The creation of standardized medical imaging databases offers the potential to enhance radiologist workflow and diagnostic accuracy through objective data-driven analytics, which can be categorized in accordance with specific variables relating to the individual examination, patient, provider, and technology being used. In addition to this "global" database analysis, "individual" radiologist workflow can be analyzed through the integration of electronic auditing tools into the PACS. The combination of these individual and global analyses can ultimately identify best practice patterns, which can be adapted to the individual attributes of end users and used in the creation of automated evidence-based medicine workflow templates.

  20. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    Science.gov (United States)

    Borot de Battisti, M.; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-10-01

    Focal high-dose-rate (HDR) for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement are warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single-divergent needle-implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm3 to 23.3 cm3) by using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions.
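
    The published optimizer also chooses the rotation point and the needle angulations; the sketch below illustrates only the final sub-problem, dwell-time optimization, posed as a nonnegative least-squares fit of dose at sampled target points to the prescription. The inverse-square dose kernel, geometry and prescription value are toy assumptions, not the clinical model.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(4)
      targets = rng.uniform(-2.0, 2.0, (200, 3))          # sample points in the target volume (cm)
      dwells = rng.uniform(-1.5, 1.5, (30, 3))            # candidate dwell positions (cm)

      # Toy dose-rate kernel: inverse-square fall-off (a real TG-43 kernel would add
      # radial dose and anisotropy functions).
      dist = np.linalg.norm(targets[:, None, :] - dwells[None, :, :], axis=2)
      kernel = 1.0 / np.clip(dist, 0.2, None) ** 2

      prescription = np.full(len(targets), 10.0)           # prescribed dose, arbitrary units
      dwell_times, residual = nnls(kernel, prescription)   # nonnegative dwell times

      dose = kernel @ dwell_times
      print("active dwell positions:", int((dwell_times > 1e-6).sum()))
      print("fraction of points at or above prescription:", float((dose >= 10.0).mean()))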

  1. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Science.gov (United States)

    Churchill, Nathan W; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest, and between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.
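
    The framework itself is considerably richer than this, but the sketch below illustrates the core idea of scoring each candidate pipeline by a combined prediction-plus-reproducibility metric and keeping the best one. The "voxel" data, the two preprocessing choices and the split-half reproducibility measure are simplified placeholders, not the authors' pipeline steps or metrics.

      import itertools
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      X = rng.normal(0, 1, (80, 50))                 # toy data: 80 scans x 50 "voxels"
      y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 80) > 0).astype(int)   # task labels

      def preprocess(data, smooth, detrend):
          out = data.copy()
          if detrend:
              out = out - out.mean(axis=0)            # crude detrending stand-in
          if smooth > 0:                              # crude spatial smoothing stand-in
              kernel = np.ones(smooth) / smooth
              out = np.apply_along_axis(lambda v: np.convolve(v, kernel, mode="same"), 1, out)
          return out

      def reproducibility(data, labels):
          # Split-half reproducibility: correlation of class-difference maps across halves.
          maps = []
          for half in (slice(None, None, 2), slice(1, None, 2)):
              d, l = data[half], labels[half]
              maps.append(d[l == 1].mean(axis=0) - d[l == 0].mean(axis=0))
          return float(np.corrcoef(maps[0], maps[1])[0, 1])

      results = []
      for smooth, detrend in itertools.product([0, 3, 5], [False, True]):
          Xp = preprocess(X, smooth, detrend)
          pred = cross_val_score(LogisticRegression(max_iter=1000), Xp, y, cv=5).mean()
          results.append(((smooth, detrend), pred, reproducibility(Xp, y)))
      best = max(results, key=lambda r: r[1] + r[2])   # simple combined P + R criterion
      print("best pipeline (smooth, detrend):", best[0],
            "prediction:", round(best[1], 2), "reproducibility:", round(best[2], 2))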

  2. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    Full Text Available BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest, and between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  3. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons.
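
    The full protocol is a three-stage evolutionary optimization; the sketch below illustrates only the weighting trick mentioned above, using a Latin hypercube sample of parameter space to estimate each error function's spread and weighting the functions by the inverse of that spread so that each influences the search to a similar degree. The error functions, parameter bounds and scales are hypothetical stand-ins for quantities that would come from current-clamp simulations.

      import numpy as np
      from scipy.stats import qmc

      # Hypothetical error functions comparing model output to target electrophysiology
      # (e.g. firing-rate and resting-potential mismatches); real ones require simulations.
      def err_rate(p):
          return (2.0 * p[0] + p[1]) ** 2

      def err_vrest(p):
          return abs(p[2] - 0.5) * 100.0

      error_funcs = [err_rate, err_vrest]
      bounds_lo = [0.0, 0.0, 0.0]      # e.g. normalized conductance densities / kinetic shifts
      bounds_hi = [1.0, 1.0, 1.0]

      # Latin hypercube sample over the free parameters.
      sampler = qmc.LatinHypercube(d=len(bounds_lo), seed=0)
      params = qmc.scale(sampler.random(n=200), bounds_lo, bounds_hi)

      # Weight each error function by the inverse of its spread over the sample so that
      # every function pulls on the combined objective with comparable strength.
      errors = np.array([[f(p) for f in error_funcs] for p in params])
      weights = 1.0 / errors.std(axis=0)

      def weighted_objective(p):
          return sum(w * f(p) for w, f in zip(weights, error_funcs))

      print("weights:", weights, " example objective:", weighted_objective([0.3, 0.2, 0.6]))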

  4. Automated tracing of open-field coronal structures for an optimized large-scale magnetic field reconstruction

    Science.gov (United States)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2014-12-01

    Solar Probe Plus and Solar Orbiter will provide detailed measurements in the inner heliosphere magnetically connected with the topologically complex and eruptive solar corona. Interpretation of these measurements will require accurate reconstruction of the large-scale coronal magnetic field. In a related presentation by S. Jones et al., we argue that such reconstruction can be performed using photospheric extrapolation methods constrained by white-light coronagraph images. Here, we present the image-processing component of this project dealing with an automated segmentation of fan-like coronal loop structures. In contrast to the existing segmentation codes designed for detecting small-scale closed loops in the vicinity of active regions, we focus on the large-scale geometry of the open-field coronal features observed at significant radial distances from the solar surface. The coronagraph images used for the loop segmentation are transformed into a polar coordinate system and undergo radial detrending and initial noise reduction. The preprocessed images are subject to an adaptive second order differentiation combining radial and azimuthal directions. An adjustable thresholding technique is applied to identify candidate coronagraph features associated with the large-scale coronal field. A blob detection algorithm is used to extract valid features and discard noisy data pixels. The obtained features are interpolated using higher-order polynomials which are used to derive empirical directional constraints for magnetic field extrapolation procedures based on photospheric magnetograms.
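
    As a rough sketch of the processing sequence described above (polar transform, radial detrending, adaptive thresholding, blob extraction and polynomial fitting), the code below runs the same steps on a synthetic coronagraph-like image; the image model, thresholds and blob-size cut are invented and much simpler than the actual segmentation code.

      import numpy as np
      from scipy import ndimage

      # Synthetic coronagraph-like image: radial brightness fall-off plus a faint,
      # slightly curved fan structure (stand-in for an open-field feature).
      n = 256
      yy, xx = np.mgrid[0:n, 0:n]
      cx = cy = n / 2
      r = np.hypot(xx - cx, yy - cy)
      theta = np.arctan2(yy - cy, xx - cx)
      img = 1.0 / (1.0 + r / 20.0)
      img += 0.05 * np.exp(-((theta - 0.5 - 0.002 * r) ** 2) / 0.01) * (r > 30)
      img += np.random.default_rng(0).normal(0, 0.005, img.shape)

      # Transform to polar coordinates (rows = radius, columns = azimuth).
      radii = np.linspace(30, n / 2 - 2, 200)
      angles = np.linspace(-np.pi, np.pi, 360)
      R, A = np.meshgrid(radii, angles, indexing="ij")
      coords = np.array([cy + R * np.sin(A), cx + R * np.cos(A)])
      polar = ndimage.map_coordinates(img, coords, order=1)

      # Radial detrending: remove the mean brightness profile at each radius.
      detrended = polar - polar.mean(axis=1, keepdims=True)

      # Adaptive thresholding and blob extraction of candidate features.
      mask = detrended > detrended.mean() + 2.0 * detrended.std()
      labels, nblob = ndimage.label(mask)
      sizes = ndimage.sum(mask, labels, index=range(1, nblob + 1))
      for lab in np.flatnonzero(sizes > 50) + 1:
          rr, aa = np.where(labels == lab)
          # Empirical directional constraint: polynomial fit of azimuth versus radius.
          poly = np.polyfit(radii[rr], angles[aa], deg=2)
          print("feature", int(lab), "pixels", int(sizes[lab - 1]), "fit coeffs", np.round(poly, 4))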

  5. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dengwang; Wang, Jie [College of Physics and Electronics, Shandong Normal University, Jinan, Shandong (China); Kapp, Daniel S.; Xing, Lei [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems of fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for shape dictionary construction and parameter estimation. The remaining 50 datasets were used as test images. In our study, liver segmentation was formulated as an optimization process of an implicit function. The liver region was optimized via local and global optimization during iterations. Our method consists of five steps: (1) The livers from the panel data were segmented manually by physicians, and then we estimated the parameters of the GMM (Gaussian mixture model) and MRF (Markov random field). A shape dictionary was built by utilizing the 3D liver shapes. (2) The outlines of the chest and abdomen were located according to the rib structure in the input images, and the liver region was initialized based on the GMM. (3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge for local optimization. (4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary for global optimization; H-PSO (Hybrid Particle Swarm Optimization) was employed to solve the SSR equation. (5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iterations of local and global optimization were repeated until the stopping conditions (maximum number of iterations and rate of change) were satisfied. Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy on the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  6. An Optimized Clustering Approach for Automated Detection of White Matter Lesions in MRI Brain Images

    Directory of Open Access Journals (Sweden)

    M. Anitha

    2012-04-01

    Full Text Available White Matter lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to the decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. The WML detection process includes the following stages: 1. image preprocessing, 2. clustering (Fuzzy c-means clustering (FCM), Geostatistical Possibilistic clustering (GPC) and Geostatistical Fuzzy clustering (GFCM)) and 3. optimization using Particle Swarm Optimization (PSO). The proposed system is tested on a database of 208 MRI images. GFCM yields a high sensitivity of 89%, specificity of 94% and overall accuracy of 93% compared with FCM and GPC. The clustered brain images are then subjected to Particle Swarm Optimization (PSO). The optimized result obtained from GFCM-PSO provides a sensitivity of 90%, specificity of 94% and accuracy of 95%. The detection results reveal that GFCM and GFCM-PSO better localize the large regions of lesions and give a lower false-positive rate than GPC and GPC-PSO, which capture the largest loads of WMLs only in the upper ventral horns of the brain.
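
    As a minimal sketch of the clustering stage mentioned in the abstract, the code below runs classical fuzzy c-means on toy one-dimensional voxel intensities and flags the brightest cluster as lesion candidates; the geostatistical variants (GPC, GFCM) and the PSO refinement of the paper are not shown, and the intensity values are synthetic.

      import numpy as np

      def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
          # Classical FCM on a 1-D intensity vector (e.g. brain MRI voxel intensities).
          rng = np.random.default_rng(seed)
          u = rng.random((len(x), n_clusters))
          u /= u.sum(axis=1, keepdims=True)                 # fuzzy memberships, rows sum to 1
          for _ in range(n_iter):
              um = u ** m
              centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
              dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
              u = 1.0 / dist ** (2.0 / (m - 1.0))
              u /= u.sum(axis=1, keepdims=True)
          return centers, u

      # Toy intensities: grey-matter-like, white-matter-like and bright lesion-like voxels.
      rng = np.random.default_rng(1)
      intensities = np.concatenate([rng.normal(0.4, 0.03, 4000),
                                    rng.normal(0.6, 0.03, 4000),
                                    rng.normal(0.85, 0.02, 200)])
      centers, u = fuzzy_c_means(intensities)
      lesion_cluster = int(np.argmax(centers))
      lesion_voxels = np.argmax(u, axis=1) == lesion_cluster
      print("cluster centres:", np.round(np.sort(centers), 3),
            "candidate lesion voxels:", int(lesion_voxels.sum()))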

  7. SWANS: A Prototypic SCALE Criticality Sequence for Automated Optimization Using the SWAN Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Greenspan, E.

    2001-01-11

    SWANS is a new prototypic analysis sequence that provides an intelligent, semi-automatic search for the maximum k{sub eff} of a given amount of specified fissile material, or of the minimum critical mass. It combines the optimization strategy of the SWAN code with the composition-dependent resonance self-shielded cross sections of the SCALE package. For a given system composition arrived at during the iterative optimization process, the value of k{sub eff} is as accurate and reliable as obtained using the CSAS1X Sequence of SCALE-4.4. This report describes how SWAN is integrated within the SCALE system to form the new prototypic optimization sequence, describes the optimization procedure, provides a user guide for SWANS, and illustrates its application to five different types of problems. In addition, the report illustrates that resonance self-shielding might have a significant effect on the maximum k{sub eff} value a given fissile material mass can have.

  8. Automated Design of Synthetic Cell Classifier Circuits Using a Two-Step Optimization Strategy.

    Science.gov (United States)

    Mohammadi, Pejman; Beerenwinkel, Niko; Benenson, Yaakov

    2017-02-22

    Cell classifiers are genetic logic circuits that transduce endogenous molecular inputs into cell-type-specific responses. Designing classifiers that achieve optimal differential response between specific cell types is a hard computational problem because it involves selection of endogenous inputs and optimization of both biochemical parameters and a logic function. To address this problem, we first derive an optimal set of biochemical parameters with the largest expected differential response over a diverse set of logic circuits, and second, we use these parameters in an evolutionary algorithm to select circuit inputs and optimize the logic function. Using this approach, we design experimentally feasible microRNA-based circuits capable of perfect discrimination for several real-world cell-classification tasks. We also find that under realistic cell-to-cell variation, circuit performance is comparable to standard cross-validation performance estimates. Our approach facilitates the generation of candidate circuits for experimental testing in therapeutic settings that require precise cell targeting, such as cancer therapy.

  9. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    NARCIS (Netherlands)

    Borot, Maxence; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-01-01

    Focal high-dose-rate (HDR) for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance.

  10. The scheme of combined application of optimization and simulation models for formation of an optimum structure of an automated control system of space systems

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Nikiforov, A. Yu; Zelenkov, P. V.

    2016-11-01

    With the development of automated control systems for space systems, new classes of spacecraft are emerging that require improvements to their structure and an expansion of their functions. When designing the automated control system of a space system, various tasks arise, such as determining the location of elements and subsystems in space, selecting hardware, and distributing the set of functions performed by the system units, all under certain requirements on the quality of control and the connectivity of components. The problem of synthesizing the structure of an automated control system for space systems is formalized using discrete variables at various levels of system detail. A sequence of tasks and stages for forming the structure of the automated control system of a space system is developed. The authors propose a scheme for the combined use of optimization and simulation models to ensure a rational distribution of functions between the automated control system complex and the rest of the system units. The proposed approach allows a reasonable hardware selection to be made, taking into account the different requirements for the operation of automated control systems of space systems.

  11. Process optimization and biocompatibility of cell carriers suitable for automated magnetic manipulation.

    Science.gov (United States)

    Krejci, I; Piana, C; Howitz, S; Wegener, T; Fiedler, S; Zwanzig, M; Schmitt, D; Daum, N; Meier, K; Lehr, C M; Batista, U; Zemljic, S; Messerschmidt, J; Franzke, J; Wirth, M; Gabor, F

    2012-03-01

    There is increasing demand for automated cell reprogramming in the fields of cell biology, biotechnology and the biomedical sciences. Microfluidic-based platforms that provide unattended manipulation of adherent cells promise to be an appropriate basis for cell manipulation. In this study we developed a magnetically driven cell carrier to serve as a vehicle within an in vitro environment. To elucidate the impact of the carrier on cells, biocompatibility was estimated using the human adenocarcinoma cell line Caco-2. Besides evaluation of the quality of the magnetic carriers by field emission scanning electron microscopy, the rate of adherence, proliferation and differentiation of Caco-2 cells grown on the carriers was quantified. Moreover, the morphology of the cells was monitored by immunofluorescent staining. Early generations of the cell carrier suffered from release of cytotoxic nickel from the magnetic cushion. Biocompatibility was achieved by complete encapsulation of the nickel bulk within galvanic gold. The insulation process had to be developed stepwise and was controlled by parallel monitoring of the cell viability. The final carrier generation proved to be a proper support for cell manipulation, allowing proliferation of Caco-2 cells equal to that on glass or polystyrene as a reference for up to 10 days. Functional differentiation was enhanced by more than 30% compared with the reference. A flat, ferromagnetic and fully biocompatible carrier for cell manipulation was developed for application in microfluidic systems. Beyond that, this study offers advice for the development of magnetic cell carriers and the estimation of their biocompatibility.

  12. Optimization and Quality Control of Automated Quantitative Mineralogy Analysis for Acid Rock Drainage Prediction

    Directory of Open Access Journals (Sweden)

    Robert Pooler

    2017-01-01

    Full Text Available Low ore-grade waste samples from the Codelco Andina mine that were analyzed in an environmental and mineralogical test program for acid rock drainage prediction revealed inconsistencies between the quantitative mineralogical data (QEMSCAN®) and the results of geochemical characterizations by atomic absorption spectroscopy (AAS), LECO® furnace, and sequential extractions. For the QEMSCAN® results, biases were observed in the proportions of pyrite and calcium sulfate minerals detected. An analysis of the results indicated that the problems observed were likely associated with polished section preparation. Therefore, six different sample preparation protocols were tested and evaluated using three samples from the previous study. One of the methods, which involved particle size reduction and transverse section preparation, was identified as having the greatest potential for correcting the errors observed in the mineralogical analyses. Further, the biases in the quantities of calcium sulfate minerals detected were reduced through the use of ethylene glycol as a polishing lubricant. It is recommended that the sample preparation methodology described in this study be used in order to accurately quantify percentages of pyrite and calcium sulfate minerals in environmental mineralogical studies which use automated mineralogical analysis.

  13. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    Science.gov (United States)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design; or it will be updated when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved in only checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960's. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, Latin Hypercube Sampling method is used

  14. Analytical study on coordinative optimization of convection in tubes with variable heat flux

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    [1] Guo, Z. Y., Li, D. Y., Wang, B. X., A novel concept for convective heat transfer enhancement, Int. J. Heat Mass Transfer, 1998, 41: 2221-2225.
    [2] Tao, W. Q., Guo, Z. Y., Wang, B. X., Field synergy principle for enhancing convective heat transfer--extension and numerical verification, Int. J. Heat Mass Transfer, 2002, 45: 3849-3856.
    [3] Guo, Z. Y., Mechanism and control of convective heat transfer--Coordination of velocity and heat flow fields, Chinese Science Bulletin, 2001, 46(7): 596-599.
    [4] Sellars, J. R., Tribus, M., Klein, J. S., Heat transfer to laminar flow in a round tube or flat conduit--The Graetz problem extended, Trans. ASME, 1956, 78: 441-448.
    [5] Kays, W. M., Crawford, M. E., Convective Heat Transfer, 3rd ed., Chapter 9, New York: McGraw-Hill Inc., 1993.
    [6] Shah, R. K., London, A. L., Laminar Flow Forced Convection in Ducts, Advances in Heat Transfer, New York: Academic Press, 1978.

  15. The Decomposition Decision Models of Reliability Optimization for a Multi-Stage System and Their Coordination Algorithm

    Institute of Scientific and Technical Information of China (English)

    高作峰

    2000-01-01

    For a multilevel engineering system, taking reliabilities as decomposition parameters and construction costs as coordination parameters, decomposition models for reliability optimization are constructed, and the corresponding coordination algorithm is also given.

  16. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During the year of 1994, the engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress has been made during this phase of research and computation time was reduced from 30 min. to 2 min. for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process - the error tolerance. At the same time the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  17. Achievements and challenges in automated parameter, shape and topology optimization for divertor design

    Science.gov (United States)

    Baelmans, M.; Blommaert, M.; Dekeyser, W.; Van Oevelen, T.

    2017-03-01

    Plasma edge transport codes play a key role in the design of future divertor concepts. Their long simulation times in combination with a large number of control parameters turn the design into a challenging task. In aerodynamics and structural mechanics, adjoint-based optimization techniques have proven successful to tackle similar design challenges. This paper provides an overview of achievements and remaining challenges with these techniques for complex divertor design. It is shown how these developments pave the way for fast sensitivity analysis and improved design from different perspectives.

  18. Wind Farm Coordinated Control for Power Optimization

    Institute of Scientific and Technical Information of China (English)

    舒进; 郝治国; 张保会; 薄志谦

    2011-01-01

    This paper presents a novel Laguerre-function-based nonlinear model predictive control (NLMPC) coordinated controller for wind farms, aimed at wake loss reduction and power capture optimization. Using NLMPC with a dynamic wake model, the controller can efficiently increase farm power by regulating all wind turbine generators (WTGs) in the farm. In the controller, a predicted effective wind speed error correction is proposed to compensate for predictive model mismatch, and Laguerre functions are introduced to reduce the burden of the receding-horizon optimization. The robustness of the controller to very-short-term wind speed forecasting errors, which is critical for practical application, is also discussed. Simulation studies show that the wind farm coordinated controller can effectively increase total farm power under different wind conditions and reduce the receding-horizon optimization time. In addition, the controller is robust to predictive model mismatch and free wind speed forecasting errors.

  19. Automated optimization of measurement setups for the inspection of specular surfaces

    Science.gov (United States)

    Kammel, Soeren

    2002-02-01

    Specular surfaces are used in a wide variety of industrial and consumer products such as varnished or chrome-plated parts of car bodies, dies or molds. Defects in these parts reduce quality in terms of visual appearance and/or technical performance. Even defects that are only about 1 micrometer deep can lead to rejection during quality control. Deflectometric techniques are an adequate approach for recognizing and measuring defects on specular surfaces, because their measurement principle mimics the behavior of a human observer inspecting the surface. With these methods, the specular object is considered part of the optical system: not the object itself but the surroundings reflected by the specular surface are observed in order to obtain information about the object. This technique has proven sensitive for slope and topography measurement. As a consequence of this measurement principle, surface regions with high curvature in particular require an illumination that surrounds the object under inspection, to guarantee that light from any direction is reflected onto the sensor. Thus the design of a specific measurement setup requires a substantial engineering effort. To avoid the time-consuming process of building, testing and redesigning the measurement setup, a system to simulate and automatically optimize the setup has been developed. Based on CAD data of the object under inspection and a model of the optical system, favorable realizations of the shape, the position and the pattern of the lighting device are determined. In addition, optimization of other system parameters, such as object position and distance relative to the camera, is performed. Finally, constraints are imposed to ensure that the illumination system can actually be constructed.

  20. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The problem is modeled as a multi-echelon inventory problem with single-item capacitated lot-sizing, with the objective of minimizing the total cost of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest-path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. The new formulation has more variables and constraints, but has a much tighter linear relaxation than the original and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
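
    The shortest-path reformulation of the paper is not reproduced here; the sketch below only states a toy single-ATM version of the traditional capacitated lot-sizing MIP that the abstract starts from, using the PuLP modelling library. Demand, capacity and cost figures are invented for illustration.

      from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

      T = range(7)                              # planning periods (days)
      demand = [120, 90, 150, 80, 200, 170, 60] # cash withdrawals per day (thousands)
      capacity = 400                            # cassette capacity per replenishment
      setup_cost = 50.0                         # fixed cost of a replenishment trip
      hold_cost = 0.2                           # cost of idle cash per unit per day

      prob = LpProblem("atm_lot_sizing", LpMinimize)
      x = LpVariable.dicts("replenish_amount", T, lowBound=0)    # cash loaded in period t
      y = LpVariable.dicts("replenish_trip", T, cat=LpBinary)    # trip made in period t?
      s = LpVariable.dicts("end_inventory", T, lowBound=0)       # cash left in the ATM

      prob += lpSum(setup_cost * y[t] + hold_cost * s[t] for t in T)
      for t in T:
          prev = s[t - 1] if t > 0 else 0                        # no initial cash
          prob += prev + x[t] - demand[t] == s[t]                # inventory balance
          prob += x[t] <= capacity * y[t]                        # capacity / setup link

      prob.solve()
      plan = [(t, value(x[t]), value(s[t])) for t in T if value(y[t]) > 0.5]
      print("total cost:", value(prob.objective))
      print("replenishments (day, amount, end inventory):", plan)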

  1. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization

    Science.gov (United States)

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role in determining the locations of multiple sperm. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), derived from several visual cortex models, to segment the sperm head region. However, the ICM suffers from a parameter-selection problem; thus, the ICM network is optimised using particle swarm optimization, where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods, achieving rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, when tested on 1200 sperm. The proposed algorithm is expected to be applied in sperm motility analysis because of its robustness and capability. PMID:27632581

  2. Optimization of automated radiosynthesis of [{sup 18}F]AV-45: a new PET imaging agent for Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Liu Yajing; Zhu Lin [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Ploessl, Karl [Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Choi, Seok Rye [Avid Radiopharmaceuticals Inc., Philadelphia, PA 19014 (United States); Qiao Hongwen; Sun Xiaotao; Li Song [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Zha Zhihao [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Kung, Hank F., E-mail: kunghf@sunmac.spect.upenn.ed [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States)

    2010-11-15

    Introduction: Accumulation of {beta}-amyloid (A{beta}) aggregates in the brain is linked to the pathogenesis of Alzheimer's disease (AD). Imaging probes targeting these A{beta} aggregates in the brain may provide a useful tool to facilitate the diagnosis of AD. Recently, [{sup 18}F]AV-45 ([{sup 18}F]5) demonstrated high binding to the A{beta} aggregates in AD patients. To improve the availability of this agent for widespread clinical application, a rapid, fully automated, high-yield, cGMP-compliant radiosynthesis was necessary for production of this probe. We report herein an optimal [{sup 18}F]fluorination, de-protection condition and fully automated radiosynthesis of [{sup 18}F]AV-45 ([{sup 18}F]5) on a radiosynthesis module (BNU F-A2). Methods: The preparation of [{sup 18}F]AV-45 ([{sup 18}F]5) was evaluated under different conditions, specifically by employing different precursors (-OTs and -Br as the leaving group), reagents (K222/K{sub 2}CO{sub 3} vs. tributylammonium bicarbonate) and deprotection in different acids. With optimized conditions from these experiments, the automated synthesis of [{sup 18}F]AV-45 ([{sup 18}F]5) was accomplished by using a computer-programmed, standard operating procedure, and was purified on an on-line solid-phase cartridge (Oasis HLB). Results: The optimized reaction conditions were successfully implemented to an automated nucleophilic fluorination module. The radiochemical purity of [{sup 18}F]AV-45 ([{sup 18}F]5) was >95%, and the automated synthesis yield was 33.6{+-}5.2% (no decay corrected, n=4), 50.1{+-}7.9% (decay corrected) in 50 min at a quantity level of 10-100 mCi (370-3700 MBq). Autoradiography studies of [{sup 18}F]AV-45 ([{sup 18}F]5) using postmortem AD brain and Tg mouse brain sections in the presence of different concentration of 'cold' AV-136 showed a relatively low inhibition of in vitro binding of [{sup 18}F]AV-45 ([{sup 18}F]5) to the A{beta} plaques (IC50=1-4 {mu}M, a concentration several

  3. Optimal torque coordinating control of the launching with twin clutches simultaneously involved for dry dual-clutch transmission

    Science.gov (United States)

    Zhao, Z. G.; Chen, H. J.; Zhen, Z. X.; Yang, Y. Y.

    2014-06-01

    For the self-developed six-speed dry dual-clutch transmission (DCT), an optimal torque-coordination control strategy between the engine and the dual clutches, based on the minimum value principle, is proposed to address the problem of launching with both clutches simultaneously engaged. Focusing on the sliding friction phase of the launching process, the dynamics equations of the dry DCT with two intermediate shafts are first established, and then the optimal transmitted-torque variation rates and driven-plate rotating speeds of the dual clutches are deduced using the minimum value principle, in which the jerk intensity and the friction work are taken as the performance indexes and the terminal constraints on the state variables are determined according to the driver's launching intention. In addition, the disengagement conditions of the non-target-gear clutch and the torque distribution relations of the twin clutches are derived from the launching control targets, which guarantee an approximately equal extent of friction in the two clutches and no power circulation. After the synchronisation of the driving and driven plates of the oncoming clutch, the engine output torque is smoothly switched to the driver's demanded level. Furthermore, a launching simulation model of the dry DCT vehicle is set up on the Matlab/Simulink platform. Simulation results indicate that the proposed launching control strategy not only effectively reflects the driver's intention and extends the life span of the twin clutches, but also achieves excellent launching quality. Finally, the torque control laws of the two clutches obtained through the simulation are transformed into clutch position control laws for future realisation in a real car, and closed-loop position control of the twin clutches during the launching process is carried out on a test bench with two sets of clutch actuators, obtaining good tracking performance.

  4. A Hybrid Shuffled Frog Leaping Algorithm to Solve Optimal Directional Over Current Relay Coordination Problem for Power Delivery System with DGs

    Directory of Open Access Journals (Sweden)

    Mohammad Sadegh Payam

    2012-01-01

    Full Text Available This study presents a new approach for simultaneous coordinated tuning of overcurrent relays for a Power Delivery System (PDS) including Distributed Generations (DGs). In the proposed scheme, instead of changing the protection system structure or using new elements, the relay coordination problem is solved by revising the relay settings in the presence of DGs. To this end, the relay coordination problem is formulated as an optimization problem by considering two strategies: minimizing the relay operation times and minimizing the number of changes in the relay settings. Also, an efficient hybrid algorithm based on the Shuffled Frog Leaping (SFL) algorithm and Linear Programming (LP) is introduced for solving this complex and non-convex optimization problem. To investigate the ability of the proposed method, a 30-bus IEEE test system is considered. Three scenarios are examined to evaluate the effectiveness of the proposed approach to solve the directional overcurrent relay coordination problem for a PDS with DGs. Simulation results show the efficiency of the proposed method.
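
    For readers unfamiliar with the underlying formulation, a typical directional overcurrent relay coordination problem has the following generic form (a standard statement using the IEC inverse-time characteristic; the exact decision variables and constraints of this study may differ):

```latex
\min_{TDS}\;\sum_{i=1}^{N} t_i , \qquad
t_i \;=\; TDS_i\,\frac{A}{\left(I_{f,i}/I_{p,i}\right)^{B} - 1}, \qquad
t_{\mathrm{backup}} - t_{\mathrm{primary}} \;\ge\; CTI , \qquad
TDS_i^{\min} \le TDS_i \le TDS_i^{\max},
```

    where TDS_i is the time dial setting of relay i, I_{f,i} and I_{p,i} are its fault and pickup currents, A and B are curve constants (A = 0.14, B = 0.02 for the IEC standard inverse curve), and CTI is the coordination time interval between each primary/backup relay pair.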

  5. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods with different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
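
    The sampling-duration analysis described above can be sketched in a few lines of Python (entirely hypothetical visit data; the window start, nest counts and candidate durations are illustrative and not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: visit timestamps (hours of daylight, 0-14 h) for 20 nests.
nests = [np.sort(rng.uniform(0, 14, size=rng.integers(40, 200))) for _ in range(20)]
daily_totals = np.array([len(v) for v in nests], dtype=float)

def r_squared(x, y):
    """Variance in y explained by a simple linear fit on x."""
    return np.corrcoef(x, y)[0, 1] ** 2

window_start = 6.0                                   # hypothetical mid-morning start time
for duration in (0.25, 0.5, 1.0, 2.0, 4.0):          # candidate sampling durations (h)
    rates = np.array([np.sum((v >= window_start) & (v < window_start + duration)) / duration
                      for v in nests])               # visits per hour within the window
    print(f"{duration:4.2f} h window: R^2 vs daily total = {r_squared(rates, daily_totals):.2f}")
```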

  6. THE APPLICATION OF AUTOMATED CORRELATION OPTIMIZED WARPING TO THE QUALITY EVALUATION OF Radix Puerariae thomsonii: CORRECTING RETENTION TIME SHIFT IN THE CHROMATOGRAPHIC FINGERPRINTS

    Directory of Open Access Journals (Sweden)

    Long Jiao

    2015-01-01

    Full Text Available The application of automated correlation optimized warping (ACOW to the correction of retention time shift in the chromatographic fingerprints of Radix Puerariae thomsonii (RPT was investigated. Twenty-seven samples were extracted from 9 batches of RPT products. The fingerprints of the 27 samples were established by the HPLC method. Because there is a retention time shift in the established fingerprints, the quality of these samples cannot be correctly evaluated by using similarity estimation and principal component analysis (PCA. Thus, the ACOW method was used to align these fingerprints. In the ACOW procedure, the warping parameters, which have a significant influence on the alignment result, were optimized by an automated algorithm. After correcting the retention time shift, the quality of these RPT samples was correctly evaluated by similarity estimation and PCA. It is demonstrated that ACOW is a practical method for aligning the chromatographic fingerprints of RPT. The combination of ACOW, similarity estimation, and PCA is shown to be a promising method for evaluating the quality of Traditional Chinese Medicine.

  7. Safety-oriented operation in electrical energy systems. Optimization of coordination and communication; Sicherheitsorientierte Betriebsfuehrung in Elektroenergiesystemen. Koordination und Kommunikation optimieren

    Energy Technology Data Exchange (ETDEWEB)

    Sillaber, A. [Innsbrucker Kommunalbetriebe AG (Austria). Geschaeftsbereich Strom Netz; Technische Univ. Graz (Austria). Inst. fuer Elektrische Anlagen; Renner, H. [Technische Univ. Graz (Austria). Inst. fuer Elektrische Anlagen

    2007-04-16

    Technological developments and liberalised market models pose new challenges for the operation of electrical energy systems. The well-proven European, national and company-internal regulations are a solid basis for safe operation. In order to further strengthen the orientation towards safety, economic incentives are needed as well as adapted company-internal operational rules, e.g. a tightened (n-1) criterion. Special attention should be paid to robustness, coordination and communication, system services, preparation for dangerous situations, prompt countermeasures, quick restoration and continuous training. An economically optimal operation can be achieved only by means of a well-balanced overall view.

  8. A landscape lake flow pattern design approach based on automated CFD simulation and parallel multiple objective optimization.

    Science.gov (United States)

    Guo, Hao; Tian, Yimei; Shen, Hailiang; Wang, Yi; Kang, Mengxin

    A design approach for determining the optimal flow pattern in a landscape lake is proposed based on FLUENT simulation, multi-objective optimization, and parallel computing. This paper formulates the design as a multi-objective optimization problem, with lake circulation effects and operation cost as the two objectives, and solves the optimization problem with the non-dominated sorting genetic algorithm II. The lake flow pattern is modelled in FLUENT. The parallelization targets multiple FLUENT instance runs, which is different from the FLUENT internal parallel solver. This approach: (1) proposes lake flow pattern metrics, i.e. weighted average water flow velocity, water volume percentage of low flow velocity, and variance of flow velocity, (2) defines user-defined functions for boundary setting and for objective and constraint calculation, and (3) parallelizes the execution of multiple FLUENT instance runs to significantly reduce the optimization wall-clock time. The proposed approach is demonstrated through a case study of Meijiang Lake in Tianjin, China.
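
    The key implementation idea, running several solver instances side by side rather than parallelizing a single solve, can be sketched as follows; evaluate_design is a hypothetical stand-in for launching and post-processing one FLUENT case, and the toy objective values are not from the study:

```python
from multiprocessing import Pool

def evaluate_design(design):
    """Hypothetical stand-in for one FLUENT run: returns the two objectives
    (negated circulation metric, to be minimized, and operating cost) for a
    candidate pump flow / inlet-count configuration."""
    # In the real workflow this would write a journal/UDF file, launch a FLUENT
    # instance, wait for convergence and post-process the velocity field.
    pump_flow, n_inlets = design
    circulation = pump_flow * n_inlets / (1.0 + 0.1 * n_inlets)   # toy model
    cost = 0.8 * pump_flow + 5.0 * n_inlets                       # toy model
    return (-circulation, cost)

if __name__ == "__main__":
    # One generation's population of candidate designs (toy values).
    population = [(q, n) for q in (50.0, 75.0, 100.0) for n in (2, 4, 6)]
    with Pool(processes=4) as pool:              # several solver runs in parallel
        objectives = pool.map(evaluate_design, population)
    for design, objs in zip(population, objectives):
        print(design, "->", objs)
```

    The pool size would normally be matched to the number of available solver licences and CPU sets rather than to the population size.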

  9. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high-bay warehouse is commonly used for storing a large number of materials with a high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are performed automatically. This allows the automated warehouse to op...

  10. Optimal installation locations for automated external defibrillators in Taipei 7-Eleven stores: using GIS and a genetic algorithm with a new stirring operator.

    Science.gov (United States)

    Huang, Chung-Yuan; Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.

  11. Neural coordination can be enhanced by occasional interruption of normal firing patterns: a self-optimizing spiking neural network model.

    Science.gov (United States)

    Woodward, Alexander; Froese, Tom; Ikegami, Takashi

    2015-02-01

    The state space of a conventional Hopfield network typically exhibits many different attractors of which only a small subset satisfies constraints between neurons in a globally optimal fashion. It has recently been demonstrated that combining Hebbian learning with occasional alterations of normal neural states avoids this problem by means of self-organized enlargement of the best basins of attraction. However, so far it is not clear to what extent this process of self-optimization is also operative in real brains. Here we demonstrate that it can be transferred to more biologically plausible neural networks by implementing a self-optimizing spiking neural network model. In addition, by using this spiking neural network to emulate a Hopfield network with Hebbian learning, we attempt to make a connection between rate-based and temporal coding based neural systems. Although further work is required to make this model more realistic, it already suggests that the efficacy of the self-optimizing process is independent from the simplifying assumptions of a conventional Hopfield network. We also discuss natural and cultural processes that could be responsible for occasional alteration of neural firing patterns in actual brains.
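
    The rate-based version of this self-optimization process can be illustrated with a toy Hopfield sketch (random weights and arbitrary sizes chosen here for illustration; this is not the authors' spiking model):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                      # number of units
W = rng.normal(size=(N, N))
W = (W + W.T) / 2.0                         # symmetric random constraint weights
np.fill_diagonal(W, 0.0)
W_learned = np.zeros_like(W)                # Hebbian component accumulated on top

def converge(state, weights, steps=3000):
    """Asynchronous Hopfield updates until the state has (approximately) settled."""
    for _ in range(steps):
        i = rng.integers(N)
        state[i] = 1 if weights[i] @ state >= 0 else -1
    return state

alpha = 0.0005                              # Hebbian learning rate
for trial in range(300):
    s = rng.choice([-1, 1], size=N)         # occasional interruption: reset to a random state
    s = converge(s, W + W_learned)          # relax under original plus learned weights
    W_learned += alpha * np.outer(s, s)     # Hebbian reinforcement of the visited attractor
    np.fill_diagonal(W_learned, 0.0)

# Over trials, convergence increasingly ends in low-energy states of the ORIGINAL
# constraint network, i.e. the best basins of attraction have been enlarged.
print("energy of final state under original weights:", -0.5 * s @ W @ s)
```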

  12. Conflict and Coordination Problem of Carbon Tax's Diversity Targets in China - Based on the Tax Optimization Theory

    Institute of Scientific and Technical Information of China (English)

    Xue Gang

    2011-01-01

    Among all the emission reduction measures, a carbon tax is recognized as the most effective way to protect our climate. That is why the Chinese government has recently taken it as a direction for tax reform. In current economic analyses, the design of a carbon tax is mostly based on the objective of maximizing efficiency. However, based on the theory of tax system optimization, we should also consider other policy objectives, such as equity, revenue and cost, and then balance the different objectives to achieve the suboptimal reform of the carbon tax system in China.

  13. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  14. Simulation of an Optimization Scheduling Model for Large-Scale Automated Process Control Flows

    Institute of Scientific and Technical Information of China (English)

    任铭

    2015-01-01

    The traditional process control and job scheduling approach uses a task scheduling method based on multi-threaded cluster clustering, and its scheduling performance for multi-user, multi-task large-scale automated process control is poor. An optimized scheduling model for large-scale automated process control flows, based on clustering and extraction of principal-feature dominating sets, is proposed. A large-scale automated process control model is constructed, the objective function for optimal control is built, and the optimized scheduling model of the control flow is improved; finally, the performance is verified by simulation experiments. The simulation results show that the algorithm can optimize the automated process control flow and has important application value in improving production efficiency and optimizing industrial automated process control.

  15. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...

  16. Biogas-pH automation control strategy for optimizing organic loading rate of anaerobic membrane bioreactor treating high COD wastewater.

    Science.gov (United States)

    Yu, Dawei; Liu, Jibao; Sui, Qianwen; Wei, Yuansong

    2016-03-01

    Control of the organic loading rate (OLR) is essential for anaerobic digestion treating high-COD wastewater, since overload causes operational failure and underload reduces efficiency. A novel biogas-pH automation control strategy using combined gas- and liquid-phase monitoring was developed for an anaerobic membrane bioreactor (AnMBR) treating high-COD (27.53 g·L(-1)) starch wastewater. The biogas-pH strategy operated with thresholds of biogas production rate >98 Nml·h(-1) to prevent overload and pH >7.4 to prevent underload, which were determined from methane production kinetics and pH titration of the methanogenic slurry, respectively. Compared with a constant-OLR control strategy, the OLR was doubled to 11.81 kgCOD·kgVSS(-1)·d(-1) and the effluent COD halved to 253.4 mg·L(-1). Meanwhile, the COD removal rate, biogas yield and methane concentration were simultaneously improved to 99.1%, 312 Nml·gCODin(-1) and 74%, respectively. Using the biogas-pH strategy, the AnMBR formed a 'pH self-regulating ternary buffer system' that captures carbon dioxide and hence provides sufficient buffering capacity.
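
    A minimal sketch of such a threshold-based supervisory loop, using the two thresholds reported above but with hypothetical sensor and actuator interfaces and an illustrative feed adjustment:

```python
import time

BIOGAS_MAX = 98.0    # Nml/h; above this rate the reactor is treated as approaching overload
PH_UNDERLOAD = 7.4   # above this pH the reactor is treated as underloaded

def read_biogas_rate():
    """Hypothetical stand-in for the on-line biogas flow meter (Nml/h)."""
    return 80.0

def read_ph():
    """Hypothetical stand-in for the pH probe."""
    return 7.5

def set_feed_pump(relative_rate):
    """Hypothetical stand-in for the substrate feed pump actuator (0.0-1.0)."""
    print(f"feed pump set to {relative_rate:.2f}")

def control_step(feed_rate, step=0.05):
    """One supervisory decision: back off when overload threatens, push when underloaded."""
    if read_biogas_rate() > BIOGAS_MAX:
        feed_rate = max(0.0, feed_rate - step)     # prevent overload
    elif read_ph() > PH_UNDERLOAD:
        feed_rate = min(1.0, feed_rate + step)     # prevent underload
    set_feed_pump(feed_rate)
    return feed_rate

if __name__ == "__main__":
    rate = 0.5
    for _ in range(3):          # a few supervisory cycles for illustration
        rate = control_step(rate)
        time.sleep(1)           # in practice this would be minutes between decisions
```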

  17. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating the house's electricity and providing a security system to detect the presence of unexpected behavior.

  18. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    Directory of Open Access Journals (Sweden)

    Villoutreix Bruno O

    2009-11-01

    Full Text Available Abstract Background Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches, in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated, since compounds are usually delivered in 1D SMILES, CANSMILES or 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS, which allows the generation of 3D conformations of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, which also generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that the generated structures are generally of equal quality to, and sometimes better than, structures obtained by the other tested methods.
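
    A similar distance-geometry-plus-force-field workflow can be reproduced with the open-source RDKit toolkit; note that this is not DG-AMMOS itself, and the MMFF94 force field here stands in for the AMMOS molecular mechanics step:

```python
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = "CC(=O)Oc1ccccc1C(=O)O"           # aspirin, used here only as an example input
mol = Chem.AddHs(Chem.MolFromSmiles(smiles))

# 1) Generate an initial 3D conformation by distance geometry.
if AllChem.EmbedMolecule(mol, randomSeed=42) != 0:
    raise RuntimeError("embedding failed")

# 2) Relax the geometry with a molecular mechanics force field (MMFF94).
AllChem.MMFFOptimizeMolecule(mol)

Chem.MolToMolFile(mol, "aspirin_3d.mol")    # write the 3D structure for screening
```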

  19. Always-optimally-coordinated candidate selection algorithm for peer-to-peer files sharing system in mobile self-organized networks

    Institute of Scientific and Technical Information of China (English)

    Li Xi; Ji Hong; Zheng Ruiming; Li Ting

    2009-01-01

    In order to improve the performance of peer-to-peer file sharing systems in mobile distributed environments, a novel always-optimally-coordinated (AOC) criterion and a corresponding candidate selection algorithm are proposed in this paper. Compared with the traditional min-hops criterion, the new approach introduces a fuzzy knowledge combination theory to investigate several important factors that influence file transfer success rate and efficiency. Whereas min-hops-based protocols only ask the nearest candidate peer for desired files, the selection algorithm based on AOC comprehensively considers users' preferences and network requirements with flexible balancing rules. Furthermore, its advantage also lies in its independence from specific resource discovery protocols, allowing for scalability. The simulation results show that when the AOC-based peer selection algorithm is used, system performance is much better than with the min-hops scheme, with the file transfer success rate improved by more than 50% and the transfer time reduced by at least 20%.

  20. THE METHOD OF FORMING THE PIGGYBACK TECHNOLOGIES USING THE AUTOMATED HEURISTIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Ye. Nahornyi

    2015-07-01

    Full Text Available In order to choose a rational piggyback technology, a method is proposed that improves the automated system by giving it a heuristic nature. The automated system is based on a set of methods, techniques and strategies aimed at creating optimal resource-saving technologies, which makes it possible to take into account, with maximum efficiency, the interests of all participants in the delivery process. When organizing piggyback traffic, coordination of operations between the piggyback traffic participants is presupposed in order to minimize the cargo travel time.

  1. Demo abstract: Flexhouse-2-an open source building automation platform with a focus on flexible control

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Kosek, Anna Magdalena; Svendsen, Mathias

    2014-01-01

    in or on buildings, and most of these resources will not be communicating directly with the smart grid; in order to allow internal coordination and optimization of resource use at building level, a building automation platform will act as an intermediary. Such a platform must be easy to adapt to the multitude......, an open-source implementation of a building automation system which has been designed with a strong focus on enabling the integration of the building into a smart power system and dedicated support for the requirements of an R&D environment. We will demonstrate the need for such a platform, discuss...

  2. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  3. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    Science.gov (United States)

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  4. Influence of advanced room -and building automation and optimized operation control on the energy efficiency of buildings; Einfluss moderner Raum- und Gebaeudeautomation und optimierter Betriebsfuehrung auf die Energieeffizienz von Gebaeuden

    Energy Technology Data Exchange (ETDEWEB)

    Knoll, P.; Peters, B.; Becker, M. [Hochschule Biberach (Germany). Fachgebiet Gebaeudeautomation

    2008-07-01

    There is an increasing awareness of the need to use our energy resources more efficiently, which also highlights the importance of energy-efficient building services and operation. Unfortunately, we often restrict ourselves to looking only at the costs of the investment itself instead of also taking into account the costs over the long period of building operation. This is particularly evident in investment decisions for room and building automation equipment. However, building automation and control systems (BACS) offer a high potential for energy savings in the ongoing operation of a building. Thus, in accordance with sustainable building design, it is extremely important to understand buildings in their entirety and to look at their building services in an integral way. This article discusses the energy-saving potential of building automation and control, and how this potential can be calculated and increased. Furthermore, the tools needed for optimized building operation management are presented. (orig.)

  5. Autonomous Optimal Coordination Scheme in Protection System of Power Distribution Network by Using Multi-Agent Concept

    Institute of Scientific and Technical Information of China (English)

    LEE Seung-Jae; KIM Tae-Wan; LEE Gi-Young

    2008-01-01

    A protection system using a multi-agent concept for power distribution networks is proposed. Every digital overcurrent relay (OCR) is developed as an agent by adding its own intelligence, self-tuning and communication ability. The main advantage of the multi-agent concept is that a group of agents work together to achieve a global goal which is beyond the ability of each individual agent. In order to cope with frequent changes in the network operating conditions and with faults, the OCR agent proposed in this paper is able to detect a fault or a change in the network and find its optimal protection parameters in an autonomous manner, considering information about the whole network obtained by communication with other agents. Through this kind of coordination and information exchange, not only a local but also a global protective scheme is completed. Simulations on a simple distribution network show the effectiveness of the proposed protection system.

  6. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    Directory of Open Access Journals (Sweden)

    Chung-Yuan Huang

    2014-01-01

    Full Text Available Immediate treatment with an automated external defibrillator (AED increases out-of-hospital cardiac arrest (OHCA patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.

  7. Full Design Automation of Multi-State RNA Devices to Program Gene Expression Using Energy-Based Optimization

    Science.gov (United States)

    Majer, Eszter; Daròs, José-Antonio; Jaramillo, Alfonso

    2013-01-01

    Small RNAs (sRNAs) can operate as regulatory agents to control protein expression by interaction with the 5′ untranslated region of the mRNA. We have developed a physicochemical framework, relying on base pair interaction energies, to design multi-state sRNA devices by solving an optimization problem with an objective function accounting for the stability of the transition and final intermolecular states. Contrary to the analysis of the reaction kinetics of an ensemble of sRNAs, we solve the inverse problem of finding sequences satisfying targeted reactions. We show here that our objective function correlates well with measured riboregulatory activity of a set of mutants. This has enabled the application of the methodology for an extended design of RNA devices with specified behavior, assuming different molecular interaction models based on Watson-Crick interaction. We designed several YES, NOT, AND, and OR logic gates, including the design of combinatorial riboregulators. In sum, our de novo approach provides a new paradigm in synthetic biology to design molecular interaction mechanisms facilitating future high-throughput functional sRNA design. PMID:23935479

  8. Sequential injection analysis for automation of the Winkler methodology, with real-time SIMPLEX optimization and shipboard application

    Energy Technology Data Exchange (ETDEWEB)

    Horstkotte, Burkhard; Tovar Sanchez, Antonio; Duarte, Carlos M. [Department of Global Change Research, IMEDEA (CSIC-UIB) Institut Mediterrani d' Estudis Avancats, Miquel Marques 21, 07190 Esporles (Spain); Cerda, Victor, E-mail: Victor.Cerda@uib.es [University of the Balearic Islands, Department of Chemistry Carreterra de Valldemossa km 7.5, 07011 Palma de Mallorca (Spain)

    2010-01-25

    A multipurpose analyzer system based on sequential injection analysis (SIA) for the determination of dissolved oxygen (DO) in seawater is presented. Three operation modes were established and successfully applied onboard during a research cruise in the Southern Ocean: 1st, in-line execution of the entire Winkler method including precipitation of manganese (II) hydroxide, fixation of DO, precipitate dissolution by confluent acidification, and spectrophotometric quantification of the generated iodine/tri-iodide (I{sub 2}/I{sub 3}{sup -}), 2nd, spectrophotometric quantification of I{sub 2}/I{sub 3}{sup -} in samples prepared according to the classical Winkler protocol, and 3rd, accurate batch-wise titration of I{sub 2}/I{sub 3}{sup -} with thiosulfate using one syringe pump of the analyzer as automatic burette. In the first mode, the zone stacking principle was applied to achieve high dispersion of the reagent solutions in the sample zone. Spectrophotometric detection was done at the isosbestic wavelength of I{sub 2}/I{sub 3}{sup -}, 466 nm. Highly reduced consumption of reagents and sample compared to the classical Winkler protocol, linear response up to 16 mg L{sup -1} DO, and an injection frequency of 30 per hour were achieved. It is noteworthy that for the offline protocol, sample metering and quantification with a potentiometric titrator generally take over 5 min, not counting sample fixation, incubation, and glassware cleaning. The modified SIMPLEX methodology was used for the simultaneous optimization of four volumetric and two chemical variables. Vertex calculation and subsequent application, including in-line preparation of one reagent, were carried out in real time using the software AutoAnalysis. The analytical system featured high signal stability, robustness, and a repeatability of 3% RSD (1st mode) and 0.8% (2nd mode) during shipboard application.
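
    The simplex-type optimization of the flow-method variables can be illustrated with a generic Nelder-Mead search (the study used a modified SIMPLEX within AutoAnalysis; the response surface below is a hypothetical placeholder for the measured analytical signal):

```python
import numpy as np
from scipy.optimize import minimize

def negative_response(x):
    """Hypothetical analytical response surface (e.g. peak absorbance at 466 nm) as a
    function of two reagent volumes and the flow rate; negated so it can be minimized."""
    v1, v2, flow = x
    signal = np.exp(-(v1 - 120.0) ** 2 / 5000.0 - (v2 - 80.0) ** 2 / 3000.0) / (1.0 + 0.01 * flow)
    return -signal

x0 = np.array([100.0, 60.0, 20.0])                  # starting volumes (uL) and flow rate (uL/s)
res = minimize(negative_response, x0, method="Nelder-Mead",
               options={"xatol": 1.0, "fatol": 1e-4})
print("optimized variables:", np.round(res.x, 1), "response:", -res.fun)
```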

  9. Automated Integrated Analog Filter Design Issues

    OpenAIRE

    2015-01-01

    An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Current automated analog circuit design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are only suited to gm-C and switched-current filter topologies. Here, an algorithm for active-RC integrated filter design that can be used in automated filter design is proposed. The algorithm is t...

  10. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials

    Science.gov (United States)

    Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-01-01

    Background Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. Objective This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Methods Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. Results The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although it was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, was being addressed. Conclusions Automated modeling tools can streamline the modeling process for dietary intervention trials
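
    The core of such a tool, finding food-group servings that satisfy macronutrient and energy constraints, reduces to a small constrained optimization. A minimal linear sketch (entirely hypothetical composition values, targets and bounds; the real tool used richer, partly nonlinear constraints and real food composition data):

```python
import numpy as np
from scipy.optimize import linprog

# Columns: grains, vegetables, fruit, dairy, meat (per serving, hypothetical values)
energy  = np.array([520, 100, 250, 550, 650])     # kJ per serving
protein = np.array([3.0, 2.0, 0.5, 9.0, 22.0])    # g per serving
fat     = np.array([1.0, 0.2, 0.3, 8.0, 8.0])     # g per serving

target_energy = 9000.0   # kJ/day
min_protein   = 90.0     # g/day
max_fat       = 70.0     # g/day

# Stand-in linear objective: minimize the total number of servings.
c = np.ones(5)

A_ub = np.vstack([-protein, fat])                 # protein >= min, fat <= max
b_ub = np.array([-min_protein, max_fat])
A_eq = energy.reshape(1, -1)                      # hit the energy target exactly
b_eq = np.array([target_energy])
bounds = [(1, 12)] * 5                            # plausible serving ranges per group

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print("servings per group:", np.round(res.x, 1))
```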

  11. Towards black-box calculations of tunneling splittings obtained from vibrational structure methods based on normal coordinates.

    Science.gov (United States)

    Neff, Michael; Rauhut, Guntram

    2014-02-01

    Multidimensional potential energy surfaces obtained from explicitly correlated coupled-cluster calculations and further corrections for high-order correlation contributions, scalar relativistic effects and core-correlation energy contributions were generated in a fully automated fashion for the double-minimum benchmark systems OH3(+) and NH3. The black-box generation of the potentials is based on normal coordinates, which were used in the underlying multimode expansions of the potentials and the μ-tensor within the Watson operator. Normal coordinates are not the optimal choice for describing double-minimum potentials and the question remains if they can be used for accurate calculations at all. However, their unique definition is an appealing feature, which removes remaining errors in truncated potential expansions arising from different choices of curvilinear coordinate systems. Fully automated calculations are presented, which demonstrate, that the proposed scheme allows for the determination of energy levels and tunneling splittings as a routine application.

  12. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies provide the library with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines. Library automation is used for collection, storage, administration, processing, preservation, communication, etc.

  13. Metabolic flux ratio analysis and multi-objective optimization revealed a globally conserved and coordinated metabolic response of E. coli to paraquat-induced oxidative stress.

    Science.gov (United States)

    Shen, Tie; Rui, Bin; Zhou, Hong; Zhang, Ximing; Yi, Yin; Wen, Han; Zheng, Haoran; Wu, Jihui; Shi, Yunyu

    2013-01-27

    The ability of a microorganism to adapt to changes in the environment, such as in nutrient or oxygen availability, is essential for its competitive fitness and survival. The cellular objective and the strategy of the metabolic response to an extreme environment are therefore of tremendous interest and, thus, have been increasingly explored. However, the cellular objective of the complex regulatory structure of the metabolic changes has not yet been fully elucidated and more details regarding the quantitative behaviour of the metabolic flux redistribution are required to understand the systems-wide biological significance of this response. In this study, the intracellular metabolic flux ratios involved in the central carbon metabolism were determined by fractional (13)C-labeling and metabolic flux ratio analysis (MetaFoR) of the wild-type E. coli strain JM101 at an oxidative environment in a chemostat. We observed a significant increase in the flux through phosphoenolpyruvate carboxykinase (PEPCK), phosphoenolpyruvate carboxylase (PEPC), malic enzyme (MEZ) and serine hydroxymethyltransferase (SHMT). We applied an ε-constraint based multi-objective optimization to investigate the trade-off relationships between the biomass yield and the generation of reductive power using the in silico iJR904 genome-scale model of E. coli K-12. The theoretical metabolic redistribution supports that the trans-hydrogenase pathway should not play a direct role in the defence mounted by E. coli against oxidative stress. The agreement between the measured ratio and the theoretical redistribution established the significance of NADPH synthesis as the goal of the metabolic reprogramming that occurs in response to oxidative stress. Our work presents a framework that combines metabolic flux ratio analysis and multi-objective optimization to investigate the metabolic trade-offs that occur under varied environmental conditions. Our results led to the proposal that the metabolic response of E
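
    The ε-constraint construction used for the trade-off analysis can be stated generically as follows (a schematic flux-balance form, with v the flux vector and S the stoichiometric matrix of the genome-scale model; the paper's exact objectives are paraphrased):

```latex
\max_{v}\; f_{\mathrm{NADPH}}(v)
\quad \text{s.t.} \quad
S\,v = 0, \qquad
v^{\min} \le v \le v^{\max}, \qquad
f_{\mathrm{biomass}}(v) \ge \varepsilon ,
```

    with the Pareto front traced out by sweeping ε over the attainable range of the biomass objective.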

  14. Automating Workflow using Dialectical Argumentation

    NARCIS (Netherlands)

    Urovi, Visara; Bromuri, Stefano; McGinnis, Jarred; Stathis, Kostas; Omicini, Andrea

    2008-01-01

    This paper presents a multi-agent framework based on argumentative agent technology for the automation of the workflow selection and execution. In this framework, workflow selection is coordinated by agent interactions governed by the rules of a dialogue game whose purpose is to evaluate the workflo

  15. Efficient automated one-step synthesis of 2-[{sup 18}F]fluoroethylcholine for clinical imaging: optimized reaction conditions and improved quality controls of different synthetic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Asti, Mattia [Nuclear Medicine Department, Santa Maria Nuova Hospital, Reggio Emilia (Italy)], E-mail: asti.mattia@asmn.re.it; Farioli, Daniela; Iori, Michele; Guidotti, Claudio; Versari, Annibale; Salvo, Diana [Nuclear Medicine Department, Santa Maria Nuova Hospital, Reggio Emilia (Italy)

    2010-04-15

    [{sup 18}F]-labelled choline analogues, such as 2-[{sup 18}F]fluoroethylcholine ({sup 18}FECH), have been suggested to be a new class of choline derivatives highly useful for the imaging of prostate and brain tumours. In fact, tumour cells with an enhanced proliferation rate usually exhibit improved choline uptake due to increased membrane phospholipid biosynthesis. The aim of this study was the development of a high-yielding synthesis of {sup 18}FECH. The possibility of shortening the synthesis time by reacting all the reagents in a convenient and rapid one-step reaction was especially considered. Methods: {sup 18}FECH was synthesized by reacting [{sup 18}F]fluoride with 1,2-bis(tosyloxy)ethane and N,N-dimethylaminoethanol. The synthesis was carried out using both a one- and a two-step reaction in order to compare the two procedures. The effects on the radiochemical yield and purity of using different [{sup 18}F]fluoride phase transfer catalysts, reagent amounts and purification methods were assessed. Quality controls on the final products were performed by means of radio-thin-layer chromatography, gas chromatography and high-performance liquid chromatography equipped with conductimetric, ultraviolet and radiometric detectors. Results: Under the optimized experimental conditions, {sup 18}FECH was synthesized with a radiochemical yield of 43{+-}3% and 48{+-}1% (not corrected for decay) when the two-step or the one-step approach was used, respectively. The radiochemical purity was higher than 99% regardless of the synthetic pathway or purification method adopted. The main chemical impurity was due to N,N-dimethylmorpholinium. The identity of this impurity in {sup 18}FECH preparations was not previously reported. Conclusion: An improved two-step and an innovative one-step reaction for synthesizing {sup 18}FECH in high yield were reported. The adaptation of a multistep synthesis to a single-step process opens further possibilities for simpler and more

  16. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  17. The Estimation and Optimization of the Returns-Risk Coordination Interval of Commercial Banks

    Institute of Scientific and Technical Information of China (English)

    高艳平; 李立新

    2015-01-01

    Based on data from 16 listed banks from 2002 to 2013, a dynamic panel instrumental-variable model and a multi-objective optimization method, this paper studies the coordination between the risks and returns of China's commercial banks. Firstly, by comprehensively measuring commercial bank returns and risks using a comprehensive evaluation method and global principal component analysis of time series, the paper constructs returns and risks indices and establishes index models for both. Secondly, the intrinsic factors of the returns and risks models are controlled to obtain the upper limit of the risks and the lower limit of the returns. Then, external macroeconomic and financial environment factors are added to re-estimate the returns and risks models; on the basis of these two models, the returns under the risk constraint and the risks under the returns constraint are obtained. Finally, the paper measures the corresponding optimal returns and the coordination interval between efficiency and risks for China's commercial banks, and reaches the following conclusion: the relative efficiency range, with risks included, is [0.86, 1.15]; outside this range, banks may need to pay the price of increased risk.

  18. Dual Coordinate Descent Method for Solving the AUC Optimization Problem

    Institute of Scientific and Technical Information of China (English)

    姜纪远; 陶卿; 高乾坤; 储德军

    2014-01-01

    AUC is widely used as a performance measure for imbalanced classification problems. The AUC loss is a pairwise function of two instances from different classes, which is clearly different from the loss in standard binary classification, and how to improve its practical convergence speed is an interesting problem. Recent studies show that an online method (OAM) using the reservoir sampling technique performs well, but it has shortcomings such as a slow convergence rate and difficult parameter selection. This paper presents a systematic investigation of solving the AUC optimization problem with dual coordinate descent methods (AUC-DCD). It presents three algorithms, AUC-SDCD, AUC-SDCDperm and AUC-MSGD, where the first two depend on the size of the training set while the last does not. Theoretical analysis shows that OAM is a special case of AUC-DCD. Experimental results show that AUC-DCD is better than OAM in both AUC performance and convergence rate, which suggests AUC-DCD as a preferred scheme for efficiently solving AUC optimization problems.
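
    To make the pairwise structure of the AUC loss concrete, the following toy subgradient sketch minimizes a pairwise hinge surrogate of the AUC risk for a linear scorer; it illustrates the loss structure only and is not the AUC-DCD algorithms of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced data: 200 negatives, 20 positives, 5 features.
X_neg = rng.normal(0.0, 1.0, size=(200, 5))
X_pos = rng.normal(0.8, 1.0, size=(20, 5))
n_pairs = len(X_pos) * len(X_neg)

def empirical_auc(w):
    s_pos, s_neg = X_pos @ w, X_neg @ w
    return np.mean(s_pos[:, None] > s_neg[None, :])   # fraction of correctly ranked pairs

w, eta, lam = np.zeros(5), 0.05, 1e-3
for epoch in range(300):
    s_pos, s_neg = X_pos @ w, X_neg @ w
    viol = (s_pos[:, None] - s_neg[None, :]) < 1.0     # pairs violating the ranking margin
    # Subgradient of the averaged pairwise hinge loss plus L2 regularization.
    grad = lam * w + (viol.sum(axis=0) @ X_neg - viol.sum(axis=1) @ X_pos) / n_pairs
    w -= eta * grad

print("empirical AUC of the linear scorer:", round(empirical_auc(w), 3))
```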

  19. Optimism

    Science.gov (United States)

    Carver, Charles S.; Scheier, Michael F.; Segerstrom, Suzanne C.

    2010-01-01

    Optimism is an individual difference variable that reflects the extent to which people hold generalized favorable expectancies for their future. Higher levels of optimism have been related prospectively to better subjective well-being in times of adversity or difficulty (i.e., controlling for previous well-being). Consistent with such findings, optimism has been linked to higher levels of engagement coping and lower levels of avoidance, or disengagement, coping. There is evidence that optimism is associated with taking proactive steps to protect one's health, whereas pessimism is associated with health-damaging behaviors. Consistent with such findings, optimism is also related to indicators of better physical health. The energetic, task-focused approach that optimists take to goals also relates to benefits in the socioeconomic world. Some evidence suggests that optimism relates to more persistence in educational efforts and to higher later income. Optimists also appear to fare better than pessimists in relationships. Although there are instances in which optimism fails to convey an advantage, and instances in which it may convey a disadvantage, those instances are relatively rare. In sum, the behavioral patterns of optimists appear to provide models of living for others to learn from. PMID:20170998

  20. Sensitivity analysis approach to multibody systems described by natural coordinates

    Science.gov (United States)

    Li, Xiufeng; Wang, Yabin

    2014-03-01

    The classical natural coordinate modeling method which removes the Euler angles and Euler parameters from the governing equations is particularly suitable for the sensitivity analysis and optimization of multibody systems. However, the formulation has so many principles in choosing the generalized coordinates that it hinders the implementation of modeling automation. A first order direct sensitivity analysis approach to multibody systems formulated with novel natural coordinates is presented. Firstly, a new selection method for natural coordinate is developed. The method introduces 12 coordinates to describe the position and orientation of a spatial object. On the basis of the proposed natural coordinates, rigid constraint conditions, the basic constraint elements as well as the initial conditions for the governing equations are derived. Considering the characteristics of the governing equations, the newly proposed generalized-α integration method is used and the corresponding algorithm flowchart is discussed. The objective function, the detailed analysis process of first order direct sensitivity analysis and related solving strategy are provided based on the previous modeling system. Finally, in order to verify the validity and accuracy of the method presented, the sensitivity analysis of a planar spinner-slider mechanism and a spatial crank-slider mechanism are conducted. The test results agree well with that of the finite difference method, and the maximum absolute deviation of the results is less than 3%. The proposed approach is not only convenient for automatic modeling, but also helpful for the reduction of the complexity of sensitivity analysis, which provides a practical and effective way to obtain sensitivity for the optimization problems of multibody systems.

  1. Poisson Coordinates.

    Science.gov (United States)

    Li, Xian-Ying; Hu, Shi-Min

    2013-02-01

    Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
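
    For reference, the Poisson integral formula on the unit disk, on which the construction is based, recovers a harmonic function u from its boundary values f:

```latex
u(z) \;=\; \frac{1}{2\pi}\int_{0}^{2\pi}\frac{1-|z|^{2}}{\left|e^{i\theta}-z\right|^{2}}\, f\!\left(e^{i\theta}\right)\,\mathrm{d}\theta , \qquad |z| < 1 .
```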

  2. Design automation, languages, and simulations

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    As the complexity of electronic systems continues to increase, the micro-electronic industry depends upon automation and simulations to adapt quickly to market changes and new technologies. Compiled from chapters contributed to CRC's best-selling VLSI Handbook, this volume covers a broad range of topics relevant to design automation, languages, and simulations. These include a collaborative framework that coordinates distributed design activities through the Internet, an overview of the Verilog hardware description language and its use in a design environment, hardware/software co-design, syst

  3. Automated analysis of images acquired with electronic portal imaging device during delivery of quality assurance plans for inversely optimized arc therapy

    DEFF Research Database (Denmark)

    Fredh, Anna; Korreman, Stine; Rosenschöld, Per Munck af

    2010-01-01

    This work presents an automated method for comprehensively analyzing EPID images acquired for quality assurance of RapidArc treatment delivery. In-house-developed software has been used for the analysis, and long-term results from measurements on three linacs are presented.

  4. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  5. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  6. Automation in biological crystallization

    Science.gov (United States)

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  7. Application of an automation system and a supervisory control and data acquisition (SCADA) system for the optimal operation of a membrane adsorption hybrid system.

    Science.gov (United States)

    Smith, P J; Vigneswaran, S; Ngo, H H; Nguyen, H T; Ben-Aim, R

    2006-01-01

    The application of automation and supervisory control and data acquisition (SCADA) systems to municipal water and wastewater treatment plants is rapidly increasing. However, the application of these systems is less frequent in the research and development phases of emerging treatment technologies used in these industries. This study involved the implementation of automation and a SCADA system to the submerged membrane adsorption hybrid system for use in a semi-pilot scale research project. An incremental approach was used in the development of the automation and SCADA systems, leading to the development of two new control systems. The first system developed involved closed loop control of the backwash initiation, based upon a pressure increase, leading to productivity improvements as the backwash is only activated when required, not at a fixed time. This system resulted in a 40% reduction in the number of backwashes required and also enabled optimised operations under unsteady concentrations of wastewater. The second system developed involved closed loop control of the backwash duration, whereby the backwash was terminated when the pressure reached a steady state. This system resulted in a reduction of the duration of the backwash of up to 25% and enabled optimised operations as the foulant build-up within the reactor increased.
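    The two control rules described above (start a backwash on a set trans-membrane pressure rise, stop it once the pressure stops changing) can be expressed as a short control loop. The sketch below is illustrative only: the sensor/actuator callbacks and both threshold values are hypothetical placeholders, not parameters from the study.

```python
import time

PRESSURE_RISE_TRIGGER = 5.0   # kPa above the post-backwash baseline (illustrative value)
STEADY_STATE_DELTA = 0.1      # kPa change per sample regarded as "steady" (illustrative value)

def backwash_control_loop(read_pressure, start_backwash, stop_backwash, period_s=10):
    """Closed-loop backwash control; all three callbacks are hypothetical placeholders."""
    baseline = read_pressure()
    while True:
        if read_pressure() - baseline > PRESSURE_RISE_TRIGGER:    # rule 1: trigger on pressure rise
            start_backwash()
            prev = read_pressure()
            while True:
                time.sleep(period_s)
                cur = read_pressure()
                if abs(cur - prev) < STEADY_STATE_DELTA:          # rule 2: stop at steady state
                    break
                prev = cur
            stop_backwash()
            baseline = read_pressure()                            # new baseline after cleaning
        time.sleep(period_s)
```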

  8. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the use and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a quality control system for heating three floors of a house and, with that, increase the efficiency of the heating devices and lower heating expenses. A heat pump, a furnace, a boiler pump, two floor-heating pumps and two radiator pumps need to be controlled by this system. For the hardware, we chose an STM32F4-Discovery development kit with five temperature sensors, an LCD disp...

  9. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based automated process control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  10. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  11. Rhythms of dialogue in infancy: coordinated timing in development.

    Science.gov (United States)

    Jaffe, J; Beebe, B; Feldstein, S; Crown, C L; Jasnow, M D

    2001-01-01

    Although theories of early social development emphasize the advantage of mother-infant rhythmic coupling and bidirectional coordination, empirical demonstrations remain sparse. We therefore test the hypothesis that vocal rhythm coordination at age 4 months predicts attachment and cognition at age 12 months. Partner and site novelty were studied by recording mother-infant, stranger-infant, and mother-stranger face-to-face interactions in both home and laboratory sites for 88 4-month-old infants, for a total of 410 recordings. An automated dialogic coding scheme, appropriate to the nonperiodic rhythms of our data, implemented a systems concept of every action as jointly produced by both partners. Adult-infant coordination at age 4 months indeed predicted both outcomes at age 12 months, but midrange degree of mother-infant and stranger-infant coordination was optimal for attachment (Strange Situation), whereas high ("tight") stranger-infant coordination in the lab was optimal for cognition (Bayley Scales). Thus, high coordination can index more or less optimal outcomes, as a function of outcome measure, partner, and site. Bidirectional coordination patterns were salient in both attachment and cognition predictions. Comparison of mother-infant and stranger-infant interactions was particularly informative, suggesting the dynamics of infants' early differentiation from mothers. Stranger and infant showed different patterns of vocal rhythm activity level, were more bidirectional, accounted for 8 times more variance in Bayley scores, predicted attachment just as well as mother and infant, and revealed more varied contingency structures and a wider range of attachment outcomes. To explain why vocal timing measures at age 4 months predict outcomes at age 12 months, our dialogue model was construed as containing procedures for regulating the pragmatics of proto-conversation. The timing patterns of the 4-month-olds were seen as procedural or performance knowledge, and as

  12. Automated Integrated Analog Filter Design Issues

    Directory of Open Access Journals (Sweden)

    Karolis Kiela

    2015-07-01

    Full Text Available An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Current automated analog circuit design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gmC and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is tested by designing an integrated active RC filter in a 65 nm CMOS technology.

  13. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  14. Slotting allocation optimization of automated warehouse based on firefly algorithm

    Institute of Scientific and Technical Information of China (English)

    朱靖; 章瑶易; 贺青

    2016-01-01

    To solve the slot allocation optimization problem in an automated instrument warehouse system and improve the efficiency of the storage system, this paper proposes a warehouse slot optimization method based on the firefly algorithm. The method takes the put-away (storage) time of goods as the objective function and uses a minimum-time criterion to achieve the optimal slot allocation. A practical case study verifies the effectiveness of the firefly algorithm for slot optimization in an automated three-dimensional warehouse.
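    As an illustrative aside, the sketch below shows a generic firefly algorithm of the kind referred to above, applied to a toy stand-in for a minimum-storage-time objective. The population size, step parameters and the travel-time vector are all assumptions made for illustration, not values from the paper.

```python
import numpy as np

def firefly_minimize(objective, dim, bounds, n_fireflies=25, n_iter=200,
                     alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    """Minimal firefly algorithm: brighter (lower-cost) fireflies attract dimmer ones."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_fireflies, dim))
    cost = np.array([objective(p) for p in pos])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:  # j is brighter, so i moves toward j
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * (rng.random(dim) - 0.5)
                    pos[i] = np.clip(pos[i], lo, hi)
                    cost[i] = objective(pos[i])
    best = np.argmin(cost)
    return pos[best], cost[best]

# Toy stand-in for a "minimum storage time" objective: weighted travel time to each slot.
travel_time = np.array([3.0, 5.0, 2.0, 7.0])
obj = lambda w: float(np.dot(travel_time, w ** 2))
print(firefly_minimize(obj, dim=4, bounds=(0.0, 1.0)))
```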

  15. A Multidisciplinary Optimization Mode Based on Rule and Comprehensive Coordination

    Institute of Scientific and Technical Information of China (English)

    刘杰; 金梦

    2011-01-01

    Against the background of aircraft multidisciplinary design optimization, a rule-based and comprehensively coordinated multidisciplinary design optimization mode for aircraft is established by evolving and discretizing the GSE single-level optimization algorithm and by introducing a multi-enterprise collaborative design platform and an optimization rule base. The mode does not require an explicit objective function; instead, a so-called "regulation vector" is introduced so that the whole multidisciplinary optimization process can be carried out under a unified computational scheme, guided by rules and experience, through the collaborative design platform.

  16. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

    Overview From a technical perspective, CMS has been in “beam operation” state since 6th November. The detector is fully closed with all components operational and the magnetic field is normally at the nominal 3.8T. The UXC cavern is normally closed with the radiation veto set. Access to UXC is now only possible during downtimes of LHC. Such accesses must be carefully planned, documented and carried out in agreement with CMS Technical Coordination, Experimental Area Management, LHC programme coordination and the CCC. Material flow in and out of UXC is now strictly controlled. Access to USC remains possible at any time, although, for safety reasons, it is necessary to register with the shift crew in the control room before going down. It is obligatory for all material leaving UXC to pass through the underground buffer zone for RP scanning, database entry and appropriate labeling for traceability. Technical coordination (notably Stephane Bally and Christoph Schaefer), the shift crew and run ...

  17. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  18. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level in industrial rare-earth extraction processes results in high production cost, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare-earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for the output component, and the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, are discussed. The system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  19. Optimized Design of CORDIC Algorithm for Coordinate Conversion of Medical Ultrasonic Image

    Institute of Scientific and Technical Information of China (English)

    郁道银; 李妍; 李明; 汪毅; 陈晓冬

    2011-01-01

    A modified coordinate calibration coordinate rotation digital computer (MCC-CORDIC) algorithm for coordinate conversion is introduced into a medical ultrasonic endoscopic imaging system. Based on the conventional CORDIC algorithm, the Cartesian coordinates are re-calibrated to save FPGA logic element resources. Before the coordinate conversion starts, each Cartesian coordinate is mapped onto the first quadrant by a simple mathematical transformation; as a result, the problem that the convergence angle range of the CORDIC algorithm cannot fully cover [0, 360°] is solved. The data bit width and the scale factor correction method are optimized to improve the accuracy of the algorithm, and a pipelined FPGA implementation allows the algorithm to meet the real-time requirements of the ultrasonic imaging system. The MCC-CORDIC algorithm was validated by both simulation and a real-time ultrasonic imaging experiment on the FPGA: the polar angle error is reduced from 0.0063 rad to 0.0005 rad and the polar radius error from 0.082 to 0.03.
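    For readers unfamiliar with CORDIC, the sketch below is a floating-point reference model of vectoring-mode CORDIC with first-quadrant folding and scale-factor correction, the general technique this record builds on. It is not the paper's fixed-point FPGA pipeline, and the iteration count is an arbitrary choice.

```python
import math

def cordic_cart_to_polar(x, y, n_iter=16):
    """Vectoring-mode CORDIC: drive y to zero while accumulating the rotation angle."""
    # Fold the point into the first quadrant first (the quadrant-mapping idea from the
    # abstract); the original quadrant is recovered from the stored signs afterwards.
    sx, sy = (1 if x >= 0 else -1), (1 if y >= 0 else -1)
    vx, vy = abs(x), abs(y)
    z = 0.0
    gain = 1.0
    for i in range(n_iter):
        d = 1.0 if vy > 0 else -1.0
        vx, vy = vx + d * vy * 2.0 ** -i, vy - d * vx * 2.0 ** -i
        z += d * math.atan(2.0 ** -i)
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))
    r = vx / gain                       # scale-factor correction
    theta = z
    # Undo the quadrant folding so theta lies in [0, 2*pi).
    if sx < 0 and sy >= 0:
        theta = math.pi - theta
    elif sx < 0 and sy < 0:
        theta = math.pi + theta
    elif sx >= 0 and sy < 0:
        theta = 2 * math.pi - theta
    return r, theta

print(cordic_cart_to_polar(3.0, 4.0))   # roughly (5.0, 0.927 rad)
```

    In a fixed-point implementation each iteration needs only shifts and adds, which is why the algorithm maps well onto a pipelined FPGA.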

  20. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

    2010-01-01

    Operational Experience At the end of the first full-year running period of LHC, CMS is established as a reliable, robust and mature experiment. In particular common systems and infrastructure faults accounted for <0.6 % CMS downtime during LHC pp physics. Technical operation throughout the entire year was rather smooth, the main faults requiring UXC access being sub-detector power systems and rack-cooling turbines. All such problems were corrected during scheduled technical stops, in the shadow of tunnel access needed by the LHC, or in negotiated accesses or access extensions. Nevertheless, the number of necessary accesses to the UXC averaged more than one per week and the technical stops were inevitably packed with work packages, typically 30 being executed within a few days, placing a high load on the coordination and area management teams. It is an appropriate moment for CMS Technical Coordination to thank all those in many CERN departments and in the Collaboration, who were involved in CMS techni...

  1. Automated curved planar reformation of 3D spine images

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo [University of Ljubljana, Faculty of Electrical Engineering, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2005-10-07

    Traditional techniques for visualizing anatomical structures are based on planar cross-sections from volume images, such as images obtained by computed tomography (CT) or magnetic resonance imaging (MRI). However, planar cross-sections taken in the coordinate system of the 3D image often do not provide sufficient or qualitative enough diagnostic information, because planar cross-sections cannot follow curved anatomical structures (e.g. arteries, colon, spine, etc). Therefore, not all of the important details can be shown simultaneously in any planar cross-section. To overcome this problem, reformatted images in the coordinate system of the inspected structure must be created. This operation is usually referred to as curved planar reformation (CPR). In this paper we propose an automated method for CPR of 3D spine images, which is based on the image transformation from the standard image-based to a novel spine-based coordinate system. The axes of the proposed spine-based coordinate system are determined on the curve that represents the vertebral column, and the rotation of the vertebrae around the spine curve, both of which are described by polynomial models. The optimal polynomial parameters are obtained in an image analysis based optimization framework. The proposed method was qualitatively and quantitatively evaluated on five CT spine images. The method performed well on both normal and pathological cases and was consistent with manually obtained ground truth data. The proposed spine-based CPR benefits from reduced structural complexity in favour of improved feature perception of the spine. The reformatted images are diagnostically valuable and enable easier navigation, manipulation and orientation in 3D space. Moreover, reformatted images may prove useful for segmentation and other image analysis tasks.

  2. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  3. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  4. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, the demand for customized products and services on one side, and the need for a constructive dialogue with customers, immediate and flexible response, and measurable investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  5. RUN COORDINATION

    CERN Multimedia

    C. Delaere

    2012-01-01

    With the analysis of the first 5 fb⁻¹ culminating in the announcement of the observation of a new particle with mass of around 126 GeV/c², the CERN directorate decided to extend the LHC run until February 2013. This adds three months to the original schedule. Since then the LHC has continued to perform extremely well, and the total luminosity delivered so far this year is 22 fb⁻¹. CMS also continues to perform excellently, recording data with efficiency higher than 95% for fills with the magnetic field at nominal value. The highest instantaneous luminosity achieved by LHC to date is 7.6×10³³ cm⁻²s⁻¹, which translates into 35 interactions per crossing. On the CMS side there has been a lot of work to handle these extreme conditions, such as a new DAQ computer farm and trigger menus to handle the pile-up, automation of recovery procedures to minimise the lost luminosity, better training for the shift crews, etc. We did suffer from a couple of infrastructure ...

  6. Coordination Capacity

    CERN Document Server

    Cuff, Paul; Cover, Thomas

    2009-01-01

    We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates between the nodes, we ask what is the set of all achievable joint distributions p(x1, ..., xm) of actions at the nodes on the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.

  7. 24-hour comprehensive reactive power coordination and optimization based on the particle swarm algorithm

    Institute of Scientific and Technical Information of China (English)

    陈兰芝; 王克文

    2016-01-01

    For 24-hour coordinated reactive power optimization in a power system, the particle swarm optimization algorithm and the penalty function method are used, with all inequality constraint equations brought into the original objective function as penalty terms. The optimization goal is to minimize the total economic cost over the whole day, and the optimization proceeds in two stages, static optimization and comprehensive optimization. Based on online load forecasting, the switching states of the shunt capacitor banks and the transformer tap positions are determined for each of the 24 hours. Applying the particle swarm algorithm to this multi-objective reactive power optimization problem effectively reduces active power losses and the cost of reactive power compensation, while offering good convergence performance, fast convergence and good stability.
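    The core recipe above, fold the inequality constraints into the objective as penalty terms and minimize with particle swarm optimization, can be sketched generically as below. The toy objective and constraint stand in for the daily-cost function and network limits; all parameter values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def pso_penalty(objective, constraints, dim, bounds, n_particles=30, n_iter=300,
                w=0.7, c1=1.5, c2=1.5, rho=1e3, seed=1):
    """Minimal PSO; inequality constraints g(x) <= 0 are folded into the cost as penalties."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    def penalized(x):
        viol = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + rho * viol
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pcost = pos.copy(), np.array([penalized(p) for p in pos])
    g_idx = np.argmin(pcost)
    gbest, gcost = pbest[g_idx].copy(), pcost[g_idx]
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([penalized(p) for p in pos])
        better = cost < pcost
        pbest[better], pcost[better] = pos[better], cost[better]
        if pcost.min() < gcost:
            g_idx = np.argmin(pcost)
            gbest, gcost = pbest[g_idx].copy(), pcost[g_idx]
    return gbest, gcost

# Toy stand-in: minimize "losses" x1^2 + x2^2 subject to x1 + x2 >= 1 (i.e. 1 - x1 - x2 <= 0).
best, cost = pso_penalty(lambda x: x[0]**2 + x[1]**2,
                         [lambda x: 1.0 - x[0] - x[1]],
                         dim=2, bounds=(-2.0, 2.0))
print(best, cost)   # expect roughly (0.5, 0.5)
```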

  8. RUN COORDINATION

    CERN Multimedia

    C. Delaere

    2013-01-01

    Since the LHC ceased operations in February, a lot has been going on at Point 5, and Run Coordination continues to monitor closely the advance of maintenance and upgrade activities. In the last months, the Pixel detector was extracted and is now stored in the pixel lab in SX5; the beam pipe has been removed and ME1/1 removal has started. We regained access to the vactank and some work on the RBX of HB has started. Since mid-June, electricity and cooling are back in S1 and S2, allowing us to turn equipment back on, at least during the day. 24/7 shifts are not foreseen in the next weeks, and safety tours are mandatory to keep equipment on overnight, but re-commissioning activities are slowly being resumed. Given the (slight) delays accumulated in LS1, it was decided to merge the two global runs initially foreseen into a single exercise during the week of 4 November 2013. The aim of the global run is to check that we can run (parts of) CMS after several months switched off, with the new VME PCs installed, th...

  9. RUN COORDINATION

    CERN Multimedia

    Christophe Delaere

    2013-01-01

    The focus of Run Coordination during LS1 is to monitor closely the advance of maintenance and upgrade activities, to smooth interactions between subsystems and to ensure that all are ready in time to resume operations in 2015 with a fully calibrated and understood detector. After electricity and cooling were restored to all equipment, at about the time of the last CMS week, recommissioning activities were resumed for all subsystems. On 7 October, DCS shifts began 24/7 to allow subsystems to remain on to facilitate operations. That culminated with the Global Run in November (GriN), which   took place as scheduled during the week of 4 November. The GriN has been the first centrally managed operation since the beginning of LS1, and involved all subdetectors but the Pixel Tracker presently in a lab upstairs. All nights were therefore dedicated to long stable runs with as many subdetectors as possible. Among the many achievements in that week, three items may be highlighted. First, the Strip...

  10. Coordinated Optimization of Electric Vehicles and Renewable Energy

    Institute of Scientific and Technical Information of China (English)

    周杨; 王东华

    2016-01-01

    For the multi-objective coordinated dispatching of electric vehicles and renewable energy, this paper establishes a multi-objective coordinated control model that takes minimum load fluctuation of the distribution network, minimum network loss and minimum charging cost for electric vehicle users as objective functions. A quantum particle swarm optimization (PSO) multi-objective search algorithm is used for the solution, yielding a reasonable number of grid-connected electric vehicles at each time step. Simulation experiments on the IEEE 33-node distribution system indicate that using the battery energy storage of electric vehicles as a buffer between the grid and renewable energy can reduce the impact on the grid of intermittent renewable generation and of the random grid connection of electric vehicles, and promotes the maximisation of benefits on both the grid side and the user side.

  11. Exploration on the Coordination and Optimization in the Chain of Procurement and Supply of Hospital Medical Equipment

    Institute of Scientific and Technical Information of China (English)

    汤黎明; 吴敏; 王卫; 钟添萍

    2011-01-01

    Through an analysis of the supply link of medical equipment suppliers, and a comparison of the requirements and objectives of medical equipment purchasers and suppliers, we establish the feasibility and the necessity of coordinating and optimizing the procurement and supply chain.

  12. Chinese geodetic coordinate system 2000

    Institute of Scientific and Technical Information of China (English)

    YANG YuanXi

    2009-01-01

    The basic strategies in establishing the Chinese geodetic coordinate system 2000 are summarized, including the definition of the coordinate system, the structure of the terrestrial reference frame, the functional and stochastic models involved in the realization of the reference frame, and the improvements of the adjustment procedures. First, the fundamental frame of the coordinate system is composed of the permanent GPS tracking network in China, which is integrated with the international GPS service stations by combined adjustment in order to guarantee consistency between the international terrestrial reference system and the Chinese geodetic coordinate system. Second, the extended frame of the coordinate system is composed of the unified 2000' national GPS network, which integrates 6 nationwide GPS networks with more than 2500 stations under the control of the fundamental frame. Third, the densified frame is composed of the national astronomical geodetic network with nearly 50 thousand stations, which was updated by combined adjustment with the 2000' national GPS network; thus the datum of the national astronomical geodetic network has been unified and its precision greatly improved. By the optimal data fusion method, the influences of datum errors, systematic errors and outliers in the separate geodetic networks are weakened in the unified Chinese geodetic coordinate frame. The significance of applying the new geodetic coordinate system and the remaining problems in the reference frame are described and analyzed.

  13. Optimization and Coordination in Shipping Service Supply Chain Based on Freight Option

    Institute of Scientific and Technical Information of China (English)

    路遥; 汪传旭

    2011-01-01

    Under the influence of global political and economic factors, international shipping freight rates fluctuate continually, and investors actively seek financial hedging instruments to control the risk brought by sharp rises in freight rates. By introducing the concept of the "freight option" and applying this financial derivative to the shipping service market, the paper proposes a revenue optimization model for shippers, studies the optimal decisions of shippers and of the whole shipping service supply chain, and designs a coordination mechanism for the service supply chain to optimize the integrated revenue. The results are verified by a numerical example, which shows that, after hedging with freight options, an optimal revenue decision exists for shippers in the shipping market, and that the integrated revenue of the coordinated supply chain exceeds the sum of the revenues obtained under separate decision-making.

  14. Multi-Agent Systems for Transportation Planning and Coordination

    NARCIS (Netherlands)

    J.M. Moonen (Hans)

    2009-01-01

    textabstractMany transportation problems are in fact coordination problems: problems that require communication, coordination and negotiation to be optimally solved. However, most software systems targeted at transportation have never approached it this way, and have instead concentrated on centrali

  15. Value-based distribution feeder automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Teng, Jen-Hao [Department of Electrical Engineering, I-Shou University, No. 1, Section 1, Syuecheng Road, Dashu Township, Kaohsiung 840, Taiwan (Taiwan); Lu, Chan-Nan [Department of Electrical Engineering, National Sun Yat-Sen University, Kaohsiung, Taiwan (Taiwan)

    2006-03-15

    In a competitive electric energy market, service quality and reliability are two of the essential issues of the business. Distribution automation has been chosen by many utilities around the world as one of the most reliable measures for reducing outage time in their distribution networks. Considering network reliability data and customer interruption costs, a value-based planning method is proposed in this paper to find the optimal numbers and locations of switches in feeder automation systems. The proposed method takes reliability costs, maintenance and investment costs into account to obtain a feeder automation plan that has a maximum benefit and a proper system reliability requirement. Three stages are involved in the search for the optimal solution. Using minimal feeder data and assuming equally distributed feeder loads, stage 1 gives initial estimates of reliability indices and costs, and benefit/cost ratios of different feeder automation options. To gain the maximum benefits from feeder automation with reasonable costs, the results of stage 1 are used to select the feeders with the highest priorities for automation. Stage 2 determines the optimum locations of switches that minimize feeder outage costs, and in stage 3, the best locations for tie switches in the automated network are determined. The numerical processing procedure is described, and the solution efficiency and results are compared with those obtained from a genetic algorithm. (author)

  16. New coordinate optimization method for non-smooth losses based on alternating direction method of multipliers

    Institute of Scientific and Technical Information of China (English)

    高乾坤; 王玉军; 王惊晓

    2013-01-01

    The alternating direction method of multipliers (ADMM) already has practical applications in machine learning. To handle large-scale data and convex optimization problems with non-smooth losses, the original batch ADMM algorithm is improved by means of the mirror descent method, and a new coordinate optimization algorithm is proposed for solving non-smooth loss convex optimization. The new algorithm is simple to implement and computationally efficient. A detailed theoretical analysis verifies its convergence and shows that it attains the currently optimal convergence rate in the general convex case. Finally, experimental comparisons with state-of-the-art algorithms demonstrate that it achieves a faster convergence rate while preserving the sparsity of the solution.
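    As a generic illustration of ADMM on a non-smooth convex problem (here the lasso, whose L1 term is non-smooth), the sketch below shows the standard batch ADMM splitting with a scaled dual variable; it is not the mirror-descent coordinate variant proposed in the record, and the data are synthetic.

```python
import numpy as np

def lasso_admm(A, b, lam=0.1, rho=1.0, n_iter=200):
    """Plain ADMM for the lasso: minimize 0.5*||Ax - b||^2 + lam*||z||_1 s.t. x = z."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u is the scaled dual variable
    AtA = A.T @ A; Atb = A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))       # factor once, reuse every iteration
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    for _ in range(n_iter):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))  # x-update (quadratic)
        z = soft(x + u, lam / rho)                                         # z-update (prox of L1)
        u = u + x - z                                                      # dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10)); x_true = np.zeros(10); x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(lasso_admm(A, b), 3))
```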

  17. Effects of automation of information-processing functions on teamwork.

    Science.gov (United States)

    Wright, Melanie C; Kaber, David B

    2005-01-01

    We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  18. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  19. Automated acoustic matrix deposition for MALDI sample preparation.

    Science.gov (United States)

    Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M

    2006-02-01

    Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, for example, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of conditions affecting protein extraction and matrix crystallization with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 microm in diameter were obtained and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility was found to be better than that obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.

  20. An Optimization Approach Based on Multiple Time-Step Coordination for Decision Making of Unit Restoration

    Institute of Scientific and Technical Information of China (English)

    顾雪平; 刘文轩; 王佳裕; 贾京华

    2016-01-01

    To accelerate the restoration of a power system after a blackout, it is necessary to restore as many thermal units as possible within their hot-start intervals by determining a reasonable unit start-up sequence. By analyzing the serial and parallel restoration patterns of units and considering the effect of the hot-start time limit on the restoration sequence, a multiple time-step coordinative optimization approach for decision making on unit restoration is proposed. Adopting a global optimization strategy based on time-step modeling and multiple time-step coordination, a bi-level, time-step-based optimization model is established in which the total capacity of the hot-started units and the total generated energy of the units over the whole restoration process are the upper-level and lower-level objective functions, respectively. A multiple population genetic algorithm (MPGA) based on the optimal selection of states is adopted for the solution, achieving global optimization of the unit restoration decisions. By properly controlling the number of schemes retained at each time step, the diversity of schemes is guaranteed and the solution is kept from being trapped in local optima, while the number of states per time step has a controllable upper bound, keeping the computational burden within an acceptable range. Case studies on the New England 10-generator 39-bus system and the actual southern Hebei power grid verify the effectiveness of the proposed multiple time-step coordinative optimization approach.

  1. Optimization and validation of an automated DHS-TD-GC-MS method for the determination of aromatic esters in sweet wines.

    Science.gov (United States)

    Marquez, Ana; Serratosa, Maria P; Merida, Julieta; Zea, Luis; Moyano, Lourdes

    2014-06-01

    A dynamic headspace sorptive extraction (DHS) method combined with thermal desorption (TD) and coupled with gas chromatography-mass spectrometry (GC/MS) was developed for the determination of 11 esters which contribute to the fruity aroma in sweet wines. A full factorial (4 factors, 2 levels) experimental design was used to optimize the extraction conditions, and the results were evaluated by multiple linear regression (MLR) and principal component analysis (PCA). The esters showed optimal extraction using an extraction temperature of 30°C for 20 min, with a subsequent purge volume of 300 mL and dry volume of 50 mL. Quantification was achieved using calibration curves constructed for each ester, with linear regression equations having correlation coefficients (R(2)) ranging from 0.9894 to 0.9981. The proposed method was successfully validated and showed good intermediate precision, repeatability and accuracy for all the monitored compounds. Finally, the method was applied to quantify esters with fruity aromatic notes in sweet white and red wines elaborated with different winemaking processes.
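    A two-level, four-factor full factorial design such as the one described (extraction temperature, extraction time, purge volume, dry volume) has 2^4 = 16 runs and can be generated and fitted by multiple linear regression in a few lines. The factor labels and response values below are synthetic, for illustration only; they are not the paper's data.

```python
import itertools
import numpy as np

# All 2^4 = 16 runs of a two-level, four-factor full factorial design (coded -1/+1),
# followed by an ordinary-least-squares fit of the main effects.
factors = ["temperature", "time", "purge_volume", "dry_volume"]
design = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)

rng = np.random.default_rng(0)
response = 10 + 2.0 * design[:, 0] + 0.5 * design[:, 1] + rng.normal(0, 0.2, len(design))

X = np.column_stack([np.ones(len(design)), design])      # intercept + main effects
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, c in zip(["intercept"] + factors, coef):
    print(f"{name:>12s}: {c:+.2f}")
```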

  2. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    Science.gov (United States)

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel, and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top ranking targets for large-scale purification. In the establishment of our pipeline, emphasis was put on streamlining the processes such that it can be easily but not necessarily automatized. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis driven laboratory in a manual or robot-assisted manner.

  3. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.

  4. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author’s experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet’s thoughts about and approaches to the problem of automation and its current state are taken and examined, especially that referring to the problem’s relationship with reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  5. Constraints Adjustment and Objectives Coordination of Satisfying Optimal Control Applied to Heavy Oil Fractionators

    Institute of Scientific and Technical Information of China (English)

    邹涛; 李少远

    2005-01-01

    In this paper, the feasibility and objectives coordination of real-time optimization (RTO) are systematically investigated under soft constraints. The reason for requiring soft constraint adjustment and objective relaxation simultaneously is that the result is not satisfactory when the feasible region is far from the desired working point or when the optimization problem is infeasible. A mixed logic method is introduced to describe the priority of the constraints and objectives, so that soft constraint adjustment and objectives coordination are solved together in RTO. A case study on the Shell heavy oil fractionator benchmark problem illustrating the method is finally presented.
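    The basic idea of soft-constraint adjustment, relaxing a constraint through a slack variable that is penalized in the objective so it is used only when the nominal problem is infeasible or unattractive, can be shown with a toy linear program. This is only the slack-penalty idea; it does not reproduce the mixed-logic priority formulation of the record, and all coefficients are made up.

```python
from scipy.optimize import linprog

# Toy RTO-style LP: maximize profit 3*x1 + 2*x2 with one soft limit and one hard limit.
# Variables: [x1, x2, s], where s is the slack on the soft constraint.
c = [-3.0, -2.0, 50.0]              # negate profit for minimization; 50 penalizes the slack
A_ub = [[1.0, 1.0, -1.0],           # x1 + x2 - s <= 4   (soft capacity/quality limit)
        [2.0, 1.0,  0.0]]           # 2*x1 + x2 <= 10    (hard constraint)
b_ub = [4.0, 10.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, -res.fun)              # slack stays at 0 because relaxing is not worth the penalty
```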

  6. Motor coordination: a local hub for coordination.

    Science.gov (United States)

    Calabrese, Ronald L

    2014-03-31

    A local interneuron of a crayfish central pattern generator serves as a hub that integrates ascending and descending coordinating information and passes it on to a local oscillatory microcircuit to coordinate a series of segmental appendages known as swimmerets.

  7. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  8. Fast Solution for Optimal Coordinated Voltage Control Using Multifrontal Method

    Institute of Scientific and Technical Information of China (English)

    郑文杰; 刘明波

    2011-01-01

    A differential-algebraic equation optimization model is used to describe the optimal coordinated voltage control problem in the long-term voltage stability scenario. This dynamic optimization problem is converted into a large-scale nonlinear programming model using the Radau collocation method, and the nonlinear primal-dual interior-point method is then applied to solve it. The paper focuses on applying the multifrontal method, combined with approximate minimum degree ordering, to enhance the efficiency of solving the sparse linear correction equations. The IEEE 17-generator 162-bus test system and the New England 10-generator 39-bus system are used to verify the computational advantage of the proposed method, by comparison with triangular decomposition under other ordering methods such as approximate minimum degree and reverse Cuthill-McKee.
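    The computational point above, factor the sparse correction matrix once under a fill-reducing ordering and reuse the factors across interior-point steps, can be illustrated with SciPy. Two hedges apply: SciPy's SuperLU is a supernodal rather than a multifrontal solver, and COLAMD rather than approximate minimum degree is used here, but the fill-reducing-ordering idea is the same; the matrix below is a random stand-in for the correction equations, not a power-system case.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2000
rng = np.random.default_rng(0)

# Sparse, diagonally dominated test matrix standing in for the KKT correction equations.
A = sp.random(n, n, density=5.0 / n, random_state=0, format="csc")
A = (A + A.T + sp.eye(n) * 10.0).tocsc()
b = rng.standard_normal(n)

# Factor once with a fill-reducing column ordering, then reuse the factors for many
# right-hand sides, as an interior-point method does across correction steps.
lu = spla.splu(A, permc_spec="COLAMD")
x = lu.solve(b)
print(np.linalg.norm(A @ x - b))      # residual should be near machine precision
```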

  9. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  10. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  11. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  12. EPOS for Coordination of Asynchronous Sensor Webs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop, integrate, and deploy software-based tools to coordinate asynchronous, distributed missions and optimize observation planning spanning simultaneous...

  13. Optimizing Installation and Operation Properties of an AUV-Mounted Swath Sonar Sensor for Automated Marine Gas Seep Detection - a Modelling Approach

    Science.gov (United States)

    Wenau, S.; Fei, T.; Tóth, Z.; Keil, H.; Spiess, V.; Kraus, D.

    2014-12-01

    The detection of gas bubble streams in the water column by single- and multibeam sonars has been a common procedure in the research of marine seep sites. In the framework of the development of an AUV capable of automatic detection and sampling of gas bubble streams, such acoustic flares were modelled in MATLAB routines to assess the optimal sonar configuration for flare detection. The AUV development (IMGAM-project) is carried out as a cooperation of the company ATLAS Hydrographic and the MARUM at the University of Bremen. The combination of sensor inclination, sonar carrier frequency and pulse characteristics affect the ability of the system to detect bubble streams of different sizes and intensities. These variations in acoustic signal return from gas bubble streams depending on acquisition parameters can affect the detectability and acoustic properties of recorded acoustic flares in various seepage areas in the world's oceans. We show several examples of acoustic signatures of previously defined bubble streams under varying acquisition parameters and document the effects of changing sensor parameters on detection efficiency.

  14. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  15. Coordinated optimization between regional and provincial grids for regional generation right trade

    Institute of Scientific and Technical Information of China (English)

    于琪; 张晶; 王宣元; 张粒子

    2013-01-01

    Two modes of coordinated optimization between regional and provincial grids are proposed for regional generation right trading: a layered (regional/provincial) optimization mode and an integrated regional optimization mode. Their organizational forms are introduced, their features and conditions of applicability are qualitatively analyzed, their economic mechanisms are examined for the cases with and without transmission congestion, and their market efficiencies are compared. The following conclusions are reached: with transmission congestion, the total social welfare of the integrated optimization mode is greater than or equal to that of the layered optimization mode, while without transmission congestion it is strictly greater; in the initial stage of building a regional generation right trading market, the layered optimization mode should be adopted to realize the transition of generation right trading from provincial to regional markets; and the integrated optimization mode, which avoids the efficiency loss caused by coordination between regional and provincial grids, should be adopted to continuously improve the regional generation right trading market. A numerical example verifies the correctness of these conclusions.

  16. Advances in Automation and Robotics

    CERN Document Server

    International conference on Automation and Robotics ICAR2011

    2012-01-01

    The international conference on Automation and Robotics-ICAR2011 is held during December 12-13, 2011 in Dubai, UAE. The proceedings of ICAR2011 have been published by Springer Lecture Notes in Electrical Engineering, which include 163 excellent papers selected from more than 400 submitted papers.   The conference is intended to bring together the researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, production and management.   This part of proceedings includes 81 papers contributed by many researchers in relevant topic areas covered at ICAR2011 from various countries such as France, Japan, USA, Korea and China etc.     Many papers introduced their advanced research work recently; some of them gave a new solution to problems in the field, with powerful evidence and detail demonstration. Others stated the application of their designed and...

  17. Coordinated Optimal Dispatching of Distributed Generation Based on Quantum Differential Evolution Algorithm

    Institute of Scientific and Technical Information of China (English)

    刘自发; 刘刚; 刘幸

    2013-01-01

    For the economic operation of a distributed generation (DG) system considering demand response, a comprehensive optimization model is established that accounts for fuel cost, operation and management cost, the cost of power exchange with the grid, the compensation cost for the curtailment of interruptible loads, and the electricity cost on the demand side. To enable effective interaction of energy, a demand response model is incorporated into the optimization model. A quantum differential evolution (QDE) algorithm is proposed to solve the model. Based on the idea of differential evolution, exploiting the parallelism and collapse properties of quantum computation, and considering the probabilistic nature of the quantum bit in the selection strategy, the proposed algorithm has strong robustness and global search ability. Calculation results for a microgrid containing different kinds of DGs show that the established coordinated optimal dispatching model and the proposed algorithm are reasonable and effective.
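    The quantum differential evolution algorithm in the record layers qubit-style probabilistic encoding on top of classical differential evolution. The sketch below shows only the classical DE/rand/1/bin loop it builds on, applied to a toy two-unit dispatch cost with a demand-balance penalty; it is not the QDE algorithm itself, and all parameters and coefficients are illustrative assumptions.

```python
import numpy as np

def differential_evolution(objective, dim, bounds, n_pop=30, n_gen=200,
                           F=0.6, CR=0.9, seed=2):
    """Classical DE/rand/1/bin: mutate with a scaled difference vector, crossover, select."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (n_pop, dim))
    cost = np.array([objective(p) for p in pop])
    for _ in range(n_gen):
        for i in range(n_pop):
            r1, r2, r3 = pop[rng.choice([k for k in range(n_pop) if k != i], 3, replace=False)]
            mutant = np.clip(r1 + F * (r2 - r3), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # keep at least one gene from the mutant
            trial = np.where(cross, mutant, pop[i])
            t_cost = objective(trial)
            if t_cost < cost[i]:
                pop[i], cost[i] = trial, t_cost
    best = np.argmin(cost)
    return pop[best], cost[best]

# Toy dispatch-style objective: quadratic fuel costs for two units meeting 1.0 p.u. demand,
# with any imbalance penalized.
cost_fn = lambda p: 0.5 * p[0]**2 + 0.8 * p[1]**2 + 100.0 * (p.sum() - 1.0)**2
print(differential_evolution(cost_fn, dim=2, bounds=(0.0, 1.0)))
```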

  18. 圆环坐标系下自供能转换器球冠型拓扑的建模与优化%Modeling and Optimization of Spherical Cap Topology for Energy Scavenging Converters in Toroidal Coordinate Space

    Institute of Scientific and Technical Information of China (English)

    黄金鑫; 张黎; 于春辉; 李庆民; Martin D.Judd; W.H.Siew

    2012-01-01

The topology design and optimization of the capacitive energy scavenging converter directly determine the performance of the whole self-powered device. The optimization of energy harvesting efficiency can be translated into optimization of the topology parameters under a volume restriction. An improved converter topology with a spherical cap is proposed, and the corresponding analytical model is established based on the method of separation of variables in the toroidal coordinate system. The concept of the energy increment factor is defined, and its expression is derived in terms of the ratio of the spherical cap opening radius to the sphere radius. For spherical cap topologies of different dimensions, the measured values of the energy increment factor agree well with their theoretical counterparts, which verifies the validity of the proposed analytical model for the spherical cap converter topology. The results provide a theoretical basis for the optimal design of energy scavenging converters in self-powered devices.

  19. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  20. Limitations of Radar Coordinates

    OpenAIRE

    Bini, Donato; Lusanna, Luca; Mashhoon, Bahram

    2004-01-01

    The construction of a radar coordinate system about the world line of an observer is discussed. Radar coordinates for a hyperbolic observer as well as a uniformly rotating observer are described in detail. The utility of the notion of radar distance and the admissibility of radar coordinates are investigated. Our results provide a critical assessment of the physical significance of radar coordinates.

  1. Automated Change Detection for Synthetic Aperture Sonar

    Science.gov (United States)

    2014-01-01

B. D. Van Veen, "Canonical coordinates are the right coordinates for low-rank Gauss-Gauss detection and estimation," IEEE Trans. Signal Process. 54...features between overlapping images; sub-pixel co-registration to improve phase coherence; and finally, change detection utilizing canonical correlation...over time scales ranging from hours through several days. Keywords: automated change detection, canonical correlation analysis, coherent change detection
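Canonical correlation analysis is the core statistic named in the keywords above. The sketch below is a generic numpy implementation of CCA between feature vectors extracted from two co-registered sonar passes; it is not the authors' pipeline, and the synthetic data are only meant to show that coherent (unchanged) scenes yield high canonical correlations.

```python
# Minimal canonical correlation analysis (CCA) sketch between two co-registered
# image patches, as one ingredient of coherent change detection.  Generic numpy
# implementation; not the processing chain used in the paper.
import numpy as np

def cca(X, Y, reg=1e-6):
    """Return canonical correlations between row-wise observations X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    # Whiten each block; the singular values of the resulting coherence matrix
    # are the canonical correlations.
    inv_chol_t = lambda S: np.linalg.inv(np.linalg.cholesky(S)).T
    K = inv_chol_t(Sxx).T @ Sxy @ inv_chol_t(Syy)
    return np.linalg.svd(K, compute_uv=False)

rng = np.random.default_rng(0)
ref = rng.normal(size=(500, 5))                       # pixels x features, reference pass
rep = 0.8 * ref + 0.2 * rng.normal(size=(500, 5))     # repeat pass, partly coherent
print(cca(ref, rep))   # values near 1 -> coherent (unchanged) scene
```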

  2. Control coordination abilities in shock combat sports

    Directory of Open Access Journals (Sweden)

    Natalya Boychenko

    2014-12-01

Full Text Available Purpose: to optimize the monitoring of the level of coordination abilities in martial arts. Material and Methods: analysis and compilation of scientific and methodological literature, interviews with coaches of striking combat sports, video analysis of techniques, teacher observations. Results: specific types of coordination abilities in striking combat sports were identified, and specific and nonspecific tests were selected and proposed to monitor the level of these coordination abilities in athletes. Conclusion: it is determined that, in order to achieve victory in a bout, martial artists must orient themselves in space and be able to assess and manage the dynamic and spatio-temporal parameters of movements, maintain balance, and display highly coordinated movements. The proposed tests for monitoring specific coordination abilities allow an objective assessment not only of the overall level of coordination, but also of the level of specific manifestations of this ability.

  3. Reaction Coordinates and Mechanistic Hypothesis Tests.

    Science.gov (United States)

    Peters, Baron

    2016-05-27

    Reaction coordinates are integral to several classic rate theories that can (a) predict kinetic trends across conditions and homologous reactions, (b) extract activation parameters with a clear physical interpretation from experimental rates, and (c) enable efficient calculations of free energy barriers and rates. New trajectory-based rare events methods can provide rates directly from dynamical trajectories without a reaction coordinate. Trajectory-based frameworks can also generate ideal (but abstract) reaction coordinates such as committors and eigenfunctions of the master equation. However, rates and mechanistic insights obtained from trajectory-based methods and abstract coordinates are not readily generalized across simulation conditions or reaction families. We discuss methods for identifying physically meaningful reaction coordinates, including committor analysis, variational transition state theory, Kramers-Langer-Berezhkovskii-Szabo theory, and statistical inference methods that can use path sampling data to screen, mix, and optimize thousands of trial coordinates. Special focus is given to likelihood maximization and inertial likelihood maximization approaches.
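The likelihood-maximization idea mentioned above can be illustrated with a toy sketch: fit a sigmoidal model of the transition-path probability p(TP|q) to shooting-point outcomes and rank trial coordinates by the maximized log-likelihood. The functional form used here, p(TP|q) = p0 (1 - tanh^2((q - q0)/dq)), is a commonly used choice; the data below are synthetic and the code is not taken from the cited work.

```python
# Toy sketch of likelihood maximization for screening trial reaction coordinates:
# fit p(TP | q) to shooting-point outcomes and rank candidate coordinates by their
# maximized log-likelihood.  Synthetic data, illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 400
good = rng.normal(size=n)                        # a coordinate that controls the dynamics
poor = rng.normal(size=n)                        # an irrelevant coordinate
p_tp = 0.4 * (1.0 - np.tanh(good / 0.5) ** 2)    # true transition-path probability
outcome = rng.random(n) < p_tp                   # True if the shot was a transition path

def neg_log_likelihood(theta, q, tp):
    p0, q0, dq = theta
    dq = abs(dq) + 1e-6                          # keep the width strictly positive
    p = np.clip(p0 * (1.0 - np.tanh((q - q0) / dq) ** 2), 1e-9, 1 - 1e-9)
    return -(np.log(p[tp]).sum() + np.log(1.0 - p[~tp]).sum())

def score(q, tp):
    res = minimize(neg_log_likelihood, x0=[0.3, 0.0, 1.0], args=(q, tp),
                   method="Nelder-Mead")
    return -res.fun                              # maximized log-likelihood

print("good coordinate:", score(good, outcome))
print("poor coordinate:", score(poor, outcome))  # should be clearly lower
```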

  4. Reaction Coordinates and Mechanistic Hypothesis Tests

    Science.gov (United States)

    Peters, Baron

    2016-05-01

    Reaction coordinates are integral to several classic rate theories that can (a) predict kinetic trends across conditions and homologous reactions, (b) extract activation parameters with a clear physical interpretation from experimental rates, and (c) enable efficient calculations of free energy barriers and rates. New trajectory-based rare events methods can provide rates directly from dynamical trajectories without a reaction coordinate. Trajectory-based frameworks can also generate ideal (but abstract) reaction coordinates such as committors and eigenfunctions of the master equation. However, rates and mechanistic insights obtained from trajectory-based methods and abstract coordinates are not readily generalized across simulation conditions or reaction families. We discuss methods for identifying physically meaningful reaction coordinates, including committor analysis, variational transition state theory, Kramers-Langer-Berezhkovskii-Szabo theory, and statistical inference methods that can use path sampling data to screen, mix, and optimize thousands of trial coordinates. Special focus is given to likelihood maximization and inertial likelihood maximization approaches.

  5. Alternative DNA base pairing through metal coordination.

    Science.gov (United States)

    Clever, Guido H; Shionoya, Mitsuhiko

    2012-01-01

    Base-pairing in the naturally occurring DNA and RNA oligonucleotide duplexes is based on π-stacking, hydrogen bonding, and shape complementarity between the nucleobases adenine, thymine, guanine, and cytosine as well as on the hydrophobic-hydrophilic balance in aqueous media. This complex system of multiple supramolecular interactions is the product of a long-term evolutionary process and thus highly optimized to serve its biological functions such as information storage and processing. After the successful implementation of automated DNA synthesis, chemists have begun to introduce artificial modifications inside the core of the DNA double helix in order to study various aspects of base pairing, generate new base pairs orthogonal to the natural ones, and equip the biopolymer with entirely new functions. The idea to replace the hydrogen bonding interactions with metal coordination between ligand-like nucleosides and suitable transition metal ions culminated in the development of a plethora of artificial base-pairing systems termed "metal base-pairs" which were shown to strongly enhance the DNA duplex stability. Furthermore, they show great potential for the use of DNA as a molecular wire in nanoscale electronic architectures. Although single electrons have proven to be transmitted by natural DNA over a distance of several base pairs, the high ohmic resistance of unmodified oligonucleotides was identified as a serious obstacle. By exchanging some or all of the Watson-Crick base pairs in DNA with metal complexes, this problem may be solved. In the future, these research efforts are supposed to lead to DNA-like materials with superior conductivity for nano-electronic applications. Other fields of potential application such as DNA-based supramolecular architecture and catalysis may be strongly influenced by these developments as well. This text is meant to illustrate the basic concepts of metal-base pairing and give an outline over recent developments in this field.

  6. Optimal Real-time Dispatch for Integrated Energy Systems

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Guerrero, Josep M.; Rahimi-Kian, Ashkan

    2016-01-01

With the emergence of small-scale integrated energy systems (IESs), there are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level distributed energy resources (DERs). By integrating DSM and DERs into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DERs management. The optimal dispatch problem is formulated as a mixed integer nonlinear programming problem (MINLP)...

  7. 协凋控制系统直接能量平衡控制策略的优化应用%Optimization Application of Coordinated Controlling System Direct Energy Balance Controlling Strategy

    Institute of Scientific and Technical Information of China (English)

    王永涛; 杨保; 张海涛; 乔岩涛

    2015-01-01

This paper describes the basic principles and characteristics of the direct energy balance (DEB) coordinated control strategy, analyzes the problems encountered with this control strategy at China Resources Power Henan Gucheng Co., Ltd., and presents corresponding solutions. The solutions include: optimized application of intelligent sliding pressure during load changes; optimized feedforward application in the DEB coordinated control system; and optimized application of the boiler air-flow control system. After implementation, the scheme ensures safer and more stable unit operation and improves the economic efficiency of unit operation.

  8. Job scheduling optimization in multi-shuttle automated storage and retrieval system%多载具自动化存取系统作业调度优化

    Institute of Scientific and Technical Information of China (English)

    杨朋; 缪立新; 秦磊

    2013-01-01

To improve the operational efficiency of a multi-shuttle automated storage and retrieval system (AS/RS) from the perspective of job scheduling, a mathematical model for job scheduling optimization was established according to the operational characteristics of multi-shuttle AS/RS. The job scheduling problem was proved to be NP-hard after complexity analysis, and a genetic simulated annealing algorithm was developed to solve it. The performance of the proposed algorithm was demonstrated by numerical examples, and the results show that the algorithm has good solution precision and efficiency, and effectively reduces the travel time of performing storage and retrieval operations.
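A minimal sketch of the annealing half of such a scheme is given below: plain simulated annealing over a permutation of storage/retrieval jobs with a toy Chebyshev travel-time model. The crane kinematics, job data and cooling schedule are assumptions, and the paper's hybrid genetic operators are omitted.

```python
# Generic simulated-annealing sketch for sequencing storage/retrieval jobs so that
# total travel time is small.  The travel-time model and parameters are
# illustrative only; the paper combines this idea with a genetic algorithm.
import math, random

random.seed(0)
jobs = [(random.randint(1, 30), random.randint(1, 10)) for _ in range(20)]  # (bay, tier)

def travel_time(order):
    x, y, t = 0, 0, 0.0
    for j in order:
        bx, by = jobs[j]
        t += max(abs(bx - x), abs(by - y))   # Chebyshev travel: both axes move at once
        x, y = bx, by
    return t

def anneal(n_iter=20000, t0=50.0, alpha=0.9995):
    order = list(range(len(jobs)))
    best = cur = travel_time(order)
    best_order, temp = order[:], t0
    for _ in range(n_iter):
        i, k = sorted(random.sample(range(len(jobs)), 2))
        cand = order[:i] + order[i:k + 1][::-1] + order[k + 1:]   # 2-opt style reversal
        c = travel_time(cand)
        if c < cur or random.random() < math.exp((cur - c) / temp):
            order, cur = cand, c
            if c < best:
                best, best_order = c, cand[:]
        temp *= alpha
    return best, best_order

print(anneal()[0])
```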

  9. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  10. Coordination control of distributed systems

    CERN Document Server

    Villa, Tiziano

    2015-01-01

    This book describes how control of distributed systems can be advanced by an integration of control, communication, and computation. The global control objectives are met by judicious combinations of local and nonlocal observations taking advantage of various forms of communication exchanges between distributed controllers. Control architectures are considered according to  increasing degrees of cooperation of local controllers:  fully distributed or decentralized control,  control with communication between controllers,  coordination control, and multilevel control.  The book covers also topics bridging computer science, communication, and control, like communication for control of networks, average consensus for distributed systems, and modeling and verification of discrete and of hybrid systems. Examples and case studies are introduced in the first part of the text and developed throughout the book. They include: control of underwater vehicles, automated-guided vehicles on a container terminal, contro...

  11. Optimization Analysis of Lattice Store Coordination Based on Revenue Sharing Contract%基于供应链收益共享契约的格子铺经营机制优化分析

    Institute of Scientific and Technical Information of China (English)

    胡本勇; 陈旭

    2012-01-01

The lattice store business mode (LSBM) has attracted increased attention in theory and practice. Many problems, such as poor management and low occupancy rates of cells, have emerged with the rapid development of LSBM. Solutions such as increasing the number of sales staff, improving brand advertising efforts, and improving the rent collection mode (actually, a revenue sharing mode) can break the development bottleneck of lattice stores and help the lattice store business develop healthily and rapidly. Academic research on LSBM is rare. LSBM mainly involves two important aspects: promotion (advertising) and revenue sharing. Promotion (advertising) can improve the management and brand awareness of lattice stores, thereby improving customer demand. Revenue sharing can increase the flexibility of cooperation, which is beneficial to the development of the lattice store. Furthermore, the cooperation between the owner and the renter has supply chain characteristics. From the perspective of the supply chain, this paper analyzes the influence of a revenue sharing contract on the lattice store. The paper first uses a production function to depict the stimulating effect of brand building and promotion effort on demand. Second, we construct a decision-making model of the supply chain. Our analyses show the importance of cooperation and derive the optimal brand building costs, promotional costs, and revenue sharing ratio under centralized and decentralized decision-making systems. This paper also discusses the impact of the revenue sharing mechanism on the lattice store. Our research results show that a simple revenue sharing mechanism can neither realize the lattice store's optimal performance nor improve the cooperation in LSBM. The paper therefore discusses the reason why coordination cannot be achieved and proposes a revised policy. The policy specifies that the shop's rent should correspond to the owner's cost of brand building and promotion. The policy

  12. A Coordinated Optimization Method for System-wide Power Supply-demand Balancing%全网统筹电力电量平衡协调优化方法

    Institute of Scientific and Technical Information of China (English)

    赖晓文; 钟海旺; 杨军峰; 夏清

    2015-01-01

Large-scale interconnection of power systems has provided the possibility of better allocation of energy resources over a wide range. This paper proposes a novel state-province coordinated pattern, model and method for power supply-demand balancing over the whole system. Balancing decision-making is performed based on the historical contribution of provincial grids, so as to stimulate power exchanges. Two power supply-demand balancing schemes are proposed: to minimize generation costs if power demand can be met, or otherwise, to achieve temporal and spatial equilibrium when facing power shortage. By taking the equivalent generation cost curves of provincial grids as the key optimizing information, coordinated optimization models are developed to optimize the power exchange schedule between provincial grids and the generation schedule of units, resulting in better allocation of energy resources. An algorithm is proposed to precisely take into account the extra cost due to limited energy output from thermal units. Computational results of a three-area IEEE RTS-96 system and a large-scale real power system are presented to verify the effectiveness of the method.
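The role of equivalent generation cost curves as the key coordination information can be illustrated with a tiny two-province example: given quadratic equivalent cost curves and a tie-line limit, choose the exchange that minimizes total cost. All curve coefficients, demands and limits below are invented, and the sketch is far simpler than the paper's model.

```python
# Tiny two-province illustration of coordination via equivalent generation cost
# curves: choose the tie-line exchange that minimizes total cost subject to an
# exchange limit.  Quadratic cost curves and all numbers are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

# Equivalent cost curves c(p) = a*p^2 + b*p for each provincial grid (p in MW).
a = np.array([0.010, 0.016])
b = np.array([20.0, 28.0])
demand = np.array([900.0, 700.0])     # provincial demands (MW)
tie_limit = 250.0                     # maximum power exchange on the tie line (MW)

def total_cost(exchange):
    # exchange > 0 means province 0 exports to province 1
    p = demand + np.array([exchange, -exchange])
    return float(np.sum(a * p ** 2 + b * p))

res = minimize_scalar(total_cost, bounds=(-tie_limit, tie_limit), method="bounded")
print("optimal exchange (MW):", res.x, " total cost:", res.fun)
print("no-exchange cost:", total_cost(0.0))
```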

  13. Wind Farm Coordinated Control for Power Optimization

    Institute of Scientific and Technical Information of China (English)

    SHU Jin; HAO Zhiguo; ZHANG Baohui; BO Zhiqian

    2011-01-01

The total wind energy capture decreases due to the aerodynamic interaction among turbines known as the wake effect, and the conventional maximum power point tracking (MPPT) schemes for individual wind turbine generators (WTGs) cannot maximize the total farm power.

  14. A Process Algebra for Supervisory Coordination

    CERN Document Server

    Baeten, Jos; van Hulst, Allan; Markovski, Jasen; 10.4204/EPTCS.60.3

    2011-01-01

    A supervisory controller controls and coordinates the behavior of different components of a complex machine by observing their discrete behaviour. Supervisory control theory studies automated synthesis of controller models, known as supervisors, based on formal models of the machine components and a formalization of the requirements. Subsequently, code generation can be used to implement this supervisor in software, on a PLC, or embedded microprocessor. In this article, we take a closer look at the control loop that couples the supervisory controller and the machine. We model both event-based and state-based observations using process algebra and bisimulation-based semantics. The main application area of supervisory control that we consider is coordination, referred to as supervisory coordination, and we give an academic and an industrial example, discussing the process-theoretic concepts employed.

  15. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    Science.gov (United States)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
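A minimal sketch of the underlying simple-temporal-network machinery is shown below: constraints of the form lo <= t_j - t_i <= hi are encoded as a distance graph and tightened with Floyd-Warshall, which also detects inconsistency. This illustrates the general technique only; it is not the NASA execution system, and the example activity windows are invented.

```python
# Minimal simple-temporal-network (STN) sketch: represent constraints
# lo <= t_j - t_i <= hi as distance-graph edges and compute the all-pairs
# minimal network with Floyd-Warshall.  A negative diagonal entry means the
# constraints are inconsistent.  Toy example, not the habitat-automation code.
INF = float("inf")

def minimal_network(n, constraints):
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lo, hi in constraints:        # lo <= t_j - t_i <= hi
        d[i][j] = min(d[i][j], hi)
        d[j][i] = min(d[j][i], -lo)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    if any(d[i][i] < 0 for i in range(n)):
        raise ValueError("temporal constraints are inconsistent")
    return d

# events: 0 = reference start, 1 = begin activity, 2 = end activity
constraints = [(0, 1, 5, 10),    # activity must start 5-10 min after the reference
               (1, 2, 30, 40),   # activity lasts 30-40 min
               (0, 2, 0, 45)]    # everything must finish within 45 min
d = minimal_network(3, constraints)
print("tightened window for activity end:", -d[2][0], "to", d[0][2])
```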

  16. Embedding Temporal Constraints for Coordinated Execution in Habitat Automation

    Data.gov (United States)

    National Aeronautics and Space Administration — Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be...

  17. Automated Support for Rapid Coordination of Joint UUV Operation

    Science.gov (United States)

    2015-03-01

APFs) were used to drive the autonomous vehicles toward a desired goal, or end state. The use of APFs also enabled autonomous formation control...Airlines flight 370 using its Bluefin-21 autonomous unmanned underwater vehicle (UUV). In total, it conducted 270 hours of in-water time and covered...same time period. Ideally, a coalition of countries would be able to jointly deploy their autonomous UUVs with little or no advance preparation
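A generic artificial potential field of the kind referred to above can be sketched in a few lines: an attractive potential toward the goal plus a repulsive potential around an obstacle, with the vehicle stepped along the force direction. Gains, radii and the scenario are assumptions, not the thesis implementation.

```python
# Generic artificial potential field (APF) sketch: one attractive goal, one
# repulsive obstacle, simple stepping of a vehicle position along the net force.
# Gains, radii and the scenario are illustrative assumptions.
import numpy as np

goal = np.array([10.0, 10.0])
obstacle = np.array([4.0, 5.5])
k_att, k_rep, rho0 = 1.0, 8.0, 2.5    # attractive gain, repulsive gain, influence radius

def force(pos):
    f = k_att * (goal - pos)                          # attractive: pulls toward the goal
    diff = pos - obstacle
    rho = np.linalg.norm(diff)
    if rho < rho0:                                    # repulsive only inside the influence radius
        f += k_rep * (1.0 / rho - 1.0 / rho0) / rho**2 * (diff / rho)
    return f

pos, step = np.array([0.0, 0.0]), 0.05
path = [pos.copy()]
for _ in range(400):
    pos = pos + step * force(pos)
    path.append(pos.copy())
    if np.linalg.norm(goal - pos) < 0.1:
        break
print("final position", path[-1], "after", len(path) - 1, "steps")
```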

  18. Cell-Detection Technique for Automated Patch Clamping

    Science.gov (United States)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: a standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed. A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image
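The per-cell metrics listed above (position, axis lengths, area, elongation, roundness, orientation) can be computed from a segmented binary mask with off-the-shelf tools; the sketch below uses scikit-image region properties and a made-up acceptance rule, and is not the NASA algorithm or its thresholds.

```python
# Sketch of the kind of per-cell feature metrics described above, computed with
# scikit-image on a binary mask segmented from a microscope frame.  The scoring
# thresholds are made-up placeholders, not the values used by the NASA algorithm.
import numpy as np
from skimage import measure

def cell_metrics(mask):
    """mask: 2-D boolean array, True on cell pixels."""
    rows = []
    # newer scikit-image also exposes these as axis_major_length / axis_minor_length
    for r in measure.regionprops(measure.label(mask)):
        roundness = 4.0 * np.pi * r.area / (r.perimeter ** 2 + 1e-9)  # 1.0 = perfect circle
        rows.append({
            "x": r.centroid[1], "y": r.centroid[0],
            "area": r.area,
            "major_axis": r.major_axis_length,
            "minor_axis": r.minor_axis_length,
            "elongation": r.major_axis_length / (r.minor_axis_length + 1e-9),
            "roundness": roundness,
            "orientation": r.orientation,
        })
    return rows

def is_good_candidate(m):
    # Placeholder acceptance rule: roughly round, mid-sized, not too elongated.
    return 200 < m["area"] < 5000 and m["roundness"] > 0.7 and m["elongation"] < 1.5

# toy mask with one disc-shaped "cell"
yy, xx = np.mgrid[:200, :200]
mask = (yy - 100) ** 2 + (xx - 120) ** 2 < 30 ** 2
for m in cell_metrics(mask):
    print(m, is_good_candidate(m))
```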

  19. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 1980s. However, original automation benefits, including reduced flight crew workload, human errors, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprising high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system
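The flavor of such a Bayesian belief network can be conveyed with a toy fragment evaluated by plain enumeration; the structure and probabilities below are invented for illustration and are not the FLAP model built in Hugin.

```python
# Toy Bayesian-belief-network fragment in the spirit of the description above
# (NOT the FLAP model; structure and probabilities are invented for illustration):
# Automation reliance -> Manual-skill degradation -> Flight anomaly,
# with Mode confusion as a second parent of the anomaly node.
import itertools

p_reliance = 0.6                                   # P(high reliance on automation)
p_confusion = 0.2                                  # P(mode confusion)
p_degrade = {True: 0.5, False: 0.1}                # P(skill degradation | reliance)
p_anomaly = {(True, True): 0.30, (True, False): 0.10,
             (False, True): 0.15, (False, False): 0.02}   # P(anomaly | degrade, confusion)

def joint(reliance, confusion, degrade, anomaly):
    p = (p_reliance if reliance else 1 - p_reliance)
    p *= (p_confusion if confusion else 1 - p_confusion)
    p *= (p_degrade[reliance] if degrade else 1 - p_degrade[reliance])
    pa = p_anomaly[(degrade, confusion)]
    return p * (pa if anomaly else 1 - pa)

# P(high reliance | anomaly observed), by enumeration over the hidden variables
num = den = 0.0
for reliance, confusion, degrade in itertools.product([True, False], repeat=3):
    p = joint(reliance, confusion, degrade, True)
    den += p
    if reliance:
        num += p
print("P(anomaly) =", round(den, 4), " P(high reliance | anomaly) =", round(num / den, 3))
```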

  20. Pilot interaction with automated airborne decision making systems

    Science.gov (United States)

    Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.

    1976-01-01

    An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of allocation of responsibility between human and computer, and discussed were various pilot performance parameters with varying degrees of automation. Optimal allocation of responsibility between human and computer was considered and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.

  1. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  2. Railway Location Automated Optimization CAD System

    Institute of Scientific and Technical Information of China (English)

    1996-01-01

Railway Location Automated Optimization CAD System. Yi Sirong, Department of Railway and Road Engineering, Southwest Jiaotong University, Chengdu...

  3. Coordination of operation optimal control with crisp coalitions game for innovative product supply network%创新型产品供应网络运营最优控制与清晰联盟博弈协调

    Institute of Scientific and Technical Information of China (English)

    何龙飞; 吕海利; 赵道致; 高常水

    2013-01-01

The optimal transportation and inventory control strategy for an innovative product under stochastic market demand, and the network coordination of multi-node inventory and transportation, are studied. Brownian motion is used to describe cumulative market demand. Satisfying a high service level (guaranteeing no stock-outs) while considering both regular and emergency shipment modes and inventory replenishment strategies, the optimal instantaneous inventory control policies for a single node and for multiple nodes are derived and proved, minimizing the long-run average operating cost of the system. Numerical analysis and simulation reveal the operational performance of single-distributor and multi-distributor systems under the optimal control policy and its sensitivity to the parameters; the results show that centralized decision-making among multiple distributors does not necessarily yield lower operating costs than decentralized decision-making. A crisp coalition game is used to characterize and prove that, when replenishment flows and demand flows are balanced, moving from decentralized decisions to centralized joint control among multiple distributors coordinates the supply chain more effectively and reduces operating costs. The existence of the core of the crisp coalition game and the corresponding conditions are proved for both the single-product and multi-product cases. Finally, two population-monotonic allocation schemes, the proportional rule and an extended Shapley value, are given as reasonable predictions of the cost allocation in this crisp coalition game, and the source of the difference between them is explained.
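One of the allocation rules mentioned above, the Shapley value, is easy to sketch for a small cost game by averaging marginal contributions over all player orderings. The three-distributor characteristic function below is invented; the paper's extended Shapley value and population-monotonic analysis are not reproduced.

```python
# Sketch of a cost-allocation rule mentioned above: the Shapley value of a small
# cost coalition game, computed from marginal contributions over all orderings.
# The characteristic (cost) function below is an invented 3-distributor example.
from itertools import permutations

players = ("D1", "D2", "D3")
cost = {frozenset(): 0.0,
        frozenset({"D1"}): 50.0, frozenset({"D2"}): 60.0, frozenset({"D3"}): 70.0,
        frozenset({"D1", "D2"}): 95.0, frozenset({"D1", "D3"}): 105.0,
        frozenset({"D2", "D3"}): 115.0,
        frozenset(players): 150.0}                 # joint control is cheaper than 50+60+70

def shapley(players, cost):
    phi = dict.fromkeys(players, 0.0)
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += cost[with_p] - cost[coalition]
            coalition = with_p
    return {p: v / len(perms) for p, v in phi.items()}

alloc = shapley(players, cost)
print(alloc, " total =", sum(alloc.values()))      # allocations sum to the grand-coalition cost
```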

  4. Energy-Saving Coordinated Optimal Dispatch of Distributed Combined Cool, Heat and Power Supply%分布式冷热电三联供系统节能协调优化调度

    Institute of Scientific and Technical Information of China (English)

    周任军; 冉晓洪; 毛发龙; 付靖茜; 李星朗; 林绿浩

    2012-01-01

Distributed combined cool, heat and power (DCCHP) is becoming an important trend in energy utilization. To reveal the potential characteristics of both the production cost and the environment cost of DCCHP, the equivalent performance coefficient of heat supply is introduced for the equivalent conversion among cool, heat and power, and a new objective function for the production cost and environment cost of DCCHP is proposed. According to the mismatch and randomness of the demand for cool, heat and power loads, and considering generation units operated in different cool, heat and electric power production states, a coordination cost function for cool, heat and power is formulated in combination with the time-of-use (TOU) price. On this basis, a multi-objective energy-saving dispatching model including the production cost, environment cost and coordination cost of cool, heat and electric power is built. Using the principle of maximum membership, the proposed model is turned into a single-objective optimization problem and solved by quadratic programming. Results of case simulation show that the proposed model and method for energy-saving optimal dispatch of DCCHP are effective for highly efficient energy utilization, economic power dispatch and pollution emission reduction.

  5. Coordination and Cooperation

    OpenAIRE

    Janssen, Maarten

    2003-01-01

This comment makes four related points. First, explaining coordination is different from explaining cooperation. Second, solving the coordination problem is more important for the theory of games than solving the cooperation problem. Third, a version of the Principle of Coordination can be rationalized on individualistic grounds. Finally, psychological game theory should consider how players perceive their gaming situation.

  6. Processing Coordination Ambiguity

    Science.gov (United States)

    Engelhardt, Paul E.; Ferreira, Fernanda

    2010-01-01

    We examined temporarily ambiguous coordination structures such as "put the butter in the bowl and the pan on the towel." Minimal Attachment predicts that the ambiguous noun phrase "the pan" will be interpreted as a noun-phrase coordination structure because it is syntactically simpler than clausal coordination. Constraint-based theories assume…

  7. Coordination and Cooperation

    NARCIS (Netherlands)

    M.C.W. Janssen (Maarten)

    2003-01-01

This comment makes four related points. First, explaining coordination is different from explaining cooperation. Second, solving the coordination problem is more important for the theory of games than solving the cooperation problem. Third, a version of the Principle of Coordination can

  8. Automating checks of plan check automation.

    Science.gov (United States)

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  9. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and support

  10. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they have not always been successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  11. More Benefits of Automation.

    Science.gov (United States)

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  12. The problem of organization of a coastal coordinating computer center

    Science.gov (United States)

    Dyubkin, I. A.; Lodkin, I. I.

    1974-01-01

    The fundamental principles of the operation of a coastal coordinating and computing center under conditions of automation are presented. Special attention is devoted to the work of Coastal Computer Center of the Arctic and Antarctic Scientific Research Institute. This center generalizes from data collected in expeditions and also from observations made at polar stations.

  13. Autonomous Vehicle Coordination with Wireless Sensor and Actuator Networks

    NARCIS (Netherlands)

    Marin-Perianu, Mihai; Bosch, Stephan; Marin-Perianu, Raluca; Scholten, Hans; Havinga, Paul

    2010-01-01

    A coordinated team of mobile wireless sensor and actuator nodes can bring numerous benefits for various applications in the field of cooperative surveillance, mapping unknown areas, disaster management, automated highway and space exploration. This article explores the idea of mobile nodes using veh

  14. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes the complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  15. Automation in immunohematology.

    Science.gov (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process.

  16. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  17. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  18. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process.

  19. Design Optimization of Internal Flow Devices

    DEFF Research Database (Denmark)

    Madsen, Jens Ingemann

The power of computational fluid dynamics is boosted through the use of automated design optimization methodologies. The thesis considers both derivative-based search optimization and the use of response surface methodologies.

  20. Optimization Analysis of Lattice Store Coordination based on Revenue Sharing Contract%基于供应链收益共享契约的格子铺经营机制优化分析

    Institute of Scientific and Technical Information of China (English)

    胡本勇; 陈旭

    2012-01-01

, promotional costs and revenue sharing ratio under centralized and decentralized decision-making systems. Furthermore, this paper discusses the impact of the revenue sharing mechanism on the lattice store. Our analysis results also show that a simple revenue sharing mechanism can neither realize the lattice store's optimal performance nor improve the cooperation in LSBM. This paper further discusses the reason why coordination cannot be achieved and puts forward a revised policy, in which the shop's rent should correspond to the owner's cost of brand building and promotion. Further research shows that coordination can be achieved when the rent is adjusted to a certain proportion of the advertising cost. Although some conclusions are obtained under certain assumptions, the modeling approach and quantitative analysis used in the LSBM study, and the conclusions achieved, will help managers of lattice stores make better decisions in practice.

  1. 多微网配电系统协调优化调度和经济运行研究%Research on coordinated optimal scheduling and economic operation of multi-microgrid distribution system

    Institute of Scientific and Technical Information of China (English)

    沈鑫; 曹敏; 周年荣; 张林山

    2016-01-01

A multi-microgrid system consists of several microgrid systems, which requires a more complex control strategy than a single microgrid system. The paper presents a coordination control strategy for multi-microgrid systems. For the situation in which the multi-microgrid connects to the distribution network, a joint dispatching model of multi-microgrid and distribution network is introduced. Based on the dynamic output range of the microgrids, the distribution network loss and the load fluctuation are considered in the upper-level objective function. Particle swarm optimization based on stochastic simulation is applied to solve the upper-level optimization problem. To give full play to microgrid energy management of micro power sources, the microgrid reserve constraint is treated as a chance constraint that considers the effect of uncertain random variables. Monte Carlo simulation particle swarm optimization is applied to solve the lower-level problem. The results demonstrate that the joint dispatching strategy for multi-microgrid and distribution network can reduce the network loss and smooth the load curve by using the dispatching flexibility of the microgrid "source-load" characteristics. At the

  2. THE UNIVERSITIES, EMPLOYERS AND STUDENTS INTERACTION AUTOMATED SYSTEM

    Directory of Open Access Journals (Sweden)

    Dmitry P. Danilaev

    2014-01-01

Full Text Available The issues of interaction among the subjects of the higher education system (universities, employers and students) for the purpose of effectively staffing industries are considered in this paper. A model and structure of an automated information interaction subsystem are proposed that coordinate the subjects' interests and requirements with the goal of training highly qualified technical specialists.

  3. Modeling the Coordinated Operation between Bus Rapid Transit and Bus

    Directory of Open Access Journals (Sweden)

    Jiaqing Wu

    2015-01-01

Full Text Available The coordination between bus rapid transit (BRT) and feeder bus service is helpful in improving the operational efficiency and service level of an urban public transport system. Therefore, a coordinated operation model of BRT and bus is developed in this paper. The total costs are formulated and optimized by a genetic algorithm. Moreover, skip-stop BRT operation is considered when building the coordinated operation model. A case of the existing bus network in Beijing is studied, the proposed coordinated operation model of BRT and bus is applied, and the optimized headways and costs are obtained. The results show that the coordinated operation model can effectively decrease the total costs of the transit system and the transfer time of passengers. The results also suggest that the coordination between the skip-stop BRT and bus during peak hours is more effective than non-coordinated operation.
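A toy version of the cost trade-off behind such a model is sketched below: operator cost falls and passenger waiting plus transfer cost rises with longer headways, and the best BRT/feeder headway pair is found by a brute-force sweep. All demand figures and unit costs are assumptions, and the paper's genetic algorithm and skip-stop modelling are not reproduced.

```python
# Toy headway-optimization sketch in the spirit of the model above: total cost =
# operator cost + passenger waiting cost + transfer waiting cost, swept over
# candidate BRT/feeder-bus headways by brute force.  All numbers are assumptions.
import itertools

brt_demand, bus_demand, transfer_demand = 3000, 1200, 500   # passengers per hour
value_of_time = 0.5                                         # $ per passenger-minute
veh_cost_brt, veh_cost_bus = 90.0, 60.0                     # $ per vehicle dispatch

def total_cost(h_brt, h_bus):
    operator = 60.0 / h_brt * veh_cost_brt + 60.0 / h_bus * veh_cost_bus
    waiting = value_of_time * (brt_demand * h_brt / 2 + bus_demand * h_bus / 2)
    # crude coordinated-transfer term: average wait for the next BRT departure
    transfer = value_of_time * transfer_demand * h_brt / 2
    return operator + waiting + transfer

best = min(itertools.product(range(2, 16), range(2, 21)),
           key=lambda h: total_cost(*h))
print("best BRT/bus headways (min):", best, " cost:", round(total_cost(*best), 1))
```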

  4. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.

  5. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  6. Torque coordinating optimal control for dry dual clutch transmission in shifting process%干式双离合器变速器换挡过程的转矩协调最优控制

    Institute of Scientific and Technical Information of China (English)

    赵治国; 王琪; 陈海军; 刁威振

    2013-01-01

A torque coordinating optimal control strategy was developed for the six-speed dry dual-clutch transmission (DDCT) of a self-developed car, applying different measures for different driving intentions in the different phases of the shifting process: in the torque phase, quadratic optimal control is used to determine the rate of change of the clutch torque; in the inertia phase, engine ignition parameters and fuel supply are adjusted to shorten the inertia phase; and in the micro-slipping phase and the demanded-torque switching phase, a control-factor mapping reflecting the driving intention is established. A simulation model of the DDCT shifting process was built on the Matlab/Simulink software platform and used to simulate the control strategy. The results show that shift jerk remains within -2 m/s3 and the total frictional energy loss within 2 kJ during the shifting process; the control strategy therefore reflects the driver's shifting intention and meets shift quality requirements.

  7. Enhanced time overcurrent coordination

    Energy Technology Data Exchange (ETDEWEB)

    Enriquez, Arturo Conde; Martinez, Ernesto Vazquez [Universidad Autonoma de Nuevo Leon, Facultad de Ingenieria Mecanica y Electrica, Apdo. Postal 114-F, Ciudad Universitaria, CP 66450 San Nicolas de los Garza, Nuevo Leon (Mexico)

    2006-04-15

    In this paper, we recommend a new coordination system for time overcurrent relays. The purpose of the coordination process is to find a time element function that allows it to operate using a constant back-up time delay, for any fault current. In this article, we describe the implementation and coordination results of time overcurrent relays, fuses and reclosers. Experiments were carried out in a laboratory test situation using signals of a power electrical system physics simulator. (author)
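For background, conventional time-overcurrent coordination (the scheme the paper seeks to improve) can be sketched with the IEC standard-inverse characteristic t = TMS * 0.14 / ((I/Is)^0.02 - 1) and a fixed coordination time interval between primary and backup relays; the relay settings below are illustrative only, not taken from the paper.

```python
# Sketch of conventional time-overcurrent coordination using the IEC
# standard-inverse curve: check that the backup relay always operates at least a
# coordination time interval (CTI) after the primary relay.  Illustrative settings.
def op_time(i_fault, pickup, tms):
    return tms * 0.14 / ((i_fault / pickup) ** 0.02 - 1.0)

primary = {"pickup": 400.0, "tms": 0.10}     # downstream relay settings
backup = {"pickup": 600.0, "tms": 0.25}      # upstream relay settings
CTI = 0.3                                    # required coordination margin (s)

for i_fault in (1200.0, 2500.0, 5000.0, 8000.0):
    tp = op_time(i_fault, **primary)
    tb = op_time(i_fault, **backup)
    ok = tb - tp >= CTI
    print(f"I = {i_fault:6.0f} A   primary {tp:5.2f} s   backup {tb:5.2f} s   "
          f"margin {tb - tp:4.2f} s   {'OK' if ok else 'MISCOORDINATED'}")
```

Note how the margin between backup and primary operating times shrinks as the fault current grows, which is exactly the variable back-up delay that the constant-delay scheme described above is meant to avoid.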

  8. A Functional Architecture For Automated Highway Traffic Planning

    OpenAIRE

    Tsao, H. S. Jacob

    1994-01-01

    This report defines an architecture for Automated Highway System (AHS) capacity-optimizing traffic planning functions. It identifies major traffic planning functions useful for optimizing the capacity of one or more major AHS operating scenarios and organizes them in a robust architecture that is modular, hierarchical, complete, expandable and integratable.

  9. Towards reduction of Paradigm coordination models

    CERN Document Server

    Andova, Suzana; de Vink, Erik; 10.4204/EPTCS.60.1

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at a detailed and a global level of abstraction. To enable automated verification of Paradigm models, a translation of Paradigm into process algebra has been defined in previous work. In this paper we investigate, guided by a client-server example, reduction of Paradigm models based on a notion of global inertness. Representation of Paradigm models as process algebraic specifications helps to establish a property-preserving equivalence relation between the original and the reduced Paradigm model. Experiments indicate that in this way larger Paradigm models can be analyzed.

  10. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  11. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  12. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  13. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  14. Automating the Media Center.

    Science.gov (United States)

    Holloway, Mary A.

    1988-01-01

    Discusses the need to develop more efficient information retrieval skills by the use of new technology. Lists four stages used in automating the media center. Describes North Carolina's pilot programs. Proposes benefits and looks at the media center's future. (MVL)

  15. New hardware and workflows for semi-automated correlative cryo-fluorescence and cryo-electron microscopy/tomography.

    Science.gov (United States)

    Schorb, Martin; Gaechter, Leander; Avinoam, Ori; Sieckmann, Frank; Clarke, Mairi; Bebeacua, Cecilia; Bykov, Yury S; Sonnen, Andreas F-P; Lihl, Reinhard; Briggs, John A G

    2017-02-01

    Correlative light and electron microscopy allows features of interest defined by fluorescence signals to be located in an electron micrograph of the same sample. Rare dynamic events or specific objects can be identified, targeted and imaged by electron microscopy or tomography. To combine it with structural studies using cryo-electron microscopy or tomography, fluorescence microscopy must be performed while maintaining the specimen vitrified at liquid-nitrogen temperatures and in a dry environment during imaging and transfer. Here we present instrumentation, software and an experimental workflow that improves the ease of use, throughput and performance of correlated cryo-fluorescence and cryo-electron microscopy. The new cryo-stage incorporates a specially modified high-numerical aperture objective lens and provides a stable and clean imaging environment. It is combined with a transfer shuttle for contamination-free loading of the specimen. Optimized microscope control software allows automated acquisition of the entire specimen area by cryo-fluorescence microscopy. The software also facilitates direct transfer of the fluorescence image and associated coordinates to the cryo-electron microscope for subsequent fluorescence-guided automated imaging. Here we describe these technological developments and present a detailed workflow, which we applied for automated cryo-electron microscopy and tomography of various specimens.

  16. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  17. Storage Allocation in Automated Container Terminals: the Upper Level

    Directory of Open Access Journals (Sweden)

    Mengjue Xia

    2016-10-01

    Nowadays automation is a trend at container terminals all over the world. Although not applied in current automated container terminals, storage allocation is indispensable in conventional container terminals and promising for automated container terminals in the future. This paper investigates the storage allocation problem in automated container terminals and proposes a two-level structure for the problem. A mixed integer programming model is built for the upper level, and a modified Particle Swarm Optimization (PSO) algorithm is applied to solve the model. The applicable conditions of the model are investigated by numerical experiments, as is the performance of the algorithm at different problem scales. The lower level of the problem and the potential benefit of storage allocation to automated container terminals are left to future research.
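
    As a rough illustration of the solution technique named in the abstract, the following is a minimal, generic particle swarm optimization loop applied to a toy allocation cost; it is not the paper's mixed integer programming model or its modified PSO, and all parameters are illustrative.

```python
# Hedged sketch of a plain particle swarm optimization loop, minimizing a toy
# workload-imbalance cost over container-block allocations. The cost function,
# bounds and parameters are illustrative, not the paper's MIP model.
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Toy objective: penalize uneven allocation of containers across blocks.
    return np.var(x) + 0.01 * np.sum(x)

def pso(dim=6, n_particles=20, iters=200, lo=0.0, hi=100.0):
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

print(pso())
```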

  18. ACCOUNTING AUTOMATIONS RISKS

    OpenAIRE

    Муравський, В. В.; Хома, Н. Г.

    2015-01-01

    The accountant plays an active role in organizing automated accounting when information systems are introduced into enterprise activity. Effective accounting automation requires the identification and prevention of organizational risks. The authors researched, classified and generalized the risks of introducing information accounting systems. Ways of eliminating the sources of organizational risks and minimizing their consequences are given. The method of the effective con...

  19. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUI testing. Some Python programming experience is assumed.

  20. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  1. Automation of Diagrammatic Reasoning

    OpenAIRE

    Jamnik, Mateja; Bundy, Alan; Green, Ian

    1997-01-01

    Theorems in automated theorem proving are usually proved by logical formal proofs. However, there is a subset of problems which humans can prove in a different way by the use of geometric operations on diagrams, so called diagrammatic proofs. Insight is more clearly perceived in these than in the corresponding algebraic proofs: they capture an intuitive notion of truthfulness that humans find easy to see and understand. We are identifying and automating this diagrammatic reasoning on mathemat...

  2. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  3. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades has been a time of major changes in marketing. Digitalization has become a permanent part of marketing and at the same time enabled efficient collection of data. Personalization and customization of content are playing a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation more information of the customers is gathered ...

  4. Elements of EAF automation processes

    Science.gov (United States)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system and the scheme of the thermal EAF automation system. The results of applying these automation schemes consist in: a significant reduction of the specific consumption of electrical energy of the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the produced steel, and increased durability of the structural elements of the Electric Arc Furnace.

  5. [Backup territorial coordination, nursing roles and skills].

    Science.gov (United States)

    Benyahia, Amina; Abraham, Éliane

    2016-06-01

    Backup territorial coordination provides accompaniment and support for professionals who work with frail elderly people in an area. It aligns the health, medico-social and social approaches, and mobilizes useful resources to optimize the treatment pathway. It has been implemented in the Nancy urban area by an operational team including nurses and a geriatric physician.

  6. Cassini Tour Atlas Automated Generation

    Science.gov (United States)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly and select the optimal series of orbits for science return. The task of manually generating the number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  7. Coordination failure caused by sunspots

    DEFF Research Database (Denmark)

    Beugnot, Julie; Gürgüç, Zeynep; Øvlisen, Frederik Roose;

    2012-01-01

    In a coordination game with Pareto-ranked equilibria, we study whether a sunspot can lead to either coordination on an inferior equilibrium (mis-coordination) or to out-of-equilibrium behavior (dis-coordination). While much of the literature searches for mechanisms to attain coordination on the e...

  8. Social Postural Coordination

    Science.gov (United States)

    Varlet, Manuel; Marin, Ludovic; Lagarde, Julien; Bardy, Benoit G.

    2011-01-01

    The goal of the current study was to investigate whether a visual coupling between two people can produce spontaneous interpersonal postural coordination and change their intrapersonal postural coordination involved in the control of stance. We examined the front-to-back head displacements of participants and the angular motion of their hip and…

  9. Coordinate measuring machines

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with three exercises of 2 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measuring: 1) Measuring and verification of tolerances on coordinate measuring machines, 2) Traceabilit...

  10. Team coordination dynamics.

    Science.gov (United States)

    Gorman, Jamie C; Amazeen, Polemnia G; Cooke, Nancy J

    2010-07-01

    Team coordination consists of both the dynamics of team member interaction and the environmental dynamics to which a team is subjected. Focusing on dynamics, an approach is developed that contrasts with traditional aggregate-static concepts of team coordination as characterized by the shared mental model approach. A team coordination order parameter was developed to capture momentary fluctuations in coordination. Team coordination was observed in three-person uninhabited air vehicle teams across two experimental sessions. The dynamics of the order parameter were observed under changes of a team familiarity control parameter. Team members returned for the second session to either the same (Intact) or different (Mixed) team. 'Roadblock' perturbations, or novel changes in the task environment, were introduced in order to probe the stability of team coordination. Nonlinear dynamic methods revealed differences that a traditional approach did not: Intact and Mixed team coordination dynamics looked very different; Mixed teams were more stable than Intact teams and explored the space of solutions without the need for correction. Stability was positively correlated with the number of roadblock perturbations that were overcome successfully. The novel and non-intuitive contribution of a dynamical analysis was that Mixed teams, who did not have a long history working together, were more adaptive. Team coordination dynamics carries new implications for traditional problems such as training adaptive teams.

  11. Coordination models and languages

    NARCIS (Netherlands)

    Papadopoulos, G.A.; Arbab, F.

    1998-01-01

    A new class of models, formalisms and mechanisms has recently evolved for describing concurrent and distributed computations based on the concept of ``coordination''. The purpose of a coordination model and associated language is to provide a means of integrating a number of possibly heterogeneous c

  12. Automated Verification of IGRT-based Patient Positioning

    Science.gov (United States)

    Jiang, Xiaojun; Fox, Tim; Cordova, Scott S; Schreibmann, Eduard

    2016-01-01

    A system for automated quality assurance of a therapist's registration in radiotherapy was designed and tested in clinical practice. The approach complements the clinical software's automated registration in terms of algorithm configuration and performance, and constitutes a practical approach for ensuring safe patient setups. Per our convergence analysis, evolutionary algorithms perform better in finding the global optima of the cost function, with discrepancies from a deterministic optimizer seen sporadically. PMID:26699548

  13. Process development for automated solar cell and module production. Task 4: Automated array assembly

    Science.gov (United States)

    Hagerty, J. J.

    1981-01-01

    Progress in the development of automated solar cell and module production is reported. The Unimate robot is programmed for the final 35-cell pattern to be used in the fabrication of the deliverable modules. The mechanical construction of the automated lamination station and final assembly station is completed and the first operational testing is underway. The final controlling program is written and optimized. The glass reinforced concrete (GRC) panels to be used for testing and deliverables are in production. Test routines are grouped together and defined to produce the final control program.

  14. Fast Automated Decoupling at RHIC

    CERN Document Server

    Beebe-Wang, Joanne

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated decoupling application has been developed at RHIC for coupling correction during routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program provides options for automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (Phase Lock Loop), the high frequency Schottky system, and the tune meter. It also supplies tune and skew quadrupole scans, finds the minimum tune separation, displays the real-time results and interfaces with the RHIC control system. We summarize the capabilities of the decoupling application...
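
    The core numerical task described here, finding skew-quadrupole settings that minimize the measured tune separation, can be sketched as a small two-variable minimization. The tune model below is a stand-in assumption, not RHIC's actual measurement systems.

```python
# Hedged sketch: finding skew-quadrupole settings that minimize the measured
# tune separation. The tune model below is a toy stand-in for real tune
# readings from the PLL/Schottky/tune-meter systems mentioned in the abstract.
import numpy as np
from scipy.optimize import minimize

def tune_separation(k, coupling=(0.004, -0.003), base_split=0.001):
    """Toy model: residual coupling shrinks as the two skew families cancel it."""
    residual = np.hypot(k[0] - coupling[0], k[1] - coupling[1])
    return np.hypot(base_split, residual)

result = minimize(tune_separation, x0=[0.0, 0.0], method="Nelder-Mead")
print("optimal skew-quad strengths:", result.x)
print("minimum tune separation:", result.fun)
```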

  15. Modeling the Coordinated Operation between Bus Rapid Transit and Bus

    OpenAIRE

    Jiaqing Wu; Rui Song; Youan Wang; Feng Chen; Shubin Li

    2015-01-01

    The coordination between bus rapid transit (BRT) and feeder bus service is helpful in improving the operational efficiency and service level of urban public transport systems. Therefore, a coordinated operation model of BRT and feeder bus is developed in this paper. The total costs are formulated and optimized by a genetic algorithm. Moreover, skip-stop BRT operation is considered when building the coordinated operation model. A case of the existing bus network in Beijing is studied, the ...
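
    As a hedged illustration of the optimization step mentioned in the abstract, a toy genetic algorithm over BRT and feeder headways is sketched below; the cost terms are invented placeholders rather than the paper's total-cost formulation.

```python
# Hedged sketch of a genetic algorithm minimizing a toy total-cost function of
# BRT and feeder-bus headways; the cost terms are illustrative placeholders,
# not the coordinated operation model from the paper.
import random

def total_cost(h_brt, h_feeder):
    wait = 0.5 * h_brt + 0.5 * h_feeder            # passenger waiting cost
    operate = 600.0 / h_brt + 300.0 / h_feeder     # operator cost per hour
    transfer = abs(h_feeder - 2 * h_brt)           # penalty for poor timing match
    return wait + operate + 2.0 * transfer

def ga(pop_size=40, generations=200, bounds=(2.0, 30.0)):
    pop = [[random.uniform(*bounds), random.uniform(*bounds)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: total_cost(*ind))
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2]        # crossover
            if random.random() < 0.3:                             # mutation
                i = random.randrange(2)
                child[i] = min(max(child[i] + random.gauss(0, 1.0), bounds[0]), bounds[1])
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda ind: total_cost(*ind))
    return best, total_cost(*best)

print(ga())
```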

  16. Optimization of Tie-line Hierarchical Schedule Based on Network-wide Coordination

    Institute of Scientific and Technical Information of China (English)

    许丹; 李晓磊; 丁强; 崔晖; 韩彬

    2014-01-01

    The traditional tie-line schedule is based entirely on electricity transactions; hence it has long been loosely coupled with power grid operation and relatively independent of generation scheduling. This makes tie-line schedules difficult to adjust and limits the ability to allocate resources globally. To address these problems, a hierarchical tie-line schedule model based on network-wide coordination is proposed, building on the existing tie-line planning process. The upper-level model obtains the ideal tie-line schedule through economic dispatch constrained by whole-network security. The lower-level model uses the ideal tie-line schedule as the optimization objective and takes electricity transactions as constraints to obtain the final tie-line schedule. This model achieves automatic preparation and flexible adjustment of the tie-line schedule, while providing a practical way to combine the schedule with the grid operating state. The proposed method is applied to the tie-line schedule of the Central China power grid, and a comparison of the optimized results with the traditional schedule shows the correctness and effectiveness of the proposed approach.
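
    A highly simplified sketch of the two-level idea follows: an upper-level economic dispatch produces an ideal tie-line flow, and a lower level keeps the schedule as close to it as the contracted limits allow. The two-area numbers are illustrative, and the real model involves whole-network security constraints.

```python
# Hedged sketch of the two-level idea: an upper-level economic dispatch yields
# an "ideal" tie-line flow, and a lower level picks the schedule closest to it
# subject to contracted limits. Numbers and constraints are illustrative only.
from scipy.optimize import linprog

# Upper level: dispatch two areas (gen_A, gen_B) at least cost, meeting total load.
cost = [20.0, 35.0]                         # $/MWh for area A and area B generation
load_A, load_B = 400.0, 300.0
res = linprog(cost,
              A_eq=[[1.0, 1.0]], b_eq=[load_A + load_B],
              bounds=[(0.0, 600.0), (0.0, 500.0)], method="highs")
gen_A, gen_B = res.x
ideal_tie = gen_A - load_A                  # ideal export from area A to area B

# Lower level: stay as close as possible to the ideal flow within contract limits.
contract_min, contract_max = 50.0, 250.0
tie_schedule = min(max(ideal_tie, contract_min), contract_max)
print(f"ideal tie-line flow {ideal_tie:.1f} MW, scheduled {tie_schedule:.1f} MW")
```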

  17. Uranyl ion coordination

    Science.gov (United States)

    Evans, H.T.

    1963-01-01

    A review of the known crystal structures containing the uranyl ion shows that plane-pentagon coordination is equally as prevalent as plane-square or plane-hexagon. It is suggested that puckered-hexagon configurations of OH - or H2O about the uranyl group will tend to revert to plane-pentagon coordination. The concept of pentagonal coordination is invoked for possible explanations of the complex crystallography of the natural uranyl hydroxides and the unusual behavior of polynuclear ions in hydrolyzed uranyl solutions.

  18. The Robo-AO automated intelligent queue system

    Science.gov (United States)

    Riddle, Reed L.; Hogstrom, Kristina; Papadopoulos, Athanasios; Baranec, Christoph; Law, Nicholas M.

    2014-07-01

    Robo-AO is the first automated laser adaptive optics instrument. In just its second year of scientific operations, it has completed the largest adaptive optics surveys to date, each comprising thousands of targets. Robo-AO uses a fully automated queue scheduling system that selects targets based on criteria entered on a per observing program or per target basis, and includes the ability to coordinate with US Strategic Command automatically to avoid lasing space assets. This enables Robo-AO to select among thousands of targets at a time, and achieve an average observation rate of approximately 20 targets per hour.
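
    A minimal sketch of criteria-based queue selection in the spirit described above; the Target fields, the elevation lookup and the thresholds are invented for illustration and do not reflect Robo-AO's actual scheduling criteria.

```python
# Hedged sketch of a criteria-based queue selector: each target carries a
# priority, an elevation window and a completion flag, and the next
# observation is the highest-priority target currently observable.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    priority: int          # higher is more important
    min_elev: float        # observable only above this elevation (degrees)
    done: bool = False

def current_elevation(target_name, time_now):
    # Placeholder for an ephemeris call (e.g. via astropy); returns degrees.
    return {"field_a": 62.0, "field_b": 25.0, "field_c": 71.0}.get(target_name, 0.0)

def next_target(queue, time_now):
    visible = [t for t in queue
               if not t.done and current_elevation(t.name, time_now) >= t.min_elev]
    return max(visible, key=lambda t: t.priority, default=None)

queue = [Target("field_a", 3, 30.0), Target("field_b", 5, 30.0), Target("field_c", 4, 40.0)]
print(next_target(queue, time_now=None))   # field_c: field_b is below its elevation window
```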

  19. The Robo-AO automated intelligent queue system

    CERN Document Server

    Riddle, Reed L; Papadopoulos, Athanasios; Baranec, Christoph; Law, Nicholas M

    2014-01-01

    Robo-AO is the first automated laser adaptive optics instrument. In just its second year of scientific operations, it has completed the largest adaptive optics surveys to date, each comprising thousands of targets. Robo-AO uses a fully automated queue scheduling system that selects targets based on criteria entered on a per observing program or per target basis, and includes the ability to coordinate with US Strategic Command automatically to avoid lasing space assets. This enables Robo-AO to select among thousands of targets at a time, and achieve an average observation rate of approximately 20 targets per hour.

  20. Materials Testing and Automation

    Science.gov (United States)

    Cooper, Wayne D.; Zweigoron, Ronald B.

    1980-07-01

    The advent of automation in materials testing has been in large part responsible for recent radical changes in the materials testing field: Tests virtually impossible to perform without a computer have become more straightforward to conduct. In addition, standardized tests may be performed with enhanced efficiency and repeatability. A typical automated system is described in terms of its primary subsystems — an analog station, a digital computer, and a processor interface. The processor interface links the analog functions with the digital computer; it includes data acquisition, command function generation, and test control functions. Features of automated testing are described with emphasis on calculated variable control, control of a variable that is computed by the processor and cannot be read directly from a transducer. Three calculated variable tests are described: a yield surface probe test, a thermomechanical fatigue test, and a constant-stress-intensity range crack-growth test. Future developments are discussed.

  1. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    The article focuses on the possibility of automating taxiing, the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and which is the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies such as computer vision, Light Detection and Ranging and Global Navigation Satellite System, which are useful for navigation, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems along with their installation into the airplane's systems so that automated taxiing becomes possible.

  2. Supercritical Airfoil Coordinates

    Data.gov (United States)

    National Aeronautics and Space Administration — Rectangular Supercritical Wing (Ricketts) - design and measured locations are provided in an Excel file RSW_airfoil_coordinates_ricketts.xls . One sheet is with Non...

  3. Magnetic Coordinate Systems

    CERN Document Server

    Laundal, K M

    2016-01-01

    Geospace phenomena such as the aurora, plasma motion, ionospheric currents and associated magnetic field disturbances are highly organized by Earth's main magnetic field. This is due to the fact that the charged particles that comprise space plasma can move almost freely along magnetic field lines, but not across them. For this reason it is sensible to present such phenomena relative to Earth's magnetic field. A large variety of magnetic coordinate systems exist, designed for different purposes and regions, ranging from the magnetopause to the ionosphere. In this paper we review the most common magnetic coordinate systems and describe how they are defined, where they are used, and how to convert between them. The definitions are presented based on the spherical harmonic expansion coefficients of the International Geomagnetic Reference Field (IGRF) and, in some of the coordinate systems, the position of the Sun which we show how to calculate from the time and date. The most detailed coordinate systems take the...

  4. Dimensions of Organizational Coordination

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Aldewereld, Huib; Dignum, Virginia

    2013-01-01

    be supported to include organizational objectives and constraints into their reasoning processes by considering two alternatives: agent reasoning and middleware regulation. We show how agents can use an organizational specification to achieve organizational objectives by delegating and coordinating...

  5. Understanding social motor coordination.

    Science.gov (United States)

    Schmidt, R C; Fitzpatrick, Paula; Caron, Robert; Mergeche, Joanna

    2011-10-01

    Recently there has been much interest in social coordination of motor movements, or as it is referred to by some researchers, joint action. This paper reviews the cognitive perspective's common coding/mirror neuron theory of joint action, describes some of its limitations and then presents the behavioral dynamics perspective as an alternative way of understanding social motor coordination. In particular, behavioral dynamics' ability to explain the temporal coordination of interacting individuals is detailed. Two experiments are then described that demonstrate how dynamical processes of synchronization are apparent in the coordination underlying everyday joint actions such as martial art exercises, hand-clapping games, and conversations. The import of this evidence is that emergent dynamic patterns such as synchronization are the behavioral order that any neural substrate supporting joint action (e.g., mirror systems) would have to sustain.

  6. Automating the CMS DAQ

    CERN Document Server

    Bauer, Gerry; Behrens, Ulf; Branson, James; Chaze, Olivier; Cittolin, Sergio; Coarasa Perez, Jose Antonio; Darlea, Georgiana Lavinia; Deldicque, Christian; Dobson, Marc; Dupont, Aymeric; Erhan, Samim; Gigi, Dominique; Glege, Frank; Gomez Ceballos, Guillelmo; Gomez-Reino Garrido, Robert; Hartl, Christian; Hegeman, Jeroen Guido; Holzner, Andre Georg; Masetti, Lorenzo; Meijers, Franciscus; Meschi, Emilio; Mommsen, Remigius; Morovic, Srecko; Nunez Barranco Fernandez, Carlos; O'Dell, Vivian; Orsini, Luciano; Ozga, Wojciech Andrzej; Paus, Christoph Maria Ernst; Petrucci, Andrea; Pieri, Marco; Racz, Attila; Raginel, Olivier; Sakulin, Hannes; Sani, Matteo; Schwick, Christoph; Spataru, Andrei Cristian; Stieger, Benjamin Bastian; Sumorok, Konstanty; Veverka, Jan; Wakefield, Christopher Colin; Zejdl, Petr

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  7. Automating the CMS DAQ

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  8. Incremental learning for automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O.; Basilico, Justin Derrick; Davis, Warren Leon,; Dixon, Kevin R.; Jones, Brian S.; Martin, Nathaniel; Wendt, Jeremy Daniel

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, requires models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.
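
    To make the batch-versus-incremental contrast concrete, the following sketch shows an incremental statistic (Welford's running mean and variance) that absorbs each new observation in constant time instead of recomputing from scratch; it illustrates incremental updating in general, not the knowledge-capture models discussed in the report.

```python
# Hedged sketch contrasting batch retraining with an incremental update: an
# online running mean/variance (Welford's algorithm) absorbs each new
# observation in O(1) instead of recomputing from all data. Purely illustrative.
class RunningStats:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        # Incremental (Welford) update: no need to revisit earlier data.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for x in [2.0, 4.0, 4.0, 5.0, 7.0]:
    stats.update(x)
print(stats.mean, stats.variance)
```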

  9. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2016-08-22

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  10. Continuous parallel coordinates.

    Science.gov (United States)

    Heinrich, Julian; Weiskopf, Daniel

    2009-01-01

    Typical scientific data is represented on a grid with appropriate interpolation or approximation schemes, defined on a continuous domain. The visualization of such data in parallel coordinates may reveal patterns latently contained in the data and thus can improve the understanding of multidimensional relations. In this paper, we adopt the concept of continuous scatterplots for the visualization of spatially continuous input data to derive a density model for parallel coordinates. Based on the point-line duality between scatterplots and parallel coordinates, we propose a mathematical model that maps density from a continuous scatterplot to parallel coordinates and present different algorithms for both numerical and analytical computation of the resulting density field. In addition, we show how the 2-D model can be used to successively construct continuous parallel coordinates with an arbitrary number of dimensions. Since continuous parallel coordinates interpolate data values within grid cells, a scalable and dense visualization is achieved, which will be demonstrated for typical multi-variate scientific data.

  11. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-01

    Retrieval-based face annotation is a promising paradigm of mining massive web facial images for automated face annotation. This paper addresses a critical problem of such paradigm, i.e., how to effectively perform annotation by exploiting the similar facial images and their weak labels which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintains comparable results but significantly saves much time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.
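
    The local coordinate coding principle mentioned in the abstract can be illustrated with a generic sketch that encodes a sample as an affine combination of its nearest codebook anchors; this is not the WLRLCC algorithm itself (the weak-label regularization is omitted) and the data are invented.

```python
# Hedged sketch of the local coordinate coding idea: a sample is approximated
# by an affine combination of its k nearest anchor (codebook) points, so the
# code is sparse and local. Generic illustration only, not WLRLCC.
import numpy as np

def local_coordinate_code(x, anchors, k=3, eps=1e-6):
    x = np.asarray(x, float)
    anchors = np.asarray(anchors, float)
    idx = np.argsort(np.linalg.norm(anchors - x, axis=1))[:k]   # k nearest anchors
    Z = anchors[idx] - x                                        # shifted neighbours
    G = Z @ Z.T + eps * np.eye(k)                               # regularized local Gram matrix
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()                                                # affine constraint: weights sum to 1
    code = np.zeros(len(anchors))
    code[idx] = w
    return code

anchors = np.array([[0, 0], [1, 0], [0, 1], [2, 2], [3, 1]], float)
print(local_coordinate_code([0.4, 0.3], anchors))
```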

  12. Case Studies on an Approach to Multiple Autonomous Vehicle Motion Coordination

    Institute of Scientific and Technical Information of China (English)

    D.K. Liu; X. Wu; G. Paul; G. Dissanayake

    2006-01-01

    This paper conducts a series of case studies on a novel Simultaneous Path and Motion Planning (SiPaMoP) approach[1] to multiple autonomous or Automated Guided Vehicle (AGV) motion coordination in bidirectional networks. The SiPaMoP approach plans collision-free paths for vehicles based on the principle of shortest path by dynamically changing the vehicles' paths, traveling speeds or waiting times, whichever gives the shortest traveling time. It integrates path planning, collision avoidance and motion planning into a comprehensive model and optimizes the vehicles' path and motion to minimize the completion time of a set of tasks. Five case studies, i.e., head-on collision avoidance, catching-up collision avoidance, buffer node generation and collision avoidance, priority-based motion coordination, and safety distance based planning, are presented. The results demonstrated that the method can effectively plan the path and motion for a team of autonomous vehicles or AGVs, and solve the problems of traffic congestion and collision under various conditions.
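
    A minimal sketch of the underlying idea, shortest-time routing in which a vehicle may wait for a reserved node to become free, is given below; the graph, the reservation interface and the numbers are illustrative, and the full SiPaMoP model integrates far more (speeds, priorities, buffer nodes).

```python
# Hedged sketch: plan a shortest-time path on a bidirectional network while
# letting a vehicle wait at a node whenever the next node is still reserved by
# another vehicle. Simple Dijkstra-with-waiting, not the full SiPaMoP model.
import heapq

def plan(graph, start, goal, reserved, speed=1.0):
    """graph: {node: {neighbour: distance}}; reserved: {node: time until free}."""
    pq = [(0.0, start, [start])]
    best = {}
    while pq:
        t, node, path = heapq.heappop(pq)
        if node == goal:
            return t, path
        if best.get(node, float("inf")) <= t:
            continue
        best[node] = t
        for nxt, dist in graph[node].items():
            arrive = t + dist / speed
            arrive = max(arrive, reserved.get(nxt, 0.0))   # wait until the node is free
            heapq.heappush(pq, (arrive, nxt, path + [nxt]))
    return float("inf"), []

graph = {"A": {"B": 4, "C": 2}, "B": {"D": 5}, "C": {"B": 1, "D": 8}, "D": {}}
reserved = {"B": 6.0}      # another AGV occupies node B until t = 6
print(plan(graph, "A", "D", reserved))
```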

  13. Coordinated garbage collection for raid array of solid state disks

    Science.gov (United States)

    Dillow, David A; Ki, Youngjae; Oral, Hakki S; Shipman, Galen M; Wang, Feiyi

    2014-04-29

    An optimized redundant array of solid state devices may include an array of one or more optimized solid-state devices and a controller coupled to the solid-state devices for managing the solid-state devices. The controller may be configured to globally coordinate the garbage collection activities of each of said optimized solid-state devices, for instance, to minimize the degraded performance time and increase the optimal performance time of the entire array of devices.
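
    A hedged sketch of global garbage-collection coordination follows: a controller grants GC windows so that only a limited number of devices collect at once, keeping the rest at full performance. The scheduling interface is invented for illustration and is not the patented controller design.

```python
# Hedged sketch of globally coordinating garbage collection across an SSD
# array: grant GC windows so that at most `max_concurrent` devices are
# collecting at any time. Interface and numbers are illustrative only.
from collections import deque

def schedule_gc(requests, max_concurrent=1, window=10):
    """requests: iterable of device ids needing GC; returns (device, start, end) tuples."""
    pending = deque(requests)
    schedule, active = [], []          # active holds (end_time, device)
    t = 0
    while pending:
        active = [(end, dev) for end, dev in active if end > t]
        while pending and len(active) < max_concurrent:
            dev = pending.popleft()
            schedule.append((dev, t, t + window))
            active.append((t + window, dev))
        t = min(end for end, _ in active)    # advance to the next GC completion
    return schedule

print(schedule_gc(["ssd0", "ssd2", "ssd5", "ssd7"], max_concurrent=2))
```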

  14. Coordination under the Shadow of Career Concerns

    DEFF Research Database (Denmark)

    Koch, Alexander; Morgenstern, Albrecht

    To innovate, firms require their employees to develop novel ideas and to coordinate with each other to turn these ideas into products, services or business strategies. Because the quality of implemented designs that employees are associated with affects their labor market opportunities, career concerns arise that can both be 'good' (enhancing incentives for effort in developing ideas) and 'bad' (preventing voluntary coordination). Depending on the strength of career concerns, either group-based incentives or team production are optimal. This finding provides a possible link between the increased...

  15. Phoenito experiments: combining the strengths of commercial crystallization automation.

    Science.gov (United States)

    Newman, Janet; Pham, Tam M; Peat, Thomas S

    2008-11-01

    The use of crystallization robots for initial screening in macromolecular crystallization is well established. This paper describes how four general optimization techniques, growth-rate modulation, fine screening, seeding and additive screening, have been adapted for automation in a medium-throughput crystallization service facility. The use of automation for more challenging optimization experiments is discussed, as is a novel way of using both the Mosquito and the Phoenix nano-dispensing robots during the setup of a single crystallization plate. This dual-dispenser technique plays to the strengths of both machines.

  16. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  17. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  18. Microcontroller for automation application

    Science.gov (United States)

    Cooper, H. W.

    1975-01-01

    The description of a microcontroller currently being developed for automation application was given. It is basically an 8-bit microcomputer with a 40K byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  19. Automated Composite Column Wrapping

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    The Automated Composite Column Wrapping is performed by a patented machine known as Robo-Wrapper. Currently there are three versions of the machine available for bridge retrofit work depending on the size of the columns being wrapped. Composite column retrofit jacket systems can be structurally just as effective as conventional steel jacketing in improving the seismic response characteristics of substandard reinforced concrete columns.

  20. Automated Web Applications Testing

    Directory of Open Access Journals (Sweden)

    Alexandru Dan CĂPRIŢĂ

    2009-01-01

    Unit tests are a vital part of several software development practices and processes such as Test-First Programming, Extreme Programming and Test-Driven Development. This article briefly presents software quality and testing concepts as well as an introduction to an automated unit testing framework for PHP web based applications.

  1. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  2. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  3. ERGONOMICS AND PROCESS AUTOMATION

    OpenAIRE

    Carrión Muñoz, Rolando; Docente de la FII - UNMSM

    2014-01-01

    The article shows the role that ergonomics plays in the automation of processes, and its importance for Industrial Engineering.

  4. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  5. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. Sometimes the computer can be controlled remotely over the internet, so that the status of the home can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classic examples of home automation, additional functionality has emerged...

  6. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought that appears to a knowledgeable reader would probably be: why this old topic again, and what is new to discuss? Yet everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the testing of applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in perspective and knowledge of people on automation has altered the terrain. This article reflects the points of view and experience of the author on how the original myths have been transformed into new versions, and how they are derived; it also provides his thoughts on the new generation of myths.

  7. Coordinated Decision Optimization Based on Hierarchical Discrete Dynamic Bayesian Network

    Institute of Scientific and Technical Information of China (English)

    唐政; 高晓光; 孙超

    2011-01-01

    Based on the hierarchical characteristics of the coordinated use of electronic countermeasures and fire attack, a hierarchical discrete dynamic Bayesian network coordinated decision model is established. The state transition probability is then determined according to the degree of coordination between electronic countermeasures and fire attack. Furthermore, a forward-backward algorithm is proposed to solve the coordinated decision problem. Finally, simulation results show that the proposed method can reduce communication transmission, shorten decision time, and greatly enhance the adaptability of decisions under dynamic and uncertain conditions.
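
    The forward-backward recursion referred to in the abstract can be illustrated on a generic two-state discrete chain; the transition and observation probabilities below are placeholders standing in for the coordination-degree-dependent probabilities of the paper.

```python
# Hedged sketch of the standard forward-backward recursion on a discrete
# two-state chain; the numbers are illustrative placeholders only.
import numpy as np

A = np.array([[0.8, 0.2],      # state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],      # observation probabilities per state
              [0.2, 0.8]])
pi = np.array([0.6, 0.4])      # initial state distribution
obs = [0, 1, 1, 0]             # observed symbols

T, N = len(obs), len(pi)
alpha = np.zeros((T, N))
beta = np.ones((T, N))

alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):                       # forward pass
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
for t in range(T - 2, -1, -1):              # backward pass
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

posterior = alpha * beta
posterior /= posterior.sum(axis=1, keepdims=True)
print(posterior)                            # smoothed state probabilities per step
```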

  8. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  9. The research and design of automated warehouse optimal management system based on RFID

    Institute of Scientific and Technical Information of China (English)

    杨玮; 鄢陈; 张志远; 张成泽; 张亚楠

    2013-01-01

    This paper introduces RFID technology and constructs the basic structure of an automated warehouse management system based on RFID, analyzing the structure and functions of the system. Through the study of a typical RFID reader, a data acquisition subsystem based on middleware technology is designed, realizing normal communication between the RFID hardware and the scheduling system software and guiding the design of warehouse management systems based on RFID technology. Through the application of RFID technology, the operational efficiency and accuracy of the automated warehouse are improved and human error is reduced.

  10. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Background: Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results: We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of 1 wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up and down-selected treatments. Conclusion: WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.
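
    The high-level fitting step, adjusting a spline model to digitized outline points, can be sketched with a standard parametric spline fit; the synthetic ellipse points below merely stand in for a wing margin, and this is not the WINGMACHINE model itself.

```python
# Hedged sketch of the fitting step: a parametric B-spline is fit to digitized
# outline points (synthetic points on a noisy ellipse stand in for a wing
# margin) and can then be evaluated densely for shape analysis.
import numpy as np
from scipy.interpolate import splprep, splev

rng = np.random.default_rng(1)
theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
x = 2.0 * np.cos(theta) + 0.02 * rng.normal(size=theta.size)
y = 1.0 * np.sin(theta) + 0.02 * rng.normal(size=theta.size)

tck, u = splprep([x, y], s=0.05)                 # parametric smoothing spline
xs, ys = splev(np.linspace(0.0, 1.0, 400), tck)  # dense resampling of the outline
print(len(xs), xs[0], ys[0])
```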

  11. Advances in Automation and Robotics, Vol 2

    CERN Document Server

    International conference on Automation and Robotics-ICAR2011

    2012-01-01

    The international conference on Automation and Robotics-ICAR2011 is held during December 12-13, 2011 in Dubai, UAE. The conference is intended to bring together the researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, production and management. This part of proceedings includes 82 papers contributed by many researchers in relevant topic areas covered at ICAR2011 from various countries such as France, Japan, USA, Korea and China etc.  The session topics of this proceedings are signal processing and industrial engineering, production and management, which includes papers about signal reconstruction, mechanical sensors, real-time systems control system identification, change detection problems, business process modeling, production planning, scheduling and control, computer-based manufacturing technologies, systems modeling and simulation, fa...

  12. Lexical evolution rates by automated stability measure

    CERN Document Server

    Petroni, Filippo

    2009-01-01

    Phylogenetic trees can be reconstructed from the matrix which contains the distances between all pairs of languages in a family. Recently, we proposed a new method which uses normalized Levenshtein distances among words with the same meaning and averages over all the items of a given list. Decisions about the number of items in the input lists for language comparison have been debated since the beginning of glottochronology. The point is that words associated with some of the meanings have a rapid lexical evolution. Therefore, a large vocabulary comparison is only apparently more accurate than a smaller one, since many of the words do not carry any useful information. In principle, one should find the optimal length of the input lists by studying the stability of the different items. In this paper we tackle the problem with an automated methodology based only on our normalized Levenshtein distance. With this approach, the program of an automated reconstruction of language relationships is completed.
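
    The distance measure described here is concrete enough to sketch directly: a Levenshtein distance normalized by the longer word's length, averaged over a list of meanings. The tiny word lists below are illustrative only.

```python
# Hedged sketch of the distance described: Levenshtein distance between two
# words, normalized by the longer word's length, averaged over a meaning list.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def normalized_distance(w1, w2):
    return levenshtein(w1, w2) / max(len(w1), len(w2))

def language_distance(list1, list2):
    return sum(normalized_distance(a, b) for a, b in zip(list1, list2)) / len(list1)

italian = ["acqua", "cane", "madre", "notte"]
spanish = ["agua", "perro", "madre", "noche"]
print(language_distance(italian, spanish))
```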

  13. Coordinating Interactions: The Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    The purpose of a domain model is to concisely capture the concepts of an application's domain, and their relation among each other. Even though the main purpose of domain models is not on implementing the application, major parts of an application can be generated from the application's domain... The Event Coordination Notation (ECNO) allows modelling the behaviour of an application on a high level of abstraction that is closer to the application's domain than to the software realizing it. Still, these models contain all necessary details for actually executing... The implementation of ECNO consists of a modelling environment based on Eclipse and the Eclipse Modeling Framework (EMF) and an execution engine, which fully supports all the concepts and features of ECNO discussed in this technical report. All the examples are based on EMF, but the ECNO Engine can be used...

  14. Minimization of Distribution Grid Losses by Consumption Coordination

    DEFF Research Database (Denmark)

    Juelsgaard, Morten; Andersen, Palle; Wisniewski, Rafal

    2013-01-01

    In this work, we address the problem of optimizing the electrical consumption patterns for a community of closely located households, with a large degree of flexible consumption, and further some degree of local electricity production from solar panels. We describe optimization methods for coordinating consumption of electrical energy within the community, with the purpose of reducing grid loading and active power losses. For this we present a simplified model of the electrical grid, including system losses and capacity constraints. Coordination is performed in a distributed fashion, where each ... are obeyed. These objectives are enforced by coordinating consumers through nonlinear tariffs on power consumption. We present simulation test-cases, illustrating that significant reduction of active losses can be obtained by such coordination. The distributed optimization algorithm employs the alternating...
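
    A toy sketch of tariff-driven distributed coordination follows: households iteratively move their flexible load to the cheapest hours under a nonlinear tariff, flattening the community profile. It is a best-response illustration under invented numbers, not the paper's grid-loss formulation or its alternating-direction algorithm.

```python
# Hedged sketch of price-based coordination: each household repeatedly picks
# the cheapest hour for its flexible load given everyone else's choices, while
# a nonlinear tariff penalizes peaks. Toy model with invented numbers.
import numpy as np

hours = 24
base_load = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, hours, endpoint=False))
demands = [2.0] * 10                      # each household has 2 kWh of flexible demand
slot = [0] * len(demands)                 # hour currently chosen by each household

def tariff(total):
    return 1.0 + 0.2 * total ** 2         # nonlinear tariff rises with local loading

for _ in range(20):                       # iterative best-response coordination
    for i, d in enumerate(demands):
        flexible = np.zeros(hours)
        for j, dj in enumerate(demands):
            if j != i:
                flexible[slot[j]] += dj
        price = tariff(base_load + flexible)
        slot[i] = int(np.argmin(price))   # move my load to the cheapest hour

profile = base_load.copy()
for i, d in enumerate(demands):
    profile[slot[i]] += d
print(np.round(profile, 2))               # coordinated community load profile
```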

  15. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component dampening and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  16. Introduction to Coordination Chemistry

    CERN Document Server

    Lawrance, Geoffrey Alan

    2010-01-01

    Introduction to Coordination Chemistry examines and explains how metals and molecules that bind as ligands interact, and the consequences of this assembly process. This book describes the chemical and physical properties and behavior of the complex assemblies that form, and applications that may arise as a result of these properties. Coordination complexes are an important but often hidden part of our world, even part of us, and what they do is probed in this book. This book distills the essence of this topic for undergraduate students and for research scientists.

  17. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    Language has been defined as a social coordination device (Clark 1996) enabling innovative modalities of joint action. However, the exact coordinative dynamics over time and their effects are still insufficiently investigated and quantified. Relying on the data produced in a collective decision... we employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  18. Automation technology saves 30% energy; Automatisierungstechnik spart 30% Energie ein

    Energy Technology Data Exchange (ETDEWEB)

    Klinkow, Torsten; Meyer, Michael [Wago Kontakttechnik GmbH und Co. KG, Minden (Germany)

    2013-04-01

    A systematic energy management is in more demand than ever in order to reduce the increasing energy costs. What used to be a difficult puzzle consisting of different technology components in the early days is today easier to solve by means of a standardized and cost-effective automation technology. With its IO system, Wago Kontakttechnik GmbH and Co. KG (Minden, Federal Republic of Germany) supplies a complete and coordinated portfolio for the energy efficiency.

  19. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  20. Rapid automated nuclear chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, R.A.

    1979-05-31

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC.

  1. The automation of science.

    Science.gov (United States)

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-03

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge.

  2. Automated Layout Generation of Analogue and Mixed-Signal ASIC's

    DEFF Research Database (Denmark)

    Bloch, Rene

    The research and development carried out in this Ph.D. study focuses on two key areas of the design flow for analogue and mixed-signal integrated circuit design: mixed-signal floorplanning and analogue layout generation. A novel approach to floorplanning is presented which provides true interactive floorplanning capabilities due to a new implementation variant of a Genetic Algorithm. True interactive floorplanning allows the designer to communicate with existing floorplans during optimization. By entering the "ideas" and expertise of the designer into the optimization algorithm the automated ... flow. A new design flow for automated layout generation of general analogue integrated circuits is presented. The design flow provides an automated design path from a sized circuit schematic to the final layout containing the placed, but unrouted, devices of the circuit. The analogue circuit layout...

  3. The Automated Medical Office

    OpenAIRE

    1990-01-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation forces physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a c...

  4. Automation of printing machine

    OpenAIRE

    Sušil, David

    2016-01-01

    This bachelor thesis focuses on the automation of a printing machine and a comparison of two types of printing machines. The first chapter deals with the history of printing, typesetting, printing techniques and various kinds of bookbinding. The second chapter describes the difference between sheet-fed printing machines and offset printing machines, the difference between two representatives of rotary machines, the technological process of the products on these machines, the description of the mac...

  5. Automated Cooperative Trajectories

    Science.gov (United States)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  6. Optimal Real-time Dispatch for Integrated Energy Systems: An Ontology-Based Multi-Agent Approach

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Rahimi-Kian, Ashkan; Mirian, Maryam S.;

    2016-01-01

    With the emergence of small-scale integrated energy systems (IESs), there are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level distributed energy resources (DERs). By integrating DSM and DERs into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DERs management. The optimal dispatch problem is formulated as a mixed integer nonlinear programming problem (MINLP...

  7. Explicit Spin Coordinates

    CERN Document Server

    Hunter, G; Hunter, Geoffrey; Schlifer, Ian

    2005-01-01

    The recently established existence of spherical harmonic functions, $Y_\\ell^{m}(\\theta,\\phi)$ for half-odd-integer values of $\\ell$ and $m$, allows for the introduction into quantum chemistry of explicit electron spin-coordinates; i.e. spherical polar angles $\\theta_s, \\phi_s$, that specify the orientation of the spin angular momentum vector in space. In this coordinate representation the spin angular momentum operators, $S^2, S_z$, are represented by the usual differential operators in spherical polar coordinates (commonly used for $L^2, L_z$), and their electron-spin eigenfunctions are $\\sqrt{\\sin\\theta_s} \\exp(\\pm\\phi_s/2)$. This eigenfunction representation has the pedagogical advantage over the abstract spin eigenfunctions, $\\alpha, \\beta,$ that ``integration over spin coordinates'' is a true integration (over the angles $\\theta_s, \\phi_s$). In addition they facilitate construction of many electron wavefunctions in which the electron spins are neither parallel nor antiparallel, but inclined at an interme...

  8. Block coordination copolymers

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2014-11-11

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  9. Coordinating Work with Groupware

    DEFF Research Database (Denmark)

    Pors, Jens Kaaber; Simonsen, Jesper

    2003-01-01

    One important goal of employing groupware is to make possible complex collaboration between geographically distributed groups. This requires a dual transformation of both technology and work practice. The challenge is to reduce the complexity of the coordination work by successfully integrating...

  10. Recursive Advice for Coordination

    DEFF Research Database (Denmark)

    Terepeta, Michal Tomasz; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Aspect-oriented programming is a programming paradigm that is often praised for the ability to create modular software and separate cross-cutting concerns. Recently aspects have been also considered in the context of coordination languages, offering similar advantages. However, introducing aspects...

  11. Block coordination copolymers

    Science.gov (United States)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2012-11-13

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  12. Contaminant analysis automation demonstration proposal

    Energy Technology Data Exchange (ETDEWEB)

    Dodson, M.G.; Schur, A.; Heubach, J.G.

    1993-10-01

    The nation-wide and global need for environmental restoration and waste remediation (ER&WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER&WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, the need for significantly increased productivity faces a contaminant analysis process that is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. This automated process also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration provides affirming answers to the following questions about laboratory automation: Can preparation of contaminants be successfully automated?; Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples?; Can the automated processes be seamlessly integrated and controlled?; Can the automated laboratory be customized through readily convertible design?; and Can automated sample preparation concepts be extended to the other phases of the sample analysis process? To fully reap the benefits of automation, four human factors areas should be studied and the outputs used to increase the efficiency of laboratory automation. These areas include: (1) laboratory configuration, (2) procedures, (3) receptacles and fixtures, and (4) the human-computer interface for the full automated system and complex laboratory information management systems.

  13. Automated generation of curved planar reformations from MR images of the spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Ourselin, Sebastien [CSIRO ICT Centre, Autonomous Systems Laboratory, BioMedIA Lab, Locked Bag 17, North Ryde, NSW 2113 (Australia); Gomes, Lavier [Department of Radiology, Westmead Hospital, University of Sydney, Hawkesbury Road, Westmead NSW 2145 (Australia); Likar, Bostjan [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia); Pernus, Franjo [Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2007-05-21

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.

  14. Automated generation of curved planar reformations from MR images of the spine

    Science.gov (United States)

    Vrtovec, Tomaz; Ourselin, Sébastien; Gomes, Lavier; Likar, Boštjan; Pernuš, Franjo

    2007-05-01

    A novel method for automated curved planar reformation (CPR) of magnetic resonance (MR) images of the spine is presented. The CPR images, generated by a transformation from image-based to spine-based coordinate system, follow the structural shape of the spine and allow the whole course of the curved anatomy to be viewed in individual cross-sections. The three-dimensional (3D) spine curve and the axial vertebral rotation, which determine the transformation, are described by polynomial functions. The 3D spine curve passes through the centres of vertebral bodies, while the axial vertebral rotation determines the rotation of vertebrae around the axis of the spinal column. The optimal polynomial parameters are obtained by a robust refinement of the initial estimates of the centres of vertebral bodies and axial vertebral rotation. The optimization framework is based on the automatic image analysis of MR spine images that exploits some basic anatomical properties of the spine. The method was evaluated on 21 MR images from 12 patients and the results provided a good description of spine anatomy, with mean errors of 2.5 mm and 1.7° for the position of the 3D spine curve and axial rotation of vertebrae, respectively. The generated CPR images are independent of the position of the patient in the scanner while comprising both anatomical and geometrical properties of the spine.
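
    As a rough illustration of the spine-based coordinate transformation described above, the following Python sketch fits low-order polynomials to hypothetical vertebral body centres and samples the resulting 3D spine curve; the centre coordinates, polynomial degree and function names are illustrative assumptions, not values from the study.

        import numpy as np

        # Hypothetical vertebral body centre estimates (x, y, z) in image coordinates (mm).
        centres = np.array([
            [10.0,  5.0,   0.0],
            [11.5,  7.0,  30.0],
            [12.0, 10.0,  60.0],
            [11.0, 14.0,  90.0],
            [ 9.0, 17.0, 120.0],
        ])

        # Parameterise the 3D spine curve by the cranio-caudal coordinate z and fit
        # low-order polynomials x(z), y(z), in the spirit of the CPR transformation above.
        z = centres[:, 2]
        coeff_x = np.polyfit(z, centres[:, 0], deg=3)
        coeff_y = np.polyfit(z, centres[:, 1], deg=3)

        def spine_curve(zq):
            """Return the (x, y, z) point of the fitted spine curve at height zq."""
            return np.array([np.polyval(coeff_x, zq), np.polyval(coeff_y, zq), zq])

        # Sample the curve to define cross-sectional planes for the reformatted images.
        for zq in np.linspace(z.min(), z.max(), 5):
            print(np.round(spine_curve(zq), 2))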

  15. Two-Stage Optimization Model Based Coordinated Charging for EV Charging Station

    Institute of Scientific and Technical Information of China (English)

    张良; 严正; 冯冬涵; 许少伦; 李乃湖; 景雷

    2014-01-01

    Under the premise of satisfying the charging demand of electric vehicles (EVs) and complying with the restriction of distribution transformer capacity, a first-stage optimal EV charging model is established that takes the maximized charging revenue of the charging station as its objective. Considering maximization of the incentive given by the grid corporation for reducing the peak-valley difference, and taking as a constraint a charging revenue not lower than that obtained by the first-stage optimization, a second-stage optimization model is built. Based on the driving habits of EV users, their charging demand is simulated by the Monte Carlo method, and the economic benefit of the charging station and the load condition of the distribution transformer are simulated and analyzed under three situations: uncoordinated charging, charging under the first-stage optimization model, and charging under the two-stage optimization model. The results show that both optimization models evidently improve the economic benefit of the charging station. However, under the current time-of-use (TOU) mechanism, a new peak load occurs when only the first-stage optimization model is used to control the charging of many EVs, whereas the improved two-stage optimization model further increases the economic benefit of the charging station, reduces the peak-valley difference and smooths the load curves; moreover, its computational cost is still low, so it is suitable for practical application.
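
    The following toy sketch (assuming the cvxpy package, with invented prices, demands and capacity limits) illustrates the two-stage idea described above: stage one maximizes charging revenue under the transformer limit, and stage two flattens the load profile while keeping that revenue.

        import cvxpy as cp
        import numpy as np

        # Illustrative two-stage sketch (toy data, not the paper's full model).
        T, n_ev = 24, 3
        sell = np.full(T, 1.2)                              # tariff charged to EV users
        tou = np.array([0.4]*8 + [0.9]*8 + [0.6]*8)         # time-of-use purchase price
        demand = np.array([20.0, 15.0, 25.0])               # kWh each EV must receive
        p_max, cap = 7.0, 15.0                              # per-EV and transformer limits (kW)

        x = cp.Variable((n_ev, T), nonneg=True)             # charging power schedule
        base = [x <= p_max, cp.sum(x, axis=0) <= cap, cp.sum(x, axis=1) == demand]
        revenue = (sell - tou) @ cp.sum(x, axis=0)

        # Stage 1: maximise the station's charging revenue.
        stage1 = cp.Problem(cp.Maximize(revenue), base)
        stage1.solve()
        best_revenue = stage1.value

        # Stage 2: flatten the load (minimise peak-valley difference) without giving up revenue.
        load = cp.sum(x, axis=0)
        stage2 = cp.Problem(cp.Minimize(cp.max(load) - cp.min(load)),
                            base + [revenue >= best_revenue - 1e-6])
        stage2.solve()
        print("peak-valley difference (kW):", round(float(stage2.value), 2))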

  16. Quantification of Aromaticity Based on Interaction Coordinates: A New Proposal.

    Science.gov (United States)

    Pandey, Sarvesh Kumar; Manogaran, Dhivya; Manogaran, Sadasivam; Schaefer, Henry F

    2016-05-12

    Attempts to establish degrees of aromaticity in molecules are legion. In the present study, we begin with a fictitious fragment arising from only those atoms contributing to the aromatic ring and having a force field projected from the original system. For example, in benzene, we adopt a fictitious C6 fragment with a force field projected from the full benzene force field. When one bond or angle is stretched and kept fixed, followed by a partial optimization for all other internal coordinates, structures change from their respective equilibria. These changes are the responses of all other internal coordinates for constraining the bond or angle by unit displacements and relaxing the forces on all other internal coordinates. The "interaction coordinate" derived from the redundant internal coordinate compliance constants measures how a bond (its electron density) responds for constrained optimization when another bond or angle is stretched by a specified unit (its electron density is perturbed by a finite amount). The sum of interaction coordinates (responses) of all bonded neighbors for all internal coordinates of the fictitious fragment is a measure of the strength of the σ and π electron interactions leading to aromatic stability. This sum, based on interaction coordinates, appears to be successful as an aromaticity index for a range of chemical systems. Since the concept involves analyzing a fragment rather than the whole molecule, this idea is more general and is likely to lead to new insights.

  17. MULTIDISCIPLINARY ROBUST OPTIMIZATION DESIGN

    Institute of Scientific and Technical Information of China (English)

    Chen Jianjiang; Xiao Renbin; Zhong Yifang; Dou Gang

    2005-01-01

    Because uncertainty factors inevitably exist in a multidisciplinary design environment, a hierarchical multidisciplinary robust optimization design based on response surfaces is proposed. The method constructs optimization models at the subsystem level and the system level to coordinate the coupling among subsystems, and a response surface based on an artificial neural network is introduced to provide information for the system-level optimization tool and to maintain the independence of the subsystems, i.e. to realize multidisciplinary parallel design. An application case in electrical packaging demonstrates that a reasonable robust optimum solution can be obtained and that this is a promising and efficient multidisciplinary robust optimization approach.

  18. Automated pulmonary nodule volumetry with an optimized algorithm - accuracy at different slice thicknesses compared to unidimensional and bidimentional measurements; Lungenrundherdvolumetrie mit optimiertem Segmentierungsalgorithmus - Genauigkeit bei verschiedenen Schichtdicken verglichen mit ein- und zweidimensionalen Messungen

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, M.N.; Schmuecker, S.; Maksimovich, O.; Claussen, C.D.; Horger, M. [Diagnostische und Interventionelle Radiologie, Universitaetsklinikum Tuebingen (Germany); Vonthein, R. [Institut fuer Medizinische Biometrie, Universitaetsklinikum Tuebingen (Germany); Bethge, W. [Innere Medizin 2, Universitaetsklinikum Tuebingen (Germany); Dicken, V. [Research GmbH, MeVis (Germany)

    2008-09-15

    Purpose: This in-vivo study quantifies the accuracy of automated pulmonary nodule volumetry in reconstructions with different slice thicknesses (ST) of clinical routine CT scans. The accuracy of volumetry is compared to that of unidimensional and bidimensional measurements. Materials and Methods: 28 patients underwent contrast enhanced 64-row CT scans of the chest and abdomen obtained in the clinical routine. All scans were reconstructed with 1, 3, and 5 mm ST. Volume, maximum axial diameter, and areas following the guidelines of Response Evaluation Criteria in Solid Tumors (RECIST) and the World Health Organization (WHO) were measured in all 101 lesions located in the overlap region of both scans using the new software tool OncoTreat (MeVis, Germany). The accuracy of quantifications in both scans was evaluated using the Bland and Altman method. The reproducibility of measurements in dependence on the ST was compared using the likelihood ratio Chi-squared test. Results: A total of 101 nodules were identified in all patients. Segmentation was considered successful in 88.1% of the cases without local manual correction, which was deliberately not employed in this study. For 80 nodules all 6 measurements were successful. These were statistically evaluated. The volumes were in the range 0.1 to 15.6 ml. Of all 80 lesions, 34 (42%) had direct contact with the pleura parietalis or diaphragmalis and were termed parapleural, 32 (40%) were paravascular, 7 (9%) both parapleural and paravascular, and the remaining 21 (27%) were free-standing in the lung. The trueness differed significantly (Chi-square 7.22, p value 0.027) and was best with an ST of 3 mm and worst at 5 mm. Differences in precision were not significant (Chi-square 5.20, p value 0.074). The limits of agreement for an ST of 3 mm were ± 17.5% of the mean volume for volumetry, ± 1.3 mm for maximum diameters, and ± 31.8% for the calculated areas. Conclusion: Automated volumetry of pulmonary nodules using Onco

  19. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    will respond to this form of automation for CPP. (4) Evaluate what type of DR shifting and shedding strategies can be automated. (5) Explore how automation of control strategies can increase participation rates and DR saving levels with CPP. (6) Identify optimal demand response control strategies. (7) Determine occupant and tenant response.

  20. Evolution of technology for automated peritoneal dialysis.

    Science.gov (United States)

    Ronco, Claudio; Amerling, Richard; Dell'aquila, Roberto; Rodighiero, Maria Pia; Di Loreto, Pierluigi

    2006-01-01

    Automated peritoneal dialysis (APD) is important for the further penetration of PD in the dialysis marketplace. Long dwell, equilibration PD (CAPD) has limited applicability in many patients due to inadequate solute clearance or fast membrane transport characteristics. Providing large volumes of dialysate over circumscribed hours is highly labor intensive without an automated system. Early attempts at APD were crude but effective in reducing labor, which was generally provided by nursing staff. Later evolution of PD technology has been greatly accelerated by the microchip, and by miniaturization of components. Current generation machines allow individualized fill volumes, variable tidal volumes and additional daytime automated exchanges, teledialysis, memorized delivery control, and full portability. The ideal machine should not only be able to perform all treatment schedules, but it should also optimize the performance of a selected treatment strategy. Biocompatible solutions, improved osmotic agents, and sorbent technology are all adaptable to APD. The eventual evolution toward continuous flow PD will resolve many of the current problems with both CAPD and APD.

  1. Automated sample preparation for CE-SDS.

    Science.gov (United States)

    Le, M Eleanor; Vizel, Alona; Hutterer, Katariina M

    2013-05-01

    Traditionally, CE with SDS (CE-SDS) places many restrictions on sample composition. Requirements include low salt content, known initial sample concentration, and a narrow window of final sample concentration. As these restrictions require buffer exchange for many sample types, sample preparation is often tedious and yields poor sample recoveries. To improve capacity and streamline sample preparation, an automated robotic platform was developed using the PhyNexus Micro-Extractor Automated Instrument (MEA) for both the reduced and nonreduced CE-SDS assays. This automated sample preparation normalizes sample concentration, removes salts and other contaminants, and adds the required CE-SDS reagents, essentially eliminating manual steps during sample preparation. Fc-fusion proteins and monoclonal antibodies were used in this work to demonstrate benefits of this approach when compared to the manual method. With optimized conditions, this application has demonstrated decreased analyst "hands on" time and reduced total assay time. Sample recovery greater than 90% can be achieved, regardless of initial composition and concentration of analyte.

  2. Greater Buyer Effectiveness through Automation

    Science.gov (United States)

    1989-01-01

    Abbreviations: FOB = free on board; FPAC = Federal Procurement Automation Council; FPDS = Federal Procurement Data System; 4GL = fourth generation language; GAO = General... Procurement Automation Council (FPAC), entitled Compendium of Automated Procurement Systems in Federal Agencies. The FPAC inventory attempted to identify... In some cases we have updated descriptions of systems identified by the FPAC study, but many of the newer systems are identified here for the first

  3. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... SECURITY U.S. Customs and Border Protection Modification of National Customs Automation Program Test... National Customs Automation Program (NCAP) test concerning the Simplified Entry functionality in the...'s (CBP's) National Customs Automation Program (NCAP) test concerning Automated...

  4. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning...: General notice. SUMMARY: This notice announces modifications to the National Customs Automation Program...) National Customs Automation Program (NCAP) test concerning Automated Commercial Environment...

  5. Automation of a Versatile Crane (the LSMS) for Lunar Outpost Construction, Maintenance and Inspection

    Science.gov (United States)

    Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.

    2009-01-01

    , greatly expanding the operational versatility of the LSMS. This paper develops the equations describing the forward and inverse relation between LSMS joint angles and Cartesian coordinates of the LSMS tip. These equations allow a variety of schemes to be used to maneuver the LSMS to optimize the maneuver. One such scheme will be described in detail that eliminates undesirable swinging of the payload at the conclusion of a maneuver, even when the payload is suspended from a passive rigid link. The swinging is undesirable when performing precision maneuvers, such as aligning an object for mating or positioning a camera. Use of the equations described here enables automated control of the LSMS greatly improving its operational versatility.
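
    As an illustration of the joint-angle-to-tip mapping mentioned above, a minimal forward-kinematics sketch for a two-link slewing crane is given below; the link lengths, angle conventions and function name are illustrative assumptions rather than the actual LSMS geometry.

        import numpy as np

        # Illustrative link lengths (m) for a two-link crane arm; not the actual LSMS geometry.
        L_BOOM, L_JIB = 6.0, 4.0

        def lsms_tip(shoulder, elbow, slew):
            """Forward kinematics: three joint angles (rad) -> Cartesian tip coordinates.

            shoulder/elbow are in-plane rotations of boom and jib; slew rotates the whole
            arm about the vertical axis.
            """
            r = L_BOOM * np.cos(shoulder) + L_JIB * np.cos(shoulder + elbow)   # horizontal reach
            z = L_BOOM * np.sin(shoulder) + L_JIB * np.sin(shoulder + elbow)   # height
            return np.array([r * np.cos(slew), r * np.sin(slew), z])

        print(np.round(lsms_tip(np.radians(40), np.radians(-60), np.radians(15)), 2))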

  6. Global Coordinate System

    Science.gov (United States)

    1985-02-01

    Time in hours at 0h UT is GAST (hours) = GMST + E (41). GAST in radians is GAST_0 (radians) = GAST (hours) · L (42). The angle θ required for transforming inertial coordinates to ECEF is θ (radians) = GAST_0 + 6.300388099 · (t_i − t_0UT), Mod 2π (43), where t_i − t_0UT = (JD − 2.4 × 10^6) − (JD_0E − 2.4 ×
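
    A minimal sketch of the inertial-to-ECEF rotation implied by these relations is given below; the rotation-rate constant is taken from the (reconstructed) record, while the function name and example values are assumptions.

        import numpy as np

        EARTH_ROT_RATE = 6.300388099  # rad per day, as in the reconstructed relation above

        def eci_to_ecef(r_eci, gast0_rad, days_since_epoch):
            """Rotate an inertial (ECI) position vector into the ECEF frame.

            gast0_rad: Greenwich apparent sidereal time at the epoch, in radians.
            days_since_epoch: elapsed time t_i - t_0UT in days.
            """
            theta = (gast0_rad + EARTH_ROT_RATE * days_since_epoch) % (2.0 * np.pi)
            c, s = np.cos(theta), np.sin(theta)
            # Rotation about the z (polar) axis by the Earth rotation angle theta.
            rot = np.array([[ c,   s,   0.0],
                            [-s,   c,   0.0],
                            [0.0, 0.0,  1.0]])
            return rot @ np.asarray(r_eci)

        # Example: a point on the inertial x-axis, a quarter of a day later.
        print(eci_to_ecef([7000.0, 0.0, 0.0], gast0_rad=0.0, days_since_epoch=0.25))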

  7. World-wide distribution automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems.

  8. Motion Planning Based Coordinated Control for Hydraulic Excavators

    Institute of Scientific and Technical Information of China (English)

    GAO Yingjie; JIN Yanchao; ZHANG Qin

    2009-01-01

    The hydraulic excavator is one of the most widely applied types of construction equipment, mainly because of its versatility and mobility. Among the tasks performed by a hydraulic excavator, repeatable level digging or flat surface finishing may take a large percentage. Using automated functions to perform such repeatable and tedious jobs will not only greatly increase overall productivity but, more importantly, also improve operational safety. For the purpose of investigating the technology without loss of generality, this research creates a coordinated control method for the boom, arm and bucket cylinders on a hydraulic excavator to perform accurate and effective work. On the basis of the kinematic analysis of the excavator linkage system, the tip trajectory of the end-effector can be determined in terms of the coordinated motion of the three hydraulic cylinders with a visualized method. The coordination of those hydraulic cylinders is realized by controlling three electro-hydraulic proportional valves in a coordinated manner. Therefore, the complex control algorithm of a hydraulic excavator can be simplified into coordinated motion control of three individual systems. This coordinated control algorithm was validated on a wheeled hydraulic excavator, and the validation results indicated that the developed control method could satisfactorily accomplish the auto-digging function for level digging or flat surface finishing.
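
    As an illustration of resolving a desired bucket-tip trajectory into coordinated joint motions, the sketch below uses planar forward kinematics and a damped least-squares inverse for a three-link boom-arm-bucket chain; link lengths, gains and the reduction from cylinder strokes to joint angles are simplifications not taken from the paper.

        import numpy as np

        # Link lengths (m) for boom, arm and bucket; illustrative values only.
        L1, L2, L3 = 5.7, 2.9, 1.5

        def tip_position(q):
            """Planar forward kinematics: joint angles (rad) -> bucket tip (x, z)."""
            a1, a12, a123 = q[0], q[0] + q[1], q[0] + q[1] + q[2]
            x = L1*np.cos(a1) + L2*np.cos(a12) + L3*np.cos(a123)
            z = L1*np.sin(a1) + L2*np.sin(a12) + L3*np.sin(a123)
            return np.array([x, z])

        def jacobian(q, eps=1e-6):
            """Numerical Jacobian of the tip position w.r.t. the three joint angles."""
            J = np.zeros((2, 3))
            for i in range(3):
                dq = np.zeros(3); dq[i] = eps
                J[:, i] = (tip_position(q + dq) - tip_position(q - dq)) / (2*eps)
            return J

        def track(points, q0, iters=50, damping=1e-2):
            """Damped least-squares IK: joint-angle set-points along a digging path."""
            q, plan = np.array(q0, float), []
            for target in points:
                for _ in range(iters):
                    err = target - tip_position(q)
                    J = jacobian(q)
                    dq = J.T @ np.linalg.solve(J @ J.T + damping*np.eye(2), err)
                    q = q + dq
                plan.append(q.copy())
            return plan

        # Flat-surface finishing: sweep the tip along a horizontal line at z = -1 m.
        path = [np.array([x, -1.0]) for x in np.linspace(6.0, 4.0, 5)]
        for q in track(path, q0=[0.6, -1.2, -0.8]):
            print(np.round(np.degrees(q), 1))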

  9. Universal mechatronics coordinator

    Science.gov (United States)

    Muir, Patrick F.

    1999-11-01

    Mechatronic systems incorporate multiple actuators and sensors which must be properly coordinated to achieve the desired system functionality. Many mechatronic systems are designed as one-of-a-kind custom projects without consideration for facilitating future systems or alterations and extensions to the current system. Thus, subsequent changes to the system are slow, difficult, and costly. It has become apparent that manufacturing processes, and thus the mechatronics which embody them, need to be agile in order to more quickly and easily respond to changing customer demands or market pressures. To achieve agility, both the hardware and software of the system need to be designed such that the creation of new systems and the alteration and extension of current systems are fast and easy. This paper describes the design of a Universal Mechatronics Coordinator (UMC) which facilitates agile setup and changeover of coordination software for mechatronic systems. The UMC is capable of sequencing continuous and discrete actions that are programmed as stimulus-response pairs, as state machines, or a combination of the two. It facilitates the modular, reusable programming of continuous actions such as servo control algorithms, data collection code, and safety checking routines, and of discrete actions such as reporting achieved states and turning on/off binary devices. The UMC has been applied to the control of a z-theta assembly robot for the Minifactory project and is applicable to a spectrum of widely differing mechatronic systems.
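
    A minimal sketch of the stimulus-response/state-machine style of coordination described above is given below; the class layout, rule format and example stimuli are assumptions for illustration, not the UMC's actual interfaces.

        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class Coordinator:
            """Tiny stimulus-response coordinator in the spirit of the UMC description.

            Discrete actions are triggered by named stimuli; continuous actions run
            every cycle (e.g. servo loops, data collection, safety checks).
            """
            state: str = "idle"
            continuous: List[Callable[[], None]] = field(default_factory=list)
            rules: Dict[tuple, tuple] = field(default_factory=dict)  # (state, stimulus) -> (next_state, action)

            def on(self, state, stimulus, next_state, action):
                self.rules[(state, stimulus)] = (next_state, action)

            def signal(self, stimulus):
                key = (self.state, stimulus)
                if key in self.rules:
                    self.state, action = self.rules[key]
                    action()

            def cycle(self):
                for task in self.continuous:
                    task()

        # Example: coordinating a gripper axis with a recurring safety check.
        umc = Coordinator()
        umc.continuous.append(lambda: print("servo update / safety check"))
        umc.on("idle", "part_detected", "gripping", lambda: print("close gripper"))
        umc.on("gripping", "grip_confirmed", "idle", lambda: print("report achieved state"))

        umc.cycle()
        umc.signal("part_detected")
        umc.signal("grip_confirmed")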

  10. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code makes use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its unce...

  11. Automating CPM-GOMS

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2002-01-01

    CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. Of the approaches available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators and to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the

  12. AUTOMATED API TESTING APPROACH

    Directory of Open Access Journals (Sweden)

    SUNIL L. BANGARE

    2012-02-01

    Full Text Available Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. With the help of software testing we can verify or validate the software product. Testing is normally done after the software has been developed, but it can also be performed during the development process. This paper gives a brief introduction to an automated API testing tool. Such testing reduces much of the effort that would otherwise follow the development of the software, and it saves both time and money. This type of testing is useful in industry as well as in academia.
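
    A minimal example of an automated API test in the spirit described above (assuming the Python requests library and a hypothetical items endpoint) might look as follows.

        import requests

        BASE_URL = "https://api.example.com"   # hypothetical service under test

        def test_create_and_fetch_item():
            """Exercise a create/read cycle through the public HTTP API."""
            created = requests.post(f"{BASE_URL}/items", json={"name": "widget"}, timeout=5)
            assert created.status_code == 201
            item_id = created.json()["id"]

            fetched = requests.get(f"{BASE_URL}/items/{item_id}", timeout=5)
            assert fetched.status_code == 200
            assert fetched.json()["name"] == "widget"

        if __name__ == "__main__":
            test_create_and_fetch_item()
            print("API smoke test passed")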

  13. The automated medical office.

    Science.gov (United States)

    Petreman, M

    1990-08-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation forces physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency.

  14. [Automated anesthesia record system].

    Science.gov (United States)

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a client/server architecture, an automated anesthesia record software system running under the Windows operating system on a network has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system can manage the patient's information throughout anesthesia. It can automatically collect and integrate, in real time, data from several kinds of medical equipment such as monitors, infusion pumps and anesthesia machines. The system then generates the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and complete and can raise the anesthesiologist's working efficiency.

  15. Optimal shapes for self-propelled swimmers

    Science.gov (United States)

    Koumoutsakos, Petros; van Rees, Wim; Gazzola, Mattia

    2011-11-01

    We optimize swimming shapes of three-dimensional self-propelled swimmers by combining the CMA Evolution Strategy with a remeshed vortex method. We analyze the robustness of optimal shapes and discuss the near-wake vortex dynamics for optimal speed and efficiency at Re=550. We also report preliminary results of optimal shapes and arrangements for multiple coordinated swimmers.
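
    A sketch of a CMA-ES shape-optimization loop (assuming the pycma package and a stand-in objective in place of the flow solver) is shown below.

        import cma  # pycma package, assumed available

        def drag_proxy(shape_params):
            """Stand-in objective: in the paper this would be a flow simulation
            returning (negative) swimming speed or efficiency for a candidate shape."""
            return sum((p - 0.3) ** 2 for p in shape_params)

        # Eight shape parameters (e.g. widths of a deformable body), initial guess 0.5, step 0.2.
        es = cma.CMAEvolutionStrategy(8 * [0.5], 0.2)
        best_x, best_f = None, float("inf")
        while not es.stop():
            candidates = es.ask()                      # sample a new generation of shapes
            fitnesses = [drag_proxy(x) for x in candidates]
            es.tell(candidates, fitnesses)             # update covariance and step size
            i = int(min(range(len(fitnesses)), key=fitnesses.__getitem__))
            if fitnesses[i] < best_f:
                best_x, best_f = candidates[i], fitnesses[i]

        print("best shape parameters:", [round(v, 3) for v in best_x])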

  16. Symmetric two-coordinate photodiode

    Directory of Open Access Journals (Sweden)

    Dobrovolskiy Yu. G.

    2008-12-01

    Full Text Available A two-coordinate photodiode based on the longitudinal photoeffect is developed and investigated; it yields coordinate characteristics that are symmetric in steepness and longitudinal resistance to high accuracy. It is shown that the best form of the coordinate characteristic is observed when the optical probe scans the central part of the photosensitive element. Ways of improving the steepness and linearity of its coordinate characteristic are analyzed.

  17. Invariant Manifolds and Collective Coordinates

    CERN Document Server

    Papenbrock, T

    2001-01-01

    We introduce suitable coordinate systems for interacting many-body systems with invariant manifolds. These are Cartesian in coordinate and momentum space and chosen such that several components are identically zero for motion on the invariant manifold. In this sense these coordinates are collective. We make a connection to Zickendraht's collective coordinates and present certain configurations of few-body systems where rotations and vibrations decouple from single-particle motion. These configurations do not depend on details of the interaction.

  18. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
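
    As a loose illustration of automated ion routing over a multi-zone trap, the following sketch performs a breadth-first search on a hypothetical zone-adjacency graph; it is not the system described in the chapter.

        from collections import deque

        # Hypothetical multi-zone surface trap, modelled as a graph of adjacent zones.
        TRAP = {
            "load": ["junction"],
            "junction": ["load", "gate_A", "gate_B"],
            "gate_A": ["junction"],
            "gate_B": ["junction"],
        }

        def route_ion(start, goal, occupied=frozenset()):
            """Breadth-first search for a shortest transport path that avoids occupied zones."""
            queue, seen = deque([[start]]), {start}
            while queue:
                path = queue.popleft()
                if path[-1] == goal:
                    return path
                for nxt in TRAP[path[-1]]:
                    if nxt not in seen and nxt not in occupied:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None  # no route; e.g. trigger re-initialization after ion loss

        print(route_ion("load", "gate_B", occupied={"gate_A"}))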

  19. Automated Postediting of Documents

    CERN Document Server

    Knight, K; Knight, Kevin; Chander, Ishwar

    1994-01-01

    Large amounts of low- to medium-quality English texts are now being produced by machine translation (MT) systems, optical character readers (OCR), and non-native speakers of English. Most of this text must be postedited by hand before it sees the light of day. Improving text quality is tedious work, but its automation has not received much research attention. Anyone who has postedited a technical report or thesis written by a non-native speaker of English knows the potential of an automated postediting system. For the case of MT-generated text, we argue for the construction of postediting modules that are portable across MT systems, as an alternative to hardcoding improvements inside any one system. As an example, we have built a complete self-contained postediting module for the task of article selection (a, an, the) for English noun phrases. This is a notoriously difficult problem for Japanese-English MT. Our system contains over 200,000 rules derived automatically from online text resources. We report on l...

  20. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...
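
    A minimal sketch of combinatorial test case generation, one of the approaches mentioned above, is given below; the parameter domains are invented, and real tools would typically prune the cross product (e.g. to pairwise coverage) to fight combinatorial explosion.

        from itertools import product

        # Hypothetical input domains for a function under test.
        DOMAINS = {
            "browser":  ["firefox", "chrome"],
            "protocol": ["http", "https"],
            "locale":   ["en", "de", "fr"],
        }

        def generate_test_cases(domains):
            """Exhaustive combinatorial test cases: the cross product of all parameter values."""
            names = list(domains)
            for values in product(*(domains[n] for n in names)):
                yield dict(zip(names, values))

        for case in generate_test_cases(DOMAINS):
            print(case)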

  1. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  2. Automated digital magnetofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, J; Garcia, A A; Marquez, M [Harrington Department of Bioengineering Arizona State University, Tempe AZ 85287-9709 (United States)], E-mail: tony.garcia@asu.edu

    2008-08-15

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller, which has a RISC-based, clock-multiplying processor with DSP functions, accepts encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands in as little as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage, causing water drops to move by the action of induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet through digital magnetofluidics. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, some potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  3. The Automated Logistics Element Planning System (ALEPS)

    Science.gov (United States)

    Schwaab, Douglas G.

    1992-01-01

    ALEPS, which is being developed to provide the SSF program with a computer system to automate logistics resupply/return cargo load planning and verification, is presented. ALEPS will make it possible to simultaneously optimize both the resupply flight load plan and the return flight reload plan for any of the logistics carriers. In the verification mode ALEPS will support the carrier's flight readiness reviews and control proper execution of the approved plans. It will also support the SSF inventory management system by providing electronic block updates to the inventory database on the cargo arriving at or departing the station aboard a logistics carrier. A prototype drawer packing algorithm is described which is capable of generating solutions for 3D packing of cargo items into a logistics carrier storage accommodation. It is concluded that ALEPS will provide the capability to generate and modify optimized loading plans for the logistics elements fleet.
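
    As a rough illustration of automated load planning (not the actual ALEPS packing algorithm), the sketch below performs a first-fit-decreasing assignment of cargo items to carrier accommodations under volume and mass limits, with invented item and carrier names.

        # Illustrative greedy load-planning sketch:
        # first-fit decreasing by volume, respecting per-accommodation volume and mass limits.

        def plan_load(items, accommodations):
            """items: list of (name, volume, mass); accommodations: dict name -> (vol_cap, mass_cap)."""
            remaining = {k: list(caps) for k, caps in accommodations.items()}
            plan, unplaced = {k: [] for k in accommodations}, []
            for name, vol, mass in sorted(items, key=lambda it: it[1], reverse=True):
                for acc, (vol_left, mass_left) in remaining.items():
                    if vol <= vol_left and mass <= mass_left:
                        plan[acc].append(name)
                        remaining[acc][0] -= vol
                        remaining[acc][1] -= mass
                        break
                else:
                    unplaced.append(name)
            return plan, unplaced

        cargo = [("rack_A", 1.2, 300), ("bag_1", 0.3, 20), ("bag_2", 0.4, 35), ("drawer_7", 0.6, 60)]
        carriers = {"carrier_bay_1": (1.5, 350), "carrier_bay_2": (1.0, 150)}
        print(plan_load(cargo, carriers))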

  4. Emergency Centre Organization and Automated Triage System

    CERN Document Server

    Golding, Dan; Marwala, Tshilidzi

    2008-01-01

    The excessive rate of patients arriving at accident and emergency centres is a major problem facing South African hospitals. Patients are prioritized for medical care through a triage process. Manual systems allow for inconsistency and error. This paper proposes a novel system to automate accident and emergency centre triage and uses this triage score along with an artificial intelligence estimate of patient-doctor time to optimize the queue order. A fuzzy inference system is employed to triage patients and a similar system estimates the time but adapts continuously through fuzzy Q-learning. The optimal queue order is found using a novel procedure based on genetic algorithms. These components are integrated in a simple graphical user interface. Live tests could not be performed but simulations reveal that the average waiting time can be reduced by 48 minutes and priority is given to urgent patients
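
    A minimal sketch of the genetic-algorithm queue-ordering idea is shown below; the cost function, operators and patient data are illustrative assumptions rather than the paper's exact formulation.

        import random

        def queue_cost(order, urgency, est_time):
            """Weighted waiting time: urgent patients waiting long are penalised most."""
            cost, elapsed = 0.0, 0.0
            for p in order:
                elapsed += est_time[p]
                cost += urgency[p] * elapsed
            return cost

        def optimise_queue(patients, urgency, est_time, generations=200, pop_size=30):
            """Simple permutation GA with swap mutation (illustrative only)."""
            pop = [random.sample(patients, len(patients)) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda o: queue_cost(o, urgency, est_time))
                survivors = pop[: pop_size // 2]
                children = []
                for parent in survivors:
                    child = parent[:]
                    i, j = random.sample(range(len(child)), 2)   # swap mutation
                    child[i], child[j] = child[j], child[i]
                    children.append(child)
                pop = survivors + children
            return min(pop, key=lambda o: queue_cost(o, urgency, est_time))

        patients = ["p1", "p2", "p3", "p4"]
        urgency = {"p1": 5, "p2": 1, "p3": 3, "p4": 2}          # triage scores
        est_time = {"p1": 20, "p2": 10, "p3": 15, "p4": 5}      # predicted consult minutes
        print(optimise_queue(patients, urgency, est_time))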

  5. Understanding how animal groups achieve coordinated movement.

    Science.gov (United States)

    Herbert-Read, J E

    2016-10-01

    Moving animal groups display remarkable feats of coordination. This coordination is largely achieved when individuals adjust their movement in response to their neighbours' movements and positions. Recent advancements in automated tracking technologies, including computer vision and GPS, now allow researchers to gather large amounts of data on the movements and positions of individuals in groups. Furthermore, analytical techniques from fields such as statistical physics now allow us to identify the precise interaction rules used by animals on the move. These interaction rules differ not only between species, but also between individuals in the same group. These differences have wide-ranging implications, affecting how groups make collective decisions and driving the evolution of collective motion. Here, I describe how trajectory data can be used to infer how animals interact in moving groups. I give examples of the similarities and differences in the spatial and directional organisations of animal groups between species, and discuss the rules that animals use to achieve this organisation. I then explore how groups of the same species can exhibit different structures, and ask whether this results from individuals adapting their interaction rules. I then examine how the interaction rules between individuals in the same groups can also differ, and discuss how this can affect ecological and evolutionary processes. Finally, I suggest areas of future research.
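
    As an illustration of how trajectory data can be turned into interaction statistics, the sketch below computes, for each individual, the distance, relative position and heading difference of its nearest neighbour at one time step; aggregating such statistics over many frames is the raw material for inferring interaction rules. The data and function name are invented.

        import numpy as np

        def neighbour_statistics(positions, headings):
            """Nearest-neighbour distance, relative position and heading difference per individual.

            positions: (N, 2) array of x, y coordinates at one time step.
            headings:  (N,) array of movement directions in radians.
            """
            stats = []
            for i, (p, h) in enumerate(zip(positions, headings)):
                d = np.linalg.norm(positions - p, axis=1)
                d[i] = np.inf
                j = int(np.argmin(d))
                offset = positions[j] - p
                # Rotate the offset into the focal individual's frame of reference.
                c, s = np.cos(-h), np.sin(-h)
                rel = np.array([c*offset[0] - s*offset[1], s*offset[0] + c*offset[1]])
                align = (headings[j] - h + np.pi) % (2*np.pi) - np.pi
                stats.append((d[j], rel, align))
            return stats

        pos = np.array([[0.0, 0.0], [1.0, 0.5], [3.0, 3.0]])
        head = np.array([0.0, 0.2, 1.5])
        for dist, rel, align in neighbour_statistics(pos, head):
            print(round(dist, 2), np.round(rel, 2), round(align, 2))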

  6. Belief Propagation Methods for Intercell Interference Coordination

    CERN Document Server

    Rangan, Sundeep

    2010-01-01

    We consider a broad class of interference coordination and resource allocation problems for wireless links where the goal is to maximize the sum of functions of individual link rates. Such problems arise in the context of, for example, fractional frequency reuse (FFR) for macro-cellular networks and dynamic interference management in femtocells. The resulting optimization problems are typically hard to solve optimally even using centralized algorithms but are an essential computational step in implementing rate-fair and queue stabilizing scheduling policies in wireless networks. We consider a belief propagation framework to solve such problems approximately. In particular, we construct approximations to the belief propagation iterations to obtain computationally simple and distributed algorithms with low communication overhead. Notably, our methods are very general and apply to, for example, the optimization of transmit powers, transmit beamforming vectors, and sub-band allocation to maximize the above object...

  7. Hierarchical Model Predictive Control for Sustainable Building Automation

    Directory of Open Access Journals (Sweden)

    Barbara Mayer

    2017-02-01

    Full Text Available A hierarchical model predictive controller (HMPC) is proposed for flexible and sustainable building automation. The implications of a building automation system for sustainability are defined, and model predictive control is introduced as an ideal tool to cover all requirements. The HMPC is presented as a development suitable for the optimization of modern buildings, as well as retrofitting. The performance and flexibility of the HMPC are demonstrated by simulation studies of a modern office building, and the interaction with future smart grids is shown.
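
    A single-zone model predictive control sketch (assuming the cvxpy package, with an invented first-order thermal model and cost weights) illustrates the kind of optimization a building-level MPC layer solves.

        import cvxpy as cp
        import numpy as np

        # Minimal single-zone MPC sketch (illustrative values, not the paper's building model):
        # discrete-time thermal model  T[k+1] = a*T[k] + b*u[k] + c*T_out[k]
        horizon, a, b, c = 12, 0.9, 0.3, 0.1
        T_out = np.full(horizon, 5.0)          # outdoor temperature forecast (degC)
        T_set, T0 = 21.0, 18.0                 # comfort set-point and initial zone temperature

        T = cp.Variable(horizon + 1)           # zone temperature trajectory
        u = cp.Variable(horizon)               # heating power (kW), the decision variable

        constraints = [T[0] == T0, u >= 0, u <= 5]
        for k in range(horizon):
            constraints.append(T[k + 1] == a * T[k] + b * u[k] + c * T_out[k])

        # Trade off comfort (tracking the set-point) against energy use.
        objective = cp.Minimize(cp.sum_squares(T[1:] - T_set) + 0.1 * cp.sum(u))
        cp.Problem(objective, constraints).solve()

        print("first heating command (kW):", round(float(u.value[0]), 2))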

  8. 23rd International Conference on Flexible Automation & Intelligent Manufacturing

    CERN Document Server

    2013-01-01

    The proceedings includes the set of revised papers from the 23rd International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2013). This conference aims to provide an international forum for the exchange of leading edge scientific knowledge and industrial experience regarding the development and integration of the various aspects of Flexible Automation and Intelligent Manufacturing Systems covering the complete life-cycle of a company’s Products and Processes. Contents will include topics such as: Product, Process and Factory Integrated Design, Manufacturing Technology and Intelligent Systems, Manufacturing Operations Management and Optimization and Manufacturing Networks and MicroFactories.

  9. Knowledge Automation How to Implement Decision Management in Business Processes

    CERN Document Server

    Fish, Alan N

    2012-01-01

    A proven decision management methodology for increased profits and lowered risks Knowledge Automation: How to Implement Decision Management in Business Processes describes a simple but comprehensive methodology for decision management projects, which use business rules and predictive analytics to optimize and automate small, high-volume business decisions. It includes Decision Requirements Analysis (DRA), a new method for taking the crucial first step in any IT project to implement decision management: defining a set of business decisions and identifying all the information-business knowledge

  10. A New Approach to an Automated Air Traffic Control

    Institute of Scientific and Technical Information of China (English)

    Patchev Dragoljub

    2014-01-01

    This paper identifies areas of improvement in the air traffic control system and proposes a modification of the concept of automation using available technologies. With the proposed modification, the current Europe-wide en-route network structure can be modified in order to make routes more optimal. For this new route network structure, a new concept of automation will be used to manage the air traffic. The first identified area of improvement is the implementation of an automation process that enables decentralization of air traffic control functionality to each individual aircraft; this is achieved through automated routing of the aircraft and CD&R (conflict detection and resolution). The FMS (flight management system) on the aircraft will make decisions on the optimal flight route based on sensor inputs and information on the selection of routes, next hop points and flight levels, all received via ADS-B (automatic dependent surveillance-broadcast). The second area is processing the information about deviations from the optimal route in the flight plan due to traffic management (vectoring, level changes) and taking it into consideration when further actions are undertaken. For each action, a cost factor will be calculated from the fuel burned for that action. This factor will be used to select the conflict resolution protocol. The proposed concept should increase the capacity of the network and make air traffic more efficient and more environmentally friendly while maintaining safe separation.

  11. Coordination Processes in International Organisations

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    2008-01-01

    to coordinate relatively elaborate agreements due to the strength of its coordination as far as professional or technical and political activities (excepting the ILO budget) are concerned. In other more clear-cut or 'simple' policy areas such as the ILO budget, the EU coordination is weak: this contrast...

  12. Quantitative determination of wine highly volatile sulfur compounds by using automated headspace solid-phase microextraction and gas chromatography-pulsed flame photometric detection. Critical study and optimization of a new procedure.

    Science.gov (United States)

    López, Ricardo; Lapeña, Ana Cristina; Cacho, Juan; Ferreira, Vicente

    2007-03-02

    The quantitative determination of wine volatile sulfur compounds by automated headspace solid-phase microextraction (HS-SPME) with a carboxen-polydimethylsiloxane (CAR-PDMS) fiber and subsequent gas chromatography-pulsed flame photometric detection (GC-PFPD) has been evaluated. The direct extraction of the sulfur compounds in 5 ml of wine has been found to suffer from matrix effects and short linear ranges, problems which could not be solved by the use of different internal standards or by multiple headspace SPME. These problems were attributed to saturation of the fiber and to competitive effects between analytes, internal standards and other wine volatiles. Another problem was the oxidation of analytes during the procedure. The reduction in sample volume by a factor of 50 (0.1 ml diluted with water or brine) brought about a reduction in the amount of sulfur compounds taken up by the fiber by a factor of just 3.3. Consequently, a new procedure has been proposed. In a sealed vial containing 4.9 ml of saturated NaCl brine, the air is thoroughly displaced with nitrogen, and the wine (0.1 ml) and the internal standards (0.02 ml) are then introduced with a syringe through the vial septum. This sample is extracted at 35 degrees C for 20 min. This procedure makes possible a satisfactory determination of hydrogen sulfide, methanethiol, ethanethiol, dimethyl sulfide, diethyl sulfide and dimethyl disulfide. The linear dynamic ranges cover the normal ranges of occurrence of these analytes in wine, with typical r² between 0.9823 and 0.9980. Reproducibility in real samples ranges from 10 to 20% and repeatability is better than 10% in most cases. The method accuracy is satisfactory, with errors below 20% for hydrogen sulfide and mostly below 10% for the other compounds. The proposed method has been applied to the analysis of 34 Spanish wines.

  13. Get smart! automate your house!

    NARCIS (Netherlands)

    Van Amstel, P.; Gorter, N.; De Rouw, J.

    2016-01-01

    This "designers' manual" is made during the TIDO-course AR0531 Innovation and Sustainability This manual will help you in reducing both energy usage and costs by automating your home. It gives an introduction to a number of home automation systems that every homeowner can install.

  14. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  15. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. Performance analyses are then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
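
    A minimal sketch of the feature-plus-classifier approach summarized above, assuming two toy features (query rate and click-through rate) and a tiny hand-made data set; the paper's actual feature set, classifiers and data are far richer.

```python
# Toy sketch of per-session features plus a binary human/bot classifier.
# The two features and the four labelled sessions are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def session_features(queries_per_minute: float, click_rate: float) -> list[float]:
    # "Physical model" feature: humans rarely exceed a few queries per minute.
    # "Behavioral" feature: bots often show near-zero or abnormal click-through.
    return [queries_per_minute, click_rate]

X = np.array([session_features(1.5, 0.6),    # human-like
              session_features(2.0, 0.4),    # human-like
              session_features(40.0, 0.0),   # bot-like
              session_features(25.0, 0.02)]) # bot-like
y = np.array([0, 0, 1, 1])                   # 0 = human, 1 = automated

clf = LogisticRegression().fit(X, y)
print(clf.predict([session_features(30.0, 0.01)]))  # -> [1] for this toy model
```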

  16. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  17. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  18. Work Coordination Engine

    Science.gov (United States)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing to the SMDB persistent store. The software leverages the use of a relational database, Java Messaging System (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
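
    The following is an illustrative sketch, not the actual WCE/JMS implementation, of the dispatch/monitor/reschedule cycle described above, using an in-memory queue in place of SMDB and plain callables in place of remote worker applications.

```python
# Illustrative dispatcher: de-queue work orders, dispatch to registered workers,
# track state, and reschedule to another worker when one fails.
from collections import deque

class Dispatcher:
    def __init__(self, workers):
        self.workers = workers            # name -> callable(payload) -> result
        self.queue = deque()              # pending work orders
        self.state = {}                   # order id -> PENDING/DISPATCHED/DONE/FAILED

    def submit(self, order_id, payload):
        self.queue.append((order_id, payload))
        self.state[order_id] = "PENDING"

    def run(self):
        while self.queue:
            order_id, payload = self.queue.popleft()
            for name, worker in self.workers.items():
                self.state[order_id] = "DISPATCHED"
                try:
                    result = worker(payload)          # monitor execution
                    self.state[order_id] = "DONE"
                    print(order_id, "done by", name, "->", result)
                    break
                except Exception:
                    continue                          # try the next registered worker
            else:
                self.state[order_id] = "FAILED"       # no worker could run it

def flaky(x):  raise RuntimeError("worker offline")
def square(x): return x * x

d = Dispatcher({"serverA": flaky, "serverB": square})
d.submit("WO-1", 7)
d.run()
print(d.state)   # {'WO-1': 'DONE'}
```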

  19. Research on Area Coordination and Optimization of a Large Amusement Park Using Plant Simulation

    Institute of Scientific and Technical Information of China (English)

    曹欢欢; 许志沛

    2016-01-01

    In order to maximize customer satisfaction and improve the operators' economic benefit, the layout plans and mutual coordination of the typical amusement equipment and visitor routes in a large amusement park were analyzed and optimized. The visitor routes and the operation of the amusement equipment were simulated under different customer proportions, different festivals and different equipment configurations using Plant Simulation. The advantages and disadvantages of the coordination among functional areas and between amusement facilities were analyzed, ways of optimizing the equipment layout were explored, and reasonable suggestions were proposed as a reference for the planning and operational adjustment of similar parks.

  20. Advice for Coordination

    DEFF Research Database (Denmark)

    Hankin, Chris; Nielson, Flemming; Nielson, Hanne Riis

    2008-01-01

    We show how to extend a coordination language with support for aspect oriented programming. The main challenge is how to properly deal with the trapping of actions before the actual data have been bound to the formal parameters. This necessitates dealing with open joinpoints, which is more demanding than the closed joinpoints in more traditional aspect oriented languages like AspectJ. The usefulness of our approach is demonstrated by mechanisms for discretionary and mandatory access control policies, as usually expressed by reference monitors, as well as mechanisms for logging actions.

  1. Markov stochasticity coordinates

    Science.gov (United States)

    Eliazar, Iddo

    2017-01-01

    Markov dynamics constitute one of the most fundamental models of random motion between the states of a system of interest. Markov dynamics have diverse applications in many fields of science and engineering, and are particularly applicable in the context of random motion in networks. In this paper we present a two-dimensional gauging method of the randomness of Markov dynamics. The method, termed Markov Stochasticity Coordinates, is established, discussed, and exemplified. Also, the method is tweaked to quantify the stochasticity of the first-passage times of Markov dynamics, and the socioeconomic equality and mobility in human societies.

  2. Coordination Hydrothermal Interconnection Java-Bali Using Simulated Annealing

    Science.gov (United States)

    Wicaksono, B.; Abdullah, A. G.; Saputra, W. S.

    2016-04-01

    Hydrothermal power plant coordination aims to minimize the total operating cost of the system, represented by fuel cost, subject to constraints during optimization. Several methods can be used to perform the optimization. Simulated Annealing (SA) is one method that can be used to solve such optimization problems; it was inspired by the annealing, or cooling, process in the manufacture of crystalline materials. The basic principle of hydrothermal coordination is to use hydro power plants to cover the base load while thermal power plants cover the remaining load. This study used two hydro power plant units and six thermal power plant units with 25 buses, calculating transmission losses and considering the power limits of each unit, aided by MATLAB software. Hydrothermal coordination using simulated annealing showed a total generation cost of 13,288,508.01 for 24 hours.
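
    A minimal simulated-annealing sketch of the dispatch idea above: hydro covers a fixed base load and SA searches the split of the remaining load across two thermal units with quadratic fuel-cost curves. All coefficients, limits and loads are illustrative assumptions, not the study's data.

```python
# Simulated annealing over the thermal split of the residual (non-hydro) load.
import math, random

DEMAND, HYDRO_BASE = 500.0, 180.0          # MW, hypothetical
THERMAL = [                                 # (a, b, c, Pmin, Pmax): cost = a + b*P + c*P^2
    (100.0, 20.0, 0.050, 50.0, 200.0),
    ( 80.0, 22.0, 0.045, 40.0, 180.0),
]
RESIDUAL = DEMAND - HYDRO_BASE              # load the thermal units must cover

def cost(p1):                               # p2 is fixed by the power balance
    p2 = RESIDUAL - p1
    total = 0.0
    for (a, b, c, lo, hi), p in zip(THERMAL, (p1, p2)):
        if not lo <= p <= hi:
            return float("inf")             # infeasible split
        total += a + b * p + c * p * p
    return total

p1, temp = 160.0, 100.0
best = (p1, cost(p1))
for _ in range(5000):
    cand = p1 + random.uniform(-10, 10)
    delta = cost(cand) - cost(p1)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        p1 = cand                           # accept better (or occasionally worse) splits
    if cost(p1) < best[1]:
        best = (p1, cost(p1))
    temp *= 0.999                           # cooling schedule
print(f"thermal split: {best[0]:.1f} MW / {RESIDUAL - best[0]:.1f} MW, cost {best[1]:.1f}")
```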

  3. Bus coordination holding control for transit hubs under APTS

    Institute of Scientific and Technical Information of China (English)

    TENG Jing; ZHAO Ming

    2009-01-01

    To increase passenger transfer efficiency, bus coordination holding control for transit hubs, which is an important dynamic dispatching method for improving the service level of transit hubs, was studied within the framework of the bus coordination dispatching mode. Firstly, the bus coordination holding control flow was studied based on the Advanced Public Transportation Systems (APTS) environment. Then a control model was presented to optimize the bus vehicle holding time, and a genetic algorithm was designed as the solution method. Finally, an example was given to illustrate the effectiveness of the control strategy and the algorithm.
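
    A toy sketch of the holding-time trade-off behind the control model above: holding an outbound bus delays its onboard passengers but lets transferring passengers from a late feeder bus make the connection. The passenger counts, headway and feeder delay are assumptions; the paper solves a richer model with a genetic algorithm.

```python
# Brute-force search for the holding time that minimizes total passenger waiting.
ONBOARD, TRANSFER = 35, 20          # passengers already on board / arriving late
FEEDER_DELAY, HEADWAY = 4.0, 12.0   # minutes

def total_wait(hold):
    wait = ONBOARD * hold                          # everyone on board waits `hold`
    if hold >= FEEDER_DELAY:                       # transfer is made
        wait += TRANSFER * (hold - FEEDER_DELAY)
    else:                                          # transfer missed: wait a full headway
        wait += TRANSFER * HEADWAY
    return wait

best_hold = min((h * 0.5 for h in range(0, 21)), key=total_wait)
print(f"hold {best_hold:.1f} min, total wait {total_wait(best_hold):.0f} pax-min")
```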

  4. Identifiability of linear systems in physical coordinates

    Science.gov (United States)

    Su, Tzu-Jeng; Juang, Jer-Nan

    1992-01-01

    Identifiability of linear, time-invariant systems in physical coordinates is discussed. It is shown that identification of the system matrix in physical coordinates can be accomplished by determining a transformation matrix that relates the physical locations of actuators and sensors to the test-data-derived input and output matrices. For systems with symmetric matrices, the solution of a constrained optimization problem is used to characterize all the possible solutions of the transformation matrix. Conditions for the existence of a unique transformation matrix are established easily from the explicit form of the solutions. For systems with limited inputs and outputs, the question about which part of the system can be uniquely identified is also answered. A simple mass-spring system is used to verify the conclusions of this study.

  5. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  6. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  7. [From automation to robotics].

    Science.gov (United States)

    1985-01-01

    The introduction of automation into the biology laboratory seems to be unavoidable. But at what cost, if it is necessary to purchase a new machine for every new application? Fortunately, the same image processing techniques, belonging to a theoretical framework called Mathematical Morphology, may be used in visual inspection tasks both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices drops, and sometimes becomes less than the price of complete microscope equipment. The power of the image processing methods of Mathematical Morphology will be illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, and automatic screening of cervical smears. Thus several heterogeneous applications may share the same image processing device, provided there is a separate and dedicated workstation for each of them.
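
    A hedged sketch of the kind of morphological pipeline mentioned above (automatic silver-grain counting): threshold, clean with a morphological opening, then label and count connected components. The synthetic image and threshold are illustrative assumptions.

```python
# Count bright "grains" in a synthetic image via thresholding, morphological
# opening (to remove specks) and connected-component labelling.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.normal(0.1, 0.05, (128, 128))          # background noise
for y, x in rng.integers(10, 118, size=(12, 2)):   # 12 synthetic "grains"
    image[y - 2:y + 3, x - 2:x + 3] += 1.0

binary = image > 0.5                               # threshold
cleaned = ndimage.binary_opening(binary, structure=np.ones((3, 3)))  # remove specks
labels, n_grains = ndimage.label(cleaned)
print("grains counted:", n_grains)                 # ~12 (fewer if grains overlap)
```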

  8. Automated electronic filter design

    CERN Document Server

    Banerjee, Amal

    2017-01-01

    This book describes a novel, efficient and powerful scheme for designing and evaluating the performance characteristics of any electronic filter designed with predefined specifications. The author explains techniques that enable readers to eliminate complicated manual, and thus error-prone and time-consuming, steps of traditional design techniques. The presentation includes demonstration of efficient automation, using an ANSI C language program, which accepts any filter design specification (e.g. Chebyshev low-pass filter, cut-off frequency, pass-band ripple, etc.) as input and generates as output a SPICE (Simulation Program with Integrated Circuit Emphasis) format netlist. Readers can then use this netlist to run simulations with any version of the popular SPICE simulator, increasing accuracy of the final results, without violating any of the key principles of the traditional design scheme.

  9. Exploring the Lived Experiences of Program Managers Regarding an Automated Logistics Environment

    Science.gov (United States)

    Allen, Ronald Timothy

    2014-01-01

    Automated Logistics Environment (ALE) is a new term used by Navy and aerospace industry executives to describe the aggregate of logistics-related information systems that support modern aircraft weapon systems. The development of logistics information systems is not always well coordinated among programs, often resulting in solutions that cannot…

  10. Quantitative analysis of spider locomotion employing computer-automated video tracking

    DEFF Research Database (Denmark)

    Baatrup, E; Bayley, M

    1993-01-01

    The locomotor activity of adult specimens of the wolf spider Pardosa amentata was measured in an open-field setup, using computer-automated colour object video tracking. The x,y coordinates of the animal in the digitized image of the test arena were recorded three times per second during four con...

  11. Automated Essay Scoring

    Directory of Open Access Journals (Sweden)

    Semire DIKLI

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali, 2004). AES is defined as the computer technology that evaluates and scores written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). Revision and feedback are essential aspects of the writing process. Students need to receive feedback in order to increase their writing quality. However, responding to student papers can be a burden for teachers. Particularly if they have a large number of students and assign frequent writing assignments, providing individual feedback on student essays can be quite time consuming. AES systems can be very useful because they can provide the student with a score as well as feedback within seconds (Page, 2003). Four types of AES systems are widely used by testing companies, universities, and public schools: Project Essay Grader (PEG), Intelligent Essay Assessor (IEA), E-rater, and IntelliMetric. AES is a developing technology. Many AES systems are used to overcome time, cost, and generalizability issues in writing assessment. The accuracy and reliability of these systems have been proven to be high. The search for excellence in machine scoring of essays is continuing and numerous studies are being conducted to improve the effectiveness of AES systems.

  12. Coordinating complex decision support activities across distributed applications

    Science.gov (United States)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  13. Effect of optimized postural training on coordination compliance and postoperative complication risk in patients scheduled for thyroid surgery

    Institute of Scientific and Technical Information of China (English)

    邵敏; 段晓侠

    2016-01-01

    Objective: To investigate the effects of optimized postural training on coordination compliance and the risk of postoperative complications in patients scheduled for thyroid surgery. Methods: Eighty patients scheduled for thyroid surgery were randomly divided into a control group and an observation group (40 cases each). The control group received conventional postural training and the observation group received optimized postural training. Coordination compliance, health knowledge, postoperative SF-36 scores, postoperative complications and nursing satisfaction were compared between the two groups. Results: Coordination compliance, health knowledge, postoperative SF-36 scores and nursing satisfaction in the observation group were higher than those in the control group (P<0.05 to P<0.01). The incidences of postoperative headache, nausea and vomiting, and lower back pain in the observation group were lower than those in the control group (P<0.05 to P<0.01). Conclusions: Optimized postural training in the nursing of patients scheduled for thyroid surgery can effectively improve coordination compliance, health knowledge, quality of life and the nurse-patient relationship, and reduce the risk of postoperative complications; its clinical effects are better than those of conventional postural training.

  14. Distribution Loss Reduction by Household Consumption Coordination in Smart Grids

    DEFF Research Database (Denmark)

    Juelsgaard, Morten; Andersen, Palle; Wisniewski, Rafal

    2014-01-01

    In this work, we address the problem of optimizing the electrical consumption patterns for a community of closely located households, with a large degree of flexible consumption, and further some degree of local electricity production from solar panels. We describe optimization methods for coordinating consumption of electrical energy within the community, with the purpose of reducing grid loading and active power losses. For this we present a simplified model of the electrical grid, including system losses and capacity constraints. Coordination is performed in a distributed fashion, where each...

  15. Automated Tuning of the Advanced Photon Source Booster Synchrotron

    Science.gov (United States)

    Biedron, S. G.; Carwardine, J. A.; Milton, S. V.

    1997-05-01

    The acceleration cycle of the Advanced Photon Source (APS) booster synchrotron is completed within 250 ms and is repeated at 2 Hz. Unless properly corrected, transverse and longitudinal injection errors can lead to inefficient booster performance. Ramped-magnet tracking errors can also lead to losses during the acceleration cycle. In order to simplify daily operation, automated tuning methods have been developed. Through the use of empirically determined response functions, transfer line corrector magnets, and beam position monitor readings, the injection process is optimized by correcting the first turn trajectory to the measured closed orbit. An automated version of this correction technique has been implemented using the feedback-based program sddscontrollaw. Further automation is used to adjust and minimize tracking errors between the five main ramped power supplies. These tuning algorithms and their implementation are described here along with an evaluation of their performance.
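
    An illustrative sketch of response-matrix-based trajectory correction of the kind described above: corrector changes are chosen by least squares so that the predicted readings cancel the difference between the first-turn trajectory and the measured closed orbit. The response matrix and readings below are random stand-ins, not APS data.

```python
# Least-squares trajectory correction using an empirically measured response matrix.
import numpy as np

rng = np.random.default_rng(1)
n_bpm, n_corr = 40, 10
R = rng.normal(size=(n_bpm, n_corr))              # response matrix [mm per corrector unit]
first_turn = rng.normal(scale=2.0, size=n_bpm)    # measured first-turn trajectory [mm]
closed_orbit = rng.normal(scale=0.2, size=n_bpm)  # target: measured closed orbit [mm]

error = first_turn - closed_orbit
dI, *_ = np.linalg.lstsq(R, -error, rcond=None)   # corrector changes that cancel the error
residual = error + R @ dI
print("rms error before/after:", np.std(error).round(3), np.std(residual).round(3))
```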

  16. Optimal Reorientation Of Spacecraft Orbit

    Directory of Open Access Journals (Sweden)

    Chelnokov Yuriy Nikolaevich

    2014-06-01

    The problem of optimal reorientation of the spacecraft orbit is considered. For solving the problem we used quaternion equations of motion written in a rotating coordinate system. The use of quaternion variables makes this consideration more efficient. The problem of optimal control is solved on the basis of the maximum principle. An example of a numerical solution of the problem is given.
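
    A sketch, in standard notation, of the quaternion kinematics and a typical performance index used in formulations of this kind; the paper's exact equations and functional may differ.

```latex
% Quaternion kinematics of the orbital-frame orientation and a typical
% minimum-control-effort functional (a sketch, not the paper's exact model):
\[
  2\,\dot{\boldsymbol{\lambda}} = \boldsymbol{\lambda}\circ\boldsymbol{\omega},
  \qquad \boldsymbol{\lambda}(0)=\boldsymbol{\lambda}_0,\quad
         \boldsymbol{\lambda}(T)=\boldsymbol{\lambda}_T,
\]
\[
  J=\int_0^T \lVert\mathbf{u}\rVert^{2}\,dt \;\longrightarrow\; \min,
\]
% where $\boldsymbol{\lambda}$ is the orientation quaternion, $\circ$ denotes
% quaternion multiplication, $\boldsymbol{\omega}$ is the angular velocity of the
% orbital frame driven by the control $\mathbf{u}$, and the optimal control is
% characterized via Pontryagin's maximum principle.
```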

  17. OA Research on Coordination Methods of Course Teaching and Practical Teaching Based on Critical Chain Optimization

    Institute of Scientific and Technical Information of China (English)

    朱卫未; 黄阳; 李彦东

    2012-01-01

    As the two major modules in the higher education system, course teaching and practical teaching are often in contradiction with each other along the dimensions of time, logic and cognitive coordination, which lowers the quality of higher education. Starting from these three dimensions, this paper first establishes a universal teaching performance evaluation approach; it then puts forward concrete teaching optimization procedures based on the critical chain method of multi-project management. Finally, illustrated by the case of a Business Administration major, the effect of the teaching optimization is verified using the aforementioned performance evaluation approach.

  18. Optimization of Spot Power Price in Coordinated Fast Charging Model of Electric Vehicles

    Institute of Scientific and Technical Information of China (English)

    唐小波; 赵彩虹; 吴薛红; 张娟

    2013-01-01

    Aiming at the coordinated charging scheduling problem of electric vehicles in smart grids, a spot power pricing mechanism is proposed that uses price as a lever to regulate the electric vehicle fast-charging load. A desirability model is introduced and a mathematical optimization model is formulated with the minimization of the grid's peak-valley load difference as the objective function and the desirability of charging stations and users as constraints, thereby accounting for the peak-shaving effect on the grid, the benefit of the charging stations and the demands of customers. A genetic algorithm is used to solve the optimization model. Finally, a simulation based on a region's predicted data for the year 2020 shows that the proposed spot pricing mechanism can effectively lower the peak-valley difference, protect the charging stations' benefits and meet customers' charging demands, achieving a win-win result for the grid, the charging stations and the users.
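
    A minimal genetic-algorithm sketch of the scheduling idea above: choose the start hour of each vehicle's fast-charging session so that the peak-valley difference of the total load curve is minimized. The base load, fleet size and charging power are illustrative assumptions, not the paper's 2020 forecast data.

```python
# Genetic algorithm minimizing the peak-valley difference of the aggregate load.
import random

HOURS, N_EV, CHARGE_KW, CHARGE_H = 24, 200, 60, 2
BASE = [300 + 200 * (8 <= h <= 21) for h in range(HOURS)]   # kW, toy base load

def peak_valley(starts):
    load = BASE[:]
    for s in starts:
        for k in range(CHARGE_H):
            load[(s + k) % HOURS] += CHARGE_KW
    return max(load) - min(load)

def mutate(ind):
    child = ind[:]
    child[random.randrange(N_EV)] = random.randrange(HOURS)  # move one EV's start hour
    return child

pop = [[random.randrange(HOURS) for _ in range(N_EV)] for _ in range(30)]
for _ in range(200):                          # generations
    pop.sort(key=peak_valley)
    parents = pop[:10]                        # elitist selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(20)]
print("peak-valley difference:", peak_valley(min(pop, key=peak_valley)))
```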

  19. [Coordination among healthcare levels: systematization of tools and measures].

    Science.gov (United States)

    Terraza Núñez, Rebeca; Vargas Lorenzo, Ingrid; Vázquez Navarrete, María Luisa

    2006-01-01

    Improving healthcare coordination is a priority in many healthcare systems, particularly in chronic health problems in which a number of professionals and services intervene. There is an abundance of coordination strategies and mechanisms that should be systematized so that they can be used in the most appropriate context. The present article aims to analyse healthcare coordination and its instruments using the organisational theory. Coordination mechanisms can be classified according to two basic processes used to coordinate activities: programming and feedback. The optimal combination of mechanisms will depend on three factors: the degree to which healthcare activities are differentiated, the volume and type of interdependencies, and the level of uncertainty. Historically, healthcare services have based coordination on skills standardization and, most recently, on processes standardization, through clinical guidelines, maps, and plans. Their utilisation is unsatisfactory in chronic diseases involving intervention by several professionals with reciprocal interdependencies, variability in patients' response to medical interventions, and a large volume of information to be processed. In this case, mechanisms based on feedback, such as working groups, linking professionals and vertical information systems, are more effective. To date, evaluation of healthcare coordination has not been conducted systematically, using structure, process and results indicators. The different strategies and instruments have been applied mainly to long-term care and mental health and one of the challenges to healthcare coordination is to extend and evaluate their use throughout the healthcare continuum.

  20. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights:
    • An extensible automated dispersive liquid–liquid microextraction was developed.
    • A fully automatic SPE workstation with a modified operation program was used.
    • Ionic liquid-based in situ DLLME was used as the model method.
    • SPE columns packed with nonwoven polypropylene fiber were used for phase separation.
    • The approach was applied to the determination of benzoylurea insecticides in water.
    Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C₈MIM]NTf₂) is formed through the reaction between [C₈MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf₂) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL⁻¹. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL⁻¹. The proposed

  1. A fully-automated software pipeline for integrating breast density and parenchymal texture analysis for digital mammograms: parameter optimization in a case-control breast cancer risk assessment study

    Science.gov (United States)

    Zheng, Yuanjie; Wang, Yan; Keller, Brad M.; Conant, Emily; Gee, James C.; Kontos, Despina

    2013-02-01

    Estimating a woman's risk of breast cancer is becoming increasingly important in clinical practice. Mammographic density, estimated as the percent of dense (PD) tissue area within the breast, has been shown to be a strong risk factor. Studies also support a relationship between mammographic texture and breast cancer risk. We have developed a fully-automated software pipeline for computerized analysis of digital mammography parenchymal patterns by quantitatively measuring both breast density and texture properties. Our pipeline combines advanced computer algorithms of pattern recognition, computer vision, and machine learning and offers a standardized tool for breast cancer risk assessment studies. Different from many existing methods performing parenchymal texture analysis within specific breast subregions, our pipeline extracts texture descriptors for points on a spatial regular lattice and from a surrounding window of each lattice point, to characterize the local mammographic appearance throughout the whole breast. To demonstrate the utility of our pipeline, and optimize its parameters, we perform a case-control study by retrospectively analyzing a total of 472 digital mammography studies. Specifically, we investigate the window size, which is a lattice-related parameter, and compare the performance of texture features to that of breast PD in classifying case-control status. Our results suggest that different window sizes may be optimal for raw (12.7 mm²) versus vendor post-processed images (6.3 mm²). We also show that the combination of PD and texture features outperforms PD alone. The improvement is significant (p=0.03) when raw images and a window size of 12.7 mm² are used, having an ROC AUC of 0.66. The combination of PD and our texture features computed from post-processed images with a window size of 6.3 mm² achieves an ROC AUC of 0.75.
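
    A sketch of the lattice-based texture sampling described above: descriptors are computed in a square window around every point of a regular grid covering the image, rather than in a single hand-picked region. The image, lattice spacing, window size and the three simple descriptors are illustrative assumptions, not the pipeline's actual feature set.

```python
# Sample simple texture descriptors on a regular lattice of window centres.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((512, 512))            # stand-in for a mammogram
SPACING, HALF_WIN = 32, 16                # lattice step and half window size (pixels)

features = []
for r in range(HALF_WIN, image.shape[0] - HALF_WIN, SPACING):
    for c in range(HALF_WIN, image.shape[1] - HALF_WIN, SPACING):
        win = image[r - HALF_WIN:r + HALF_WIN, c - HALF_WIN:c + HALF_WIN]
        gy, gx = np.gradient(win)
        features.append([win.mean(), win.std(), np.mean(gx**2 + gy**2)])

features = np.asarray(features)           # one row of descriptors per lattice point
print(features.shape)                     # (n_lattice_points, 3)
```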

  2. A cost-effective intelligent robotic system with dual-arm dexterous coordination and real-time vision

    Science.gov (United States)

    Marzwell, Neville I.; Chen, Alexander Y. K.

    1991-01-01

    Dexterous coordination of manipulators based on the use of redundant degrees of freedom, multiple sensors, and built-in robot intelligence represents a critical breakthrough in development of advanced manufacturing technology. A cost-effective approach for achieving this new generation of robotics has been made possible by the unprecedented growth of the latest microcomputer and network systems. The resulting flexible automation offers the opportunity to improve the product quality, increase the reliability of the manufacturing process, and augment the production procedures for optimizing the utilization of the robotic system. Moreover, the Advanced Robotic System (ARS) is modular in design and can be upgraded by closely following technological advancements as they occur in various fields. This approach to manufacturing automation enhances the financial justification and ensures the long-term profitability and most efficient implementation of robotic technology. The new system also addresses a broad spectrum of manufacturing demand and has the potential to address both complex jobs as well as highly labor-intensive tasks. The ARS prototype employs the decomposed optimization technique in spatial planning. This technique is implemented within the framework of the sensor-actuator network to establish the general-purpose geometric reasoning system. The development computer system is a multiple microcomputer network system, which provides the architecture for executing the modular network computing algorithms. The knowledge-based approach used in both the robot vision subsystem and the manipulation control subsystems results in the real-time, image-processing, vision-based capability. The vision-based task environment analysis capability and the responsive motion capability are under the command of the local intelligence centers. An array of ultrasonic, proximity, and optoelectronic sensors is used for path planning. The ARS currently has 18 degrees of freedom made up by two

  3. Coordination using Implicit Communication

    CERN Document Server

    Cuff, Paul

    2011-01-01

    We explore a basic noise-free signaling scenario where coordination and communication are naturally merged. A random signal X_1,...,X_n is processed to produce a control signal or action sequence A_1,...,A_n, which is observed and further processed (without access to X_1,...,X_n) to produce a third sequence B_1,...,B_n. The object of interest is the set of empirical joint distributions p(x,a,b) that can be achieved in this setting. We show that H(A) >= I(X;A,B) is the necessary and sufficient condition for achieving p(x,a,b) when no causality constraints are enforced on the encoders. We also give results for various causality constraints. This setting sheds light on the embedding of digital information in analog signals, a concept that is exploited in digital watermarking, steganography, cooperative communication, and strategic play in team games such as bridge.

  4. Coordinating Group report

    Energy Technology Data Exchange (ETDEWEB)

    1994-01-01

    In December 1992, western governors and four federal agencies established a Federal Advisory Committee to Develop On-site Innovative Technologies for Environmental Restoration and Waste Management (the DOIT Committee). The purpose of the Committee is to advise the federal government on ways to improve waste cleanup technology development and the cleanup of federal sites in the West. The Committee directed in January 1993 that information be collected from a wide range of potential stakeholders and that innovative technology candidate projects be identified, organized, set in motion, and evaluated to test new partnerships, regulatory approaches, and technologies which will lead to improved site cleanup. Five working groups were organized, one to develop broad project selection and evaluation criteria and four to focus on specific contaminant problems. A Coordinating Group, comprised of working group spokesmen and federal and state representatives, was set up to plan and organize the routine functioning of these working groups. The working groups were charged with defining particular contaminant problems; identifying shortcomings in technology development, stakeholder involvement, regulatory review, and commercialization which impede the resolution of these problems; and identifying candidate sites or technologies which could serve as regional innovative demonstration projects to test new approaches to overcome the shortcomings. This report from the Coordinating Group to the DOIT Committee highlights the key findings and opportunities uncovered by these fact-finding working groups. It provides a basis from which recommendations from the DOIT Committee to the federal government can be made. It also includes observations from two public roundtables, one on commercialization and another on regulatory and institutional barriers impeding technology development and cleanup.

  5. Automation and control of the MMT thermal system

    Science.gov (United States)

    Gibson, J. D.; Porter, Dallan; Goble, William

    2016-07-01

    This study investigates the software automation and control framework for the MMT thermal system. Thermal-related effects on observing and telescope behavior have been considered during the entire software development process. Regression analysis of telescope and observatory subsystem data is used to characterize and model these thermal-related effects. The regression models help predict expected changes in focus and overall astronomical seeing that result from temperature variations within the telescope structure, within the primary mirror glass, and between the primary mirror glass and adjacent air (i.e., mirror seeing). This discussion is followed by a description of ongoing upgrades to the heating, ventilation and air conditioning (HVAC) system and the associated software controls. The improvements of the MMT thermal system have two objectives: 1) to provide air conditioning capabilities for the MMT facilities, and 2) to modernize and enhance the primary mirror (M1) ventilation system. The HVAC upgrade necessitates changes to the automation and control of the M1 ventilation system. The revised control system must factor in the additional requirements of the HVAC system, while still optimizing performance of the M1 ventilation system and the M1's optical behavior. An industry-standard HVAC communication and networking protocol, BACnet (Building Automation and Control network), has been adopted. Integration of the BACnet protocol into the existing software framework at the MMT is discussed. Performance of the existing automated system is evaluated and a preliminary upgraded automated control system is presented. Finally, user interfaces to the new HVAC system are discussed.
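
    An illustrative regression sketch in the spirit of the analysis above: fit a linear model that predicts a focus offset from temperature differences within the structure and across the primary mirror boundary layer. The data here are synthetic and the sensitivities are made up; only the modelling pattern is the point.

```python
# Least-squares fit of focus offset against two thermal predictors.
import numpy as np

rng = np.random.default_rng(2)
n = 500
dT_structure = rng.normal(0, 2.0, n)          # steel minus ambient [deg C]
dT_mirror_air = rng.normal(0, 1.0, n)         # glass minus adjacent air [deg C]
focus = 12.0 * dT_structure + 5.0 * dT_mirror_air + rng.normal(0, 3.0, n)  # [microns]

X = np.column_stack([dT_structure, dT_mirror_air, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, focus, rcond=None)
print("fitted sensitivities [um/degC] and offset:", coef.round(2))
```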

  6. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
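
    A hypothetical sketch of the fully automated client-side behaviour the specification enables: a facility controller receives a price/reliability signal and maps it to a preprogrammed load-shed level with no manual intervention. The signal fields and shed strategy below are illustrative assumptions, not the actual OpenADR data model.

```python
# Map a received demand-response signal level to a preprogrammed control action.
PRELOADED_STRATEGY = {          # signal level -> action
    "normal":   {"hvac_setpoint_offset_C": 0, "dim_lighting_pct": 0},
    "moderate": {"hvac_setpoint_offset_C": 1, "dim_lighting_pct": 20},
    "high":     {"hvac_setpoint_offset_C": 2, "dim_lighting_pct": 40},
}

def on_dr_event(signal: dict) -> dict:
    """Translate a received demand-response signal into a control action."""
    level = signal.get("level", "normal")
    action = PRELOADED_STRATEGY.get(level, PRELOADED_STRATEGY["normal"])
    print(f"event {signal.get('event_id')}: applying {action}")
    return action

# Example: a 'high' price/reliability signal pushed by the utility server.
on_dr_event({"event_id": "evt-001", "level": "high", "start": "2009-02-28T14:00"})
```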

  7. Coordination of engineering applications: Project summary

    Energy Technology Data Exchange (ETDEWEB)

    Cassidy, P.J.

    1996-08-31

    The purpose of this project was to focus on and coordinate several active engineering applications projects to optimize their integration. The end result of the project was to develop and demonstrate the capability of electronically receiving a part from the originating design agency, performing computer-aided engineering analyses, developing process plans, adding electronic input from numerous onsite systems, and producing an online operation sheet (manual) for viewing on a shop floor workstation. A successful demonstration of these applications was performed in December 1988.

  8. Building Automation Using Wired Communication.

    Directory of Open Access Journals (Sweden)

    Ms. Supriya Gund

    2014-04-01

    In this paper, we present the design and implementation of a building automation system in which LAN communication technology has been used. The paper mainly focuses on controlling home appliances remotely and providing security when the user is away from the premises. The system provides an ideal solution to problems faced by home owners in daily life. It provides security against intrusion as well as automating various home appliances using the LAN. To demonstrate the feasibility and effectiveness of the proposed system, devices such as a fire sensor, gas sensor, panic switch and intruder switch, along with a smartcard, have been developed and evaluated with the building automation system. These techniques are successfully merged in a single building automation system. The system offers a complete, low-cost, powerful and user-friendly way of real-time monitoring and remote control of a building.

  9. Evolution of Home Automation Technology

    Directory of Open Access Journals (Sweden)

    Mohd. Rihan

    2009-01-01

    In modern society home and office automation has become increasingly important, providing ways to interconnect various home appliances. This interconnection results in faster transfer of information within homes/offices, leading to better home management and improved user experience. Home Automation, in essence, is a technology that integrates various electrical systems of a home to provide enhanced comfort and security. Users are granted convenient and complete control over all the electrical home appliances and they are relieved from the tasks that previously required manual control. This paper tracks the development of home automation technology over the last two decades. Various home automation technologies have been explained briefly, giving a chronological account of the evolution of one of the most talked about technologies of recent times.

  10. Home automation with Intel Galileo

    CERN Document Server

    Dundar, Onur

    2015-01-01

    This book is for anyone who wants to learn Intel Galileo for home automation and cross-platform software development. No knowledge of programming with Intel Galileo is assumed, but knowledge of the C programming language is essential.

  11. Automating the Purple Crow Lidar

    Directory of Open Access Journals (Sweden)

    Hicks Shannon

    2016-01-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror’s movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  12. Network based automation for SMEs

    DEFF Research Database (Denmark)

    Shahabeddini Parizi, Mohammad; Radziwon, Agnieszka

    2017-01-01

    The implementation of appropriate automation concepts which increase productivity in Small and Medium Sized Enterprises (SMEs) requires a lot of effort, due to their limited resources. Therefore, it is strongly recommended for small firms to open up to external sources of knowledge, which ... automation solutions. The empirical data collection involved application of a combination of the comparative case study method with action research elements. This article provides an outlook over the challenges in implementing technological improvements and how they could be resolved in collaboration ... with other members of the same regional ecosystem. The findings highlight two main automation-related areas where manufacturing SMEs could leverage external sources of knowledge – these are assistance in defining the automation problem as well as appropriate solution and provider selection. Consequently...

  13. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  14. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    Science.gov (United States)

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc.

  15. Invariant manifolds and collective coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Papenbrock, T. [Centro Internacional de Ciencias, Cuernavaca, Morelos (Mexico); Institute for Nuclear Theory, University of Washington, Seattle, WA (United States); Seligman, T.H. [Centro Internacional de Ciencias, Cuernavaca, Morelos (Mexico); Centro de Ciencias Fisicas, University of Mexico (UNAM), Cuernavaca (Mexico)

    2001-09-14

    We introduce suitable coordinate systems for interacting many-body systems with invariant manifolds. These are Cartesian in coordinate and momentum space and chosen such that several components are identically zero for motion on the invariant manifold. In this sense these coordinates are collective. We make a connection to Zickendraht's collective coordinates and present certain configurations of few-body systems where rotations and vibrations decouple from single-particle motion. These configurations do not depend on details of the interaction. (author)

  16. Active and Reactive Power Coordinated Robust Optimization for Active Distribution Networks

    Institute of Scientific and Technical Information of China (English)

    王永杰; 吴文传; 张伯明; 鄂志君; 姚维平

    2016-01-01

    The output of photovoltaic (PV) power fluctuates strongly and is difficult to predict. In addition, the estimation errors of load demands are significant due to the limited number of real-time measurement devices in distribution networks. Therefore, conventional deterministic voltage control strategies cannot be fully applied to active distribution networks. Over-limit voltage is the main reason for PV curtailment or outage in distribution networks. This paper proposes a robust interval-voltage control strategy that considers the uncertainties of PV output and load demand. The strategy computes the optimal var compensation decisions together with the allowable active power output interval for each PV station. The proposed method guarantees system security while allowing the PV stations to operate in maximum power point tracking mode as much as possible. Finally, taking the modified IEEE 33-bus system as an example, the robustness of the proposed method is verified through Monte Carlo simulation.

  17. Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry

    Science.gov (United States)

    Gerard, Libby F.; Linn, Marcia C.

    2016-02-01

    Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.

  18. Evolution of Home Automation Technology

    OpenAIRE

    Mohd. Rihan; M. Salim Beg

    2009-01-01

    In modern society home and office automation has become increasingly important, providing ways to interconnect various home appliances. This interconnection results in faster transfer of information within homes/offices, leading to better home management and improved user experience. Home Automation, in essence, is a technology that integrates various electrical systems of a home to provide enhanced comfort and security. Users are granted convenient and complete control over all the electrical home appl...

  19. Technology modernization assessment flexible automation

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction, and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  20. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most actively studied branches of artificial intelligence: machine learning. Topics covered include inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning, and theoretical approaches to machine learning.

  1. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  2. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for the automatic recognition of supernovas as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every clear night, we must deal with varying atmospheric conditions and high background illumination from the moon. Software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  3. Multifunction automated crawling system

    Science.gov (United States)

    Bar-Cohen, Yoseph (Inventor); Joffe, Benjamin (Inventor); Backes, Paul Gregory (Inventor)

    1999-01-01

    The present invention is an automated crawling robot system including a platform, a first leg assembly, a second leg assembly, first and second rails attached to the platform, and an onboard electronic computer controller. The first leg assembly has an intermittent coupling device and the second leg assembly has an intermittent coupling device for intermittently coupling the respective first and second leg assemblies to a particular object. The first and second leg assemblies are slidably coupled to the rail assembly and are slidably driven by motors to thereby allow linear movement. In addition, the first leg assembly is rotary driven by a rotary motor to thereby provide rotary motion relative to the platform. To effectuate motion, the intermittent coupling devices of the first and second leg assemblies alternately couple the respective first and second leg assemblies to an object. This motion is done while simultaneously moving one of the leg assemblies linearly in the desired direction and preparing the next step. This arrangement allows the crawler of the present invention to traverse an object in a range of motion covering 360 degrees.

  4. Automated ISS Flight Utilities

    Science.gov (United States)

    Offermann, Jan Tuzlic

    2016-01-01

    During my internship at NASA Johnson Space Center, I worked in the Space Radiation Analysis Group (SRAG), where I was tasked with a number of projects focused on the automation of tasks and activities related to the operation of the International Space Station (ISS). As I worked on a number of projects, I have written short sections below to give a description of each, followed by more general remarks on the internship experience. My first project is titled "General Exposure Representation EVADOSE", also known as "GEnEVADOSE". This project involved the design and development of a C++/ROOT framework focused on radiation exposure for extravehicular activity (EVA) planning for the ISS. The utility helps mission managers plan EVAs by displaying information on the cumulative radiation doses that crew will receive during an EVA as a function of the egress time and duration of the activity. SRAG uses a utility called EVADOSE, employing a model of the space radiation environment in low Earth orbit to predict these doses, since astronauts outside the ISS have less shielding from charged particles such as electrons and protons. However, EVADOSE output is cumbersome to work with, and prior to GEnEVADOSE, querying data and producing graphs of ISS trajectories and cumulative doses versus egress time required manual work in Microsoft Excel. GEnEVADOSE automates all this work, reading in EVADOSE output file(s) along with a user-supplied plaintext file of input parameters. GEnEVADOSE outputs a text file containing all the necessary dosimetry for each proposed EVA egress time, for each specified EVADOSE file. It also plots cumulative dose versus egress time and the ISS trajectory, and displays all of this information in an auto-generated presentation made in LaTeX. New features have also been added, such as best-case scenarios (egress times corresponding to the least dose), interpolated curves for trajectories, and the ability to query any time in the
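
    As an illustration of the kind of post-processing GEnEVADOSE automates, the sketch below reads a plain-text table of egress times and cumulative doses and plots dose versus egress time, marking the best-case (least-dose) egress. The file name, format, and column layout are hypothetical assumptions made for illustration only; they are not the actual EVADOSE output format.

```python
# Hypothetical sketch of the post-processing described above: read a two-column
# text table of (egress time, cumulative dose) pairs and plot cumulative dose
# versus egress time. The file format and units are assumptions for illustration.
import numpy as np
import matplotlib.pyplot as plt

def load_dose_table(path):
    """Load a two-column text file: egress time [h], cumulative dose [mGy]."""
    data = np.loadtxt(path, comments="#")
    return data[:, 0], data[:, 1]

def plot_dose_vs_egress(path, eva_duration_h=6.5):
    egress_h, dose_mgy = load_dose_table(path)
    best = int(np.argmin(dose_mgy))       # best-case egress time = least dose
    plt.plot(egress_h, dose_mgy, label=f"EVA duration {eva_duration_h} h")
    plt.scatter([egress_h[best]], [dose_mgy[best]], marker="*", s=120,
                label=f"minimum dose at t = {egress_h[best]:.1f} h")
    plt.xlabel("Egress time [h]")
    plt.ylabel("Cumulative dose [mGy]")
    plt.legend()
    plt.savefig("dose_vs_egress.png")

if __name__ == "__main__":
    plot_dose_vs_egress("evadose_output.txt")  # hypothetical file name
```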

  5. Automated Gas Distribution System

    Science.gov (United States)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of the ion sources that supply the cyclotron with particles for acceleration. Using this machine involves a time-consuming and even wasteful step-by-step process of switching gases, purging, and other important operations that must be performed manually to keep the system functioning properly, while also trying to maintain the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual process. The new system can be controlled manually more easily than before but, like most of the technology and machines in the cyclotron now, is mainly operated through software developed in the graphical programming environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted when switching gases, and a port for the vacuum, to decrease the amount of time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  6. Genetic circuit design automation.

    Science.gov (United States)

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization.

  7. Automated sugar analysis

    Directory of Open Access Journals (Sweden)

    Tadeu Alcides MARQUES

    2016-03-01

    Full Text Available Sugarcane monosaccharides are reducing sugars, and classical analytical methodologies (Lane-Eynon, Benedict, complexometric-EDTA, Luff-Schoorl, Musson-Walker, Somogyi-Nelson) are based on reducing copper ions in alkaline solutions. In Brazil, certain factories use Lane-Eynon, others use the equipment referred to as “REDUTEC”, and additional factories analyze reducing sugars based on a mathematical model. The objective of this paper is to understand the relationship between variations in millivolts, mass, and reducing-sugar content during the analysis process. Another objective is to generate an automatic model for this process. The work herein uses the equipment referred to as “REDUTEC”, a digital balance, a peristaltic pump, a digital camcorder, math programs and graphics programs. We conclude that the millivolts, mass, and reducing-sugar content exhibit a good mathematical correlation, and the mathematical model generated was benchmarked against low concentrations of reducing sugars (<0.3%). Using the model created herein, reducing-sugar analyses can be automated using the new equipment.
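
    As a rough illustration of the kind of correlation the authors exploit, the sketch below fits a simple linear model relating millivolt readings and dispensed mass to reducing-sugar content. The calibration values and model form are placeholders, not data or the model from the paper.

```python
# Minimal sketch of a calibration of the kind described above: fit an empirical
# linear model relating the millivolt reading and titrated mass to the
# reducing-sugar content. All data values below are placeholders.
import numpy as np

# Placeholder calibration data: (millivolts, dispensed mass [g]) -> reducing sugars [%]
millivolts = np.array([212.0, 198.0, 185.0, 171.0, 160.0, 148.0])
mass_g     = np.array([1.10, 1.45, 1.80, 2.20, 2.60, 3.05])
rs_percent = np.array([0.05, 0.09, 0.13, 0.18, 0.22, 0.27])

# Least-squares fit of rs = a*mV + b*mass + c
X = np.column_stack([millivolts, mass_g, np.ones_like(mass_g)])
coef, *_ = np.linalg.lstsq(X, rs_percent, rcond=None)

def predict_rs(mv, mass):
    """Predict reducing-sugar content [%] from a millivolt reading and mass."""
    return coef[0] * mv + coef[1] * mass + coef[2]

print(predict_rs(190.0, 1.6))  # prediction for a new sample
```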

  8. Parmodel: a web server for automated comparative modeling of proteins.

    Science.gov (United States)

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models, as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software packages used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  9. Automated Guide Vehicles Dynamic Scheduling Based on Annealing Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Zou Gan

    2013-05-01

    Full Text Available Dispatching rules are the common approach to automated guided vehicle (AGV) scheduling in practice, but advance information about load arrivals is not used to optimize the performance of the automated guided vehicle system (AGVsS). According to the characteristics of the AGVsS, a mathematical model of AGV scheduling was established. A heuristic algorithm called the Annealing Genetic Algorithm (AGA) is presented to deal with the AGV scheduling problem, and the algorithm is applied dynamically by using it repeatedly under a combined rolling optimization strategy. The performance of the proposed approach for AGV scheduling was compared with dispatching rules by simulation. Results showed that the approach performs significantly better than the dispatching rules, proving that it is really effective for the AGVsS.
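
    The sketch below illustrates, under stated assumptions, how an annealing-genetic hybrid of this general kind might assign transport tasks to AGVs: a small genetic algorithm whose offspring are accepted with a simulated-annealing criterion, re-run on a rolling horizon whenever new load arrivals are announced. The cost model (makespan of a task-to-AGV assignment) and all parameters are illustrative, not those of the paper.

```python
# Hedged sketch of an annealing-genetic hybrid for assigning tasks to AGVs.
import random, math

def makespan(assignment, task_times, n_agvs):
    """Cost = latest finishing time over all AGVs for a task->AGV assignment."""
    load = [0.0] * n_agvs
    for task, agv in enumerate(assignment):
        load[agv] += task_times[task]
    return max(load)

def anneal_ga(task_times, n_agvs, pop=20, gens=200, t0=5.0, cooling=0.97):
    n = len(task_times)
    population = [[random.randrange(n_agvs) for _ in range(n)] for _ in range(pop)]
    temp = t0
    for _ in range(gens):
        population.sort(key=lambda a: makespan(a, task_times, n_agvs))
        parents = population[: pop // 2]
        children = []
        for _ in range(pop - len(parents)):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, n)
            child = p1[:cut] + p2[cut:]                         # one-point crossover
            child[random.randrange(n)] = random.randrange(n_agvs)  # mutation
            # simulated-annealing acceptance of the offspring against its parent
            delta = makespan(child, task_times, n_agvs) - makespan(p1, task_times, n_agvs)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                children.append(child)
            else:
                children.append(p1[:])
        population = parents + children
        temp *= cooling
    return min(population, key=lambda a: makespan(a, task_times, n_agvs))

# Rolling use: re-optimize whenever new load arrivals are announced.
best = anneal_ga(task_times=[4, 7, 3, 5, 6, 2, 8], n_agvs=3)
```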

  10. Generación automática de variantes de trayectorias aplicada al diseño óptimo bajo criterios múltiples de redes hidráulicas de abasto. // Automatic generation of trajectory variants applied to the optimal design of water supply systems under multiple criteria

    Directory of Open Access Journals (Sweden)

    J. R. Hechavarría Hernández

    2007-05-01

    Full Text Available Determining the most efficient trajectories for networks, installations, or transportation routes is a problem that motivates researchers in several engineering fields: computer science, civil, mechanical, hydraulic, etc. Solutions must be built on a high degree of information integration during the analysis and study of the task, on the application of modern methods of decision preparation and decision making, and on the rational organization of engineering calculation procedures. When defining trajectory layouts in engineering projects, specific conditions of the environment are taken into account. These trajectories, despite their different purposes, may coincide in certain zones and share limited spaces. For this reason, a system for the automatic generation of trajectory variants must consider the limitations of the available space when establishing the type and limiting dimensions. The article presents a procedure, supported by a software system, that automatically generates variants of closed trajectories which, depending on their service purpose, are optimized under efficiency criteria. Keywords: CAD system, trajectory generation, hydraulic network, optimal design.

  11. Optimization and Optimal Control

    CERN Document Server

    Chinchuluun, Altannar; Enkhbat, Rentsen; Tseveendorj, Ider

    2010-01-01

    During the last four decades there has been remarkable development in optimization and optimal control. Due to the wide variety of applications, many scientists and researchers have paid attention to the fields of optimization and optimal control. A huge number of new theoretical, algorithmic, and computational results have been reported in the last few years. This book gives the latest advances, and due to the rapid development of these fields, there are no other recent publications on the same topics. Key features: Provides a collection of selected contributions giving a state-of-the-art accou

  12. Optimally Stopped Optimization

    Science.gov (United States)

    Vinci, Walter; Lidar, Daniel A.

    2016-11-01

    We combine the fields of heuristic optimization and optimal stopping. We propose a strategy for benchmarking randomized optimization algorithms that minimizes the expected total cost for obtaining a good solution with an optimal number of calls to the solver. To do so, rather than letting the objective function alone define a cost to be minimized, we introduce a further cost-per-call of the algorithm. We show that this problem can be formulated using optimal stopping theory. The expected cost is a flexible figure of merit for benchmarking probabilistic solvers that can be computed when the optimal solution is not known and that avoids the biases and arbitrariness that affect other measures. The optimal stopping formulation of benchmarking directly leads to a real-time optimal-utilization strategy for probabilistic optimizers with practical impact. We apply our formulation to benchmark simulated annealing on a class of maximum-2-satisfiability (MAX2SAT) problems. We also compare the performance of a D-Wave 2X quantum annealer to the Hamze-Freitas-Selby (HFS) solver, a specialized classical heuristic algorithm designed for low-tree-width graphs. On a set of frustrated-loop instances with planted solutions defined on up to N =1098 variables, the D-Wave device is 2 orders of magnitude faster than the HFS solver, and, modulo known caveats related to suboptimal annealing times, exhibits identical scaling with problem size.
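
    A minimal sketch of the expected-total-cost idea, assuming a stand-in randomized solver: each call returns an objective value, the figure of merit is the expected best value after n calls plus a cost per call times n, and the benchmark picks the n that minimizes it. The solver model and the cost per call below are placeholders, not the annealers benchmarked in the paper.

```python
# Minimal sketch: E[total cost] = E[best objective after n calls] + c * n,
# minimized over the number of calls n. The "solver" is a stand-in random draw.
import numpy as np

rng = np.random.default_rng(0)

def solver_call(size):
    """Stand-in randomized solver: returns independent objective values."""
    return rng.gumbel(loc=1.0, scale=0.3, size=size)

def expected_total_cost(n_calls, cost_per_call, trials=20_000):
    samples = solver_call((trials, n_calls))
    return samples.min(axis=1).mean() + cost_per_call * n_calls

costs = {n: expected_total_cost(n, cost_per_call=0.02) for n in range(1, 41)}
n_star = min(costs, key=costs.get)
print(f"optimal number of calls: {n_star}, expected total cost: {costs[n_star]:.3f}")
```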

  13. Automated microinjection system for adherent cells

    Science.gov (United States)

    Youoku, Sachihiro; Suto, Yoshinori; Ando, Moritoshi; Ito, Akio

    2007-07-01

    We have developed an automated microinjection system that can handle more than 500 cells an hour. Microinjection injects foreign agents directly into cells using a micro-capillary. It can randomly introduce agents such as DNA, proteins and drugs into various types of cells. However, conventional methods require a skilled operator and suffer from low throughput. The new automated microinjection techniques we have developed consist of a Petri dish height measuring method and a capillary apex position measuring method. The dish surface height is measured by analyzing the images of cells that adhere to the dish surface. The contrast between the cell images is minimized when the focal plane of the objective lens coincides with the dish surface. We have developed an optimized focus-searching method with a height accuracy of +/-0.2 um. The capillary apex position detection method consists of three steps: rough, middle, and precise. These steps are employed sequentially to cover capillary displacements of up to +/-2 mm and to ultimately accomplish an alignment accuracy of less than one micron. Experimental results show that the system we developed can introduce fluorescent material (Alexa488) into adherent HEK293 cells with a success rate of 88.5%.
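
    The focus-search step lends itself to a short sketch: scan the focus axis, compute an image-contrast metric at each height, and take the height of minimum contrast as the dish surface, first coarsely and then finely. The camera/stage interface, metric, and step sizes below are hypothetical placeholders, not the authors' implementation.

```python
# Sketch of the focus-search idea: the contrast of adherent-cell images is lowest
# when the focal plane coincides with the dish surface, so scan z and minimize
# a contrast metric. `capture_at(z)` is a placeholder for the camera/stage API.
import numpy as np

def contrast(image):
    """Simple contrast metric: normalized standard deviation of pixel intensities."""
    img = image.astype(float)
    return img.std() / (img.mean() + 1e-9)

def find_dish_surface(capture_at, z_min, z_max, coarse_step=10.0, fine_step=0.2):
    """Two-pass (coarse then fine) search for the z position of minimum contrast [um]."""
    def best_z(z_values):
        scores = [contrast(capture_at(z)) for z in z_values]
        return z_values[int(np.argmin(scores))]
    z_coarse = best_z(np.arange(z_min, z_max, coarse_step))
    return best_z(np.arange(z_coarse - coarse_step, z_coarse + coarse_step, fine_step))
```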

  14. Automated full matrix capture for industrial processes

    Science.gov (United States)

    Brown, Roy H.; Pierce, S. Gareth; Collison, Ian; Dutton, Ben; Dziewierz, Jerzy; Jackson, Joseph; Lardner, Timothy; MacLeod, Charles; Morozov, Maxim

    2015-03-01

    Full matrix capture (FMC) ultrasound can be used to generate a permanent re-focusable record of data describing the geometry of a part; a valuable asset for an inspection process. FMC is a desirable acquisition mode for automated scanning of complex geometries, as it allows compensation for surface shape in post-processing and application of the total focusing method. However, automating the delivery of such FMC inspection remains a significant challenge for real industrial processes due to the high data overhead associated with the ultrasonic acquisition. The benefits of NDE delivery using six-axis industrial robots are well established when considering complex inspection geometries, but such an approach brings additional challenges to scanning speed and positional accuracy when combined with FMC inspection. This study outlines steps taken to optimize the scanning speed and data management of a process to scan the diffusion-bonded membrane of a titanium test plate. A system combining a KUKA robotic arm and a reconfigurable FMC phased-array controller is presented. The speed and data implications of different scanning methods are compared, and the impacts on data visualization quality are discussed with reference to this study. For the 0.5 m2 sample considered, typical acquisitions of 18 TB/m2 were measured for a triple back wall FMC acquisition, illustrating the challenge of combining high data throughput with acceptable scanning speeds.
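
    The data overhead driving this challenge follows from simple arithmetic: an FMC frame stores one A-scan per transmit/receive element pair at every scan position. The sketch below estimates the volume for assumed parameters, which are not the settings used in the study but land in the same order of magnitude.

```python
# Back-of-the-envelope estimate of FMC data volume. All values are illustrative
# assumptions, not the acquisition settings reported in the study.
elements          = 64        # phased-array elements
samples_per_ascan = 4000      # time samples per A-scan
bytes_per_sample  = 2         # 16-bit digitizer
positions_per_m2  = 1.0e6     # scan positions per square metre (1 mm grid)

bytes_per_position = elements**2 * samples_per_ascan * bytes_per_sample
terabytes_per_m2 = bytes_per_position * positions_per_m2 / 1e12
print(f"{terabytes_per_m2:.1f} TB per square metre")  # ~32.8 TB/m^2 for these values
```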

  15. An automated procedure for evaluating song imitation.

    Directory of Open Access Journals (Sweden)

    Yael Mandelblat-Cerf

    Full Text Available Songbirds have emerged as an excellent model system to understand the neural basis of vocal and motor learning. Like humans, songbirds learn to imitate the vocalizations of their parents or other conspecific "tutors." Young songbirds learn by comparing their own vocalizations to the memory of their tutor song, slowly improving until, over the course of several weeks, they can achieve an excellent imitation of the tutor. Because of the slow progression of vocal learning, and the large amounts of singing generated, automated algorithms for quantifying vocal imitation have become increasingly important for studying the mechanisms underlying this process. However, methodologies for quantifying song imitation are complicated by the highly variable songs of either juvenile birds or those that learn poorly because of experimental manipulations. Here we present a method for the evaluation of song imitation that incorporates two innovations: first, an automated procedure for selecting pupil song segments; and second, a new algorithm, implemented in Matlab, for computing both song acoustic and sequence similarity. We tested our procedure using zebra finch song and determined a set of acoustic features for which the algorithm optimally differentiates between similar and non-similar songs.
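
    As a hedged illustration of how an acoustic-similarity score of this general kind can be computed (a stand-in, not the Matlab algorithm of the paper), the sketch below compares frame-wise log-spectral features of a pupil and a tutor segment and averages the best local matches.

```python
# Illustrative acoustic-similarity score between a pupil and a tutor song segment:
# frame-wise log-magnitude spectra, cosine similarity, mean of best local matches.
import numpy as np

def spectral_frames(signal, frame=512, hop=256):
    """Log-magnitude spectrogram frames of a 1-D signal."""
    window = np.hanning(frame)
    frames = [signal[i:i + frame] * window
              for i in range(0, len(signal) - frame, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames), axis=1))
    return np.log1p(spec)

def similarity(pupil, tutor):
    """Mean best-match cosine similarity of pupil frames against tutor frames."""
    p, t = spectral_frames(pupil), spectral_frames(tutor)
    p /= np.linalg.norm(p, axis=1, keepdims=True) + 1e-12
    t /= np.linalg.norm(t, axis=1, keepdims=True) + 1e-12
    return float((p @ t.T).max(axis=1).mean())  # in [0, 1], higher = more similar
```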

  16. Repairing Multiple Failures with Coordinated and Adaptive Regenerating Codes

    CERN Document Server

    Kermarrec, Anne-Marie; Scouarnec, Nicolas Le

    2011-01-01

    Erasure correcting codes are widely used to ensure data persistence in distributed storage systems. This paper addresses the repair of such codes in the presence of simultaneous failures. It is crucial to maintain the required redundancy over time to prevent permanent data losses. We go beyond existing work (i.e., regenerating codes by Dimakis et al.) and propose coordinated regenerating codes allowing devices to coordinate during simultaneous repairs thus reducing the costs further. We provide closed form expressions of the communication costs of our new codes depending on the number of live devices and the number of devices being repaired. We prove that deliberately delaying repairs does not bring additional gains in itself. This means that regenerating codes are optimal as long as each failure can be repaired before a second one occurs. Yet, when multiple failures are detected simultaneously, we prove that our coordinated regenerating codes are optimal and outperform uncoordinated repairs (with respect to ...
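
    For context, and not as one of the coordinated-repair expressions derived in the paper, the single-failure repair bandwidth at the minimum-storage point of the regenerating codes of Dimakis et al. (which the coordinated codes above generalize to simultaneous repairs) can be written as

    \[
      \gamma_{\mathrm{MSR}} = d\,\beta = \frac{d\,M}{k\,(d-k+1)},
    \]

    where \(M\) is the size of the stored file, \(k\) the number of nodes sufficient to reconstruct it, \(d\) the number of live helper nodes contacted during a repair, and \(\beta\) the amount of data downloaded from each helper.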

  17. Data-assimilation by delay-coordinate nudging

    CERN Document Server

    Pazó, D; López, J M

    2015-01-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time-step. Numerical experiments with a low order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, also when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the easiness of implementation, the intuitive functioning and the reduced computational cost of the standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit ...
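
    A minimal sketch of the idea on a low-order chaotic system (Lorenz-63), with all coefficients and the delay left un-tuned assumptions: standard nudging forces the model toward the current observation, while the delay-coordinate variant adds a forcing term built from a past observation and the model's past state. This illustrates the mechanism only, not the authors' implementation.

```python
# Illustrative delay-coordinate nudging on Lorenz-63 with Euler time-stepping.
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def assimilate(truth_obs, dt=0.01, g_now=5.0, g_delay=2.0, delay_steps=5):
    """Integrate the model nudged by present and delayed x-observations.
    `truth_obs[k]` is the observed x-coordinate at step k (observations assumed
    available at every step for simplicity)."""
    state = np.array([1.0, 1.0, 1.0])
    history = []
    for k in range(len(truth_obs)):
        forcing = g_now * (truth_obs[k] - state[0])          # standard nudging term
        if k >= delay_steps:                                  # delay-coordinate term
            forcing += g_delay * (truth_obs[k - delay_steps] - history[k - delay_steps][0])
        state = state + dt * lorenz63(state) + dt * np.array([forcing, 0.0, 0.0])
        history.append(state.copy())
    return np.array(history)
```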

  18. A Care Coordination Program for Substance-Exposed Newborns

    Science.gov (United States)

    Twomey, Jean E.; Caldwell, Donna; Soave, Rosemary; Fontaine, Lynne Andreozzi; Lester, Barry M.

    2011-01-01

    The Vulnerable Infants Program of Rhode Island (VIP-RI) was established as a care coordination program to promote permanency for substance-exposed newborns in the child welfare system. Goals of VIP-RI were to optimize parents' opportunities for reunification and increase the efficacy of social service systems involved with families affected by…

  19. Distributed coordination of energy storage with distributed generators

    NARCIS (Netherlands)

    Yang, Tao; Wu, Di; Stoorvogel, Anton A.; Stoustrup, Jakob

    2016-01-01

    With a growing emphasis on energy efficiency and system flexibility, a great effort has been made recently in developing distributed energy resources (DER), including distributed generators and energy storage systems. This paper first formulates an optimal DER coordination problem considering constr

  20. Rowing Crew Coordination Dynamics at Increasing Stroke Rates

    NARCIS (Netherlands)

    Cuijpers, Laura S.; Zaal, Frank T. J. M.; de Poel, Harjo J.

    2015-01-01

    In rowing, perfect synchronisation is important for optimal performance of a crew. Remarkably, a recent study on ergometers demonstrated that antiphase crew coordination might be mechanically more efficient by reducing the power lost to within-cycle velocity fluctuations of the boat. However, couple