WorldWideScience

Sample records for automated optimal coordination

  1. Optimal Control and Coordination of Connected and Automated Vehicles at Urban Traffic Intersections

    Energy Technology Data Exchange (ETDEWEB)

Zhang, Yue J. [Boston University]; Malikopoulos, Andreas [ORNL]; Cassandras, Christos G. [Boston University]

    2016-01-01

We address the problem of coordinating online a continuous flow of connected and automated vehicles (CAVs) crossing two adjacent intersections in an urban area. We present a decentralized optimal control framework whose solution yields for each vehicle the optimal acceleration/deceleration at any time in the sense of minimizing fuel consumption. The solution, when it exists, allows the vehicles to cross the intersections without the use of traffic lights, without creating congestion on the connecting road, and under the hard safety constraint of collision avoidance. The effectiveness of the proposed solution is validated through simulation of two intersections located in downtown Boston, and it is shown that coordination of CAVs can significantly reduce both fuel consumption and travel time.
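The control structure behind such frameworks can be sketched compactly. Minimizing the integral of u(t)^2/2 (a common fuel-consumption surrogate, assumed here; the paper's exact cost may differ) subject to fixed entry/exit position, speed, and an assigned crossing time yields, by Pontryagin's minimum principle, an acceleration that is linear in time, u(t) = a*t + b. The boundary values below are illustrative, not from the paper.

```python
def optimal_profile(p0, v0, pf, vf, T):
    """Return (a, b) of the effort-optimal acceleration u(t) = a*t + b."""
    # Integrating u(t) twice: v(t) = a t^2/2 + b t + v0,
    #                         p(t) = a t^3/6 + b t^2/2 + v0 t + p0.
    # Imposing v(T) = vf and p(T) = pf gives a 2x2 linear system in (a, b).
    r1 = vf - v0              # = a T^2/2 + b T
    r2 = pf - p0 - v0 * T     # = a T^3/6 + b T^2/2
    det = T**4 / 12.0         # determinant of the 2x2 system matrix
    a = (r1 * T**2 / 2 - r2 * T) / det
    b = (r2 * T**2 / 2 - r1 * T**3 / 6) / det
    return a, b

# Hypothetical vehicle: enter at 10 m/s, cross 300 m in 20 s, exit at 10 m/s.
a, b = optimal_profile(p0=0.0, v0=10.0, pf=300.0, vf=10.0, T=20.0)

# Verify the boundary conditions by direct integration of u(t) = a*t + b.
vT = a * 20.0**2 / 2 + b * 20.0 + 10.0
pT = a * 20.0**3 / 6 + b * 20.0**2 / 2 + 10.0 * 20.0
```

The decentralized framework then layers intersection-crossing and rear-end safety constraints on top of this unconstrained solution.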

  2. Nonparametric variational optimization of reaction coordinates

    Energy Technology Data Exchange (ETDEWEB)

Banushkina, Polina V.; Krivov, Sergei V., E-mail: s.krivov@leeds.ac.uk [Astbury Center for Structural Molecular Biology, Faculty of Biological Sciences, University of Leeds, Leeds LS2 9JT (United Kingdom)]

    2015-11-14

State-of-the-art realistic simulations of complex atomic processes commonly produce trajectories of large size, making the development of automated analysis tools very important. A popular approach to extracting dynamical information consists of projecting these trajectories onto optimally selected reaction coordinates or collective variables. For equilibrium dynamics between any two boundary states, the committor function, also known as the folding probability in protein folding studies, is often considered the optimal coordinate. To determine it, one selects a functional form with many parameters and trains it on the trajectories using various criteria. A major problem with such an approach is that a poor initial choice of the functional form may lead to sub-optimal results. Here, we describe an approach that allows one to optimize the reaction coordinate without selecting its functional form, thus avoiding this source of error.
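The committor mentioned above has a simple operational definition that a toy simulation makes concrete: q(x) is the probability that a trajectory started at x reaches boundary state B before boundary state A. For an unbiased 1-D random walk on {0, ..., N} with A = 0 and B = N (an illustrative model, not the paper's systems), the exact committor is q(i) = i/N, which the estimate below reproduces.

```python
import random

def estimate_committor(N=10, start=3, n_traj=20000, seed=1):
    """Fraction of walks from `start` that hit N before 0."""
    random.seed(seed)
    hits_B = 0
    for _ in range(n_traj):
        x = start
        while 0 < x < N:          # run until a boundary state is reached
            x += random.choice((-1, 1))
        hits_B += (x == N)
    return hits_B / n_traj

q = estimate_committor()
# exact committor of the unbiased walk at i = 3 is 3/10
```

Nonparametric optimization of a trial coordinate, as in the paper, aims to recover this q without ever choosing a functional form for it.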

  3. Nonparametric variational optimization of reaction coordinates

    Science.gov (United States)

    Banushkina, Polina V.; Krivov, Sergei V.

    2015-11-01

State-of-the-art realistic simulations of complex atomic processes commonly produce trajectories of large size, making the development of automated analysis tools very important. A popular approach to extracting dynamical information consists of projecting these trajectories onto optimally selected reaction coordinates or collective variables. For equilibrium dynamics between any two boundary states, the committor function, also known as the folding probability in protein folding studies, is often considered the optimal coordinate. To determine it, one selects a functional form with many parameters and trains it on the trajectories using various criteria. A major problem with such an approach is that a poor initial choice of the functional form may lead to sub-optimal results. Here, we describe an approach that allows one to optimize the reaction coordinate without selecting its functional form, thus avoiding this source of error.

  4. Optimized coordinates for anharmonic vibrational structure theories.

    Science.gov (United States)

    Yagi, Kiyoshi; Keçeli, Murat; Hirata, So

    2012-11-28

A procedure to determine optimal vibrational coordinates is developed on the basis of an earlier idea of Thompson and Truhlar [J. Chem. Phys. 77, 3031 (1982)]. For a given molecule, these coordinates are defined as the unitary transform of the normal coordinates that minimizes the energy of the vibrational self-consistent-field (VSCF) method for the ground state. They are justified by the fact that VSCF in these coordinates becomes exact in two limiting cases: harmonic oscillators, where the optimized coordinates are normal, and noninteracting anharmonic oscillators, in which the optimized coordinates are localized on individual oscillators. A robust and general optimization algorithm is developed, which decomposes the transformation matrix into a product of Jacobi matrices, determines the rotation angle of each Jacobi matrix that minimizes the energy, and iterates the process until a minimum in the whole high dimension is reached. It is shown that the optimized coordinates are neither entirely localized nor entirely delocalized (or normal) in any of the molecules examined (water, the water dimer, and ethylene), apart from the aforementioned limiting cases. Rather, high-frequency stretching modes tend to be localized, whereas low-frequency skeletal vibrations remain normal. On the basis of these coordinates, we introduce two new vibrational structure methods: optimized-coordinate VSCF (oc-VSCF) and optimized-coordinate vibrational configuration interaction (oc-VCI). For the modes that become localized, oc-VSCF is found to outperform VSCF, whereas, for both classes of modes, oc-VCI exhibits much more rapid convergence than VCI with respect to the rank of excitations. We propose a rational configuration selection for oc-VCI when the optimized coordinates are localized. The use of the optimized coordinates in VCI with this configuration selection scheme reduces the mean absolute errors in the frequencies of the fundamentals and the first overtones

  5. Hybrid Optimized and Localized Vibrational Coordinates.

    Science.gov (United States)

    Klinting, Emil Lund; König, Carolin; Christiansen, Ove

    2015-11-01

We present a new type of vibrational coordinates denoted hybrid optimized and localized coordinates (HOLCs) aiming at a good set of rectilinear vibrational coordinates supporting fast convergence in vibrational structure calculations. The HOLCs are obtained as a compromise between the recently promoted optimized coordinates (OCs) and localized coordinates (LCs). The three sets of coordinates are generally different from each other and differ from standard normal coordinates (NCs) as well. In determining the HOLCs, we optimize the vibrational self-consistent field (VSCF) energy with respect to orthogonal transformation of the coordinates, which is similar to determining OCs but for HOLCs we additionally introduce a penalty for delocalization, by using a measure of localization similar to that employed in determining LCs. The same theory and implementation covers OCs, LCs, and HOLCs. It is shown that varying one penalty parameter allows for connecting OCs and LCs. The HOLCs are compared to NCs, OCs, and LCs in their nature and performance as basis for vibrational coupled cluster (VCC) response calculations of vibrational anharmonic energies for a small set of simple systems comprising water, formaldehyde, and ethylene. It is found that surprisingly good results can be obtained with HOLCs by using potential energy surfaces as simple as quadratic Taylor expansions. Quite similar coordinates are found for the already established OCs but obtaining these OCs requires much more elaborate and expensive potential energy surfaces and localization is generally not guaranteed. The ability to compute HOLCs for somewhat larger systems is demonstrated for coumarin and the alanine quadramer. The good agreement between HOLCs and OCs, together with the much easier applicability of HOLCs for larger systems, suggests that HOLCs may be a pragmatically very interesting option for anharmonic calculations on medium to large molecular systems.

  6. Automated selection of LEDs by luminance and chromaticity coordinate

    CERN Document Server

    Fischer, Ulrich H P; Reinboth, Christian

    2010-01-01

The increased use of LEDs for lighting purposes has led to the development of numerous applications requiring a pre-selection of LEDs by their luminance and/or their chromaticity coordinate. This paper demonstrates how a manual pre-selection process can be realized using a relatively simple configuration. Since a manual selection service can only be commercially viable as long as only small quantities of LEDs need to be sorted, an automated solution suggests itself. This paper introduces such a solution, which has been developed by Harzoptics in close cooperation with Rundfunk Gernrode. The paper also discusses current challenges in measurement technology as well as market trends.

  7. Optimal Coordination of Automatic Line Switches for Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jyh-Cherng Gu

    2012-04-01

For the Taiwan Power Company (Taipower), the margins of coordination times between the lateral circuit breakers (LCB) of underground 4-way automatic line switches and the protection equipment of high voltage customers are often too small. This could lead to sympathy tripping by the feeder circuit breaker (FCB) of the distribution feeder and create difficulties in protection coordination between upstream and downstream protection equipment, identification of faults, and restoration operations. In order to solve the problem, it is necessary to reexamine the protection coordination between LCBs and high voltage customers' protection equipment, and between LCBs and FCBs, in order to bring forth new proposals for settings and operations. This paper applies linear programming to optimize the coordination of protection devices, and proposes new time current curves (TCCs) for the overcurrent (CO) and low-energy overcurrent (LCO) relays used in normally open distribution systems by performing simulations in the Electrical Transient Analyzer Program (ETAP) environment. The simulation results show that the new TCCs solve the coordination problems among high voltage customer, lateral, feeder, bus-interconnection, and distribution transformer protection devices. The new proposals also satisfy the requirements of Taipower on protection coordination of the distribution feeder automation system (DFAS). Finally, the authors believe that the system configuration, operation experience, and relevant criteria mentioned in this paper may serve as valuable references for other companies or utilities when building a DFAS of their own.
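The coordination problem described above has a simple structure worth sketching: with a standard inverse-time characteristic t = TDS * K / (M^a - 1) (M being the fault-current multiple of pickup), operating time is linear in the time dial setting TDS, so on a radial chain the minimal-time solution sets the most downstream relay as fast as allowed and each backup one coordination time interval (CTI) slower. The IEC standard-inverse constants are real; the fault-current multiples, TDS floor, and CTI below are illustrative assumptions, not Taipower settings.

```python
K, A = 0.14, 0.02  # IEC standard-inverse characteristic constants

def op_time(tds, M):
    """Relay operating time (s) at fault-current multiple M."""
    return tds * K / (M**A - 1)

def min_tds_for_time(t, M):
    """Smallest TDS giving operating time >= t at multiple M."""
    return t * (M**A - 1) / K

def coordinate(multiples, tds_min=0.05, cti=0.3):
    """multiples[i]: fault multiple seen by relay i (0 = most downstream).
    Returns each relay's minimal TDS, downstream first."""
    tds_out, t_prev = [], None
    for M in multiples:
        if t_prev is None:
            tds = tds_min                      # primary relay: as fast as allowed
        else:
            # backup relay must be at least CTI slower than its downstream relay
            tds = max(tds_min, min_tds_for_time(t_prev + cti, M))
        t_prev = op_time(tds, M)
        tds_out.append(tds)
    return tds_out

multiples = [10.0, 8.0, 6.0]                   # hypothetical fault multiples
tds = coordinate(multiples)
times = [op_time(s, M) for s, M in zip(tds, multiples)]
```

An LP formulation, as used in the paper, generalizes this greedy chain to meshed primary/backup pairs by minimizing total operating time subject to all CTI constraints simultaneously.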

  8. Optimal coordinated voltage control of power systems

    Institute of Scientific and Technical Information of China (English)

    LI Yan-jun; HILL David J.; WU Tie-jun

    2006-01-01

An immune algorithm solution is proposed in this paper to deal with the problem of optimal coordination of local, physically based controllers in order to preserve or retain mid- and long-term voltage stability. This is in fact a global coordination control problem which involves not only sequencing and timing different control devices but also tuning the parameters of controllers. A multi-stage coordinated control scheme is presented, aiming at retaining good voltage levels with minimal control efforts and costs after severe disturbances in power systems. A self-pattern-recognized vaccination procedure is developed to transfer effective heuristic information into the new generation of solution candidates to speed up the convergence of the search procedure to global optima. A four-bus power system case study is investigated to show the effectiveness and efficiency of the proposed algorithm, compared with several existing approaches such as differential dynamic programming and tree search.

  9. Optimizing Vibrational Coordinates To Modulate Intermode Coupling.

    Science.gov (United States)

    Zimmerman, Paul M; Smereka, Peter

    2016-04-12

The choice of coordinate system strongly affects the convergence properties of vibrational structure computations. Two methods for efficient generation of improved vibrational coordinates are presented and justified by analysis of a model anharmonic two-mode Hessian and numerical computations on polyatomic molecules. To produce optimal coordinates, metrics which quantify off-diagonal couplings over a grid of Hessian matrices are minimized through unitary rotations of the vibrational basis. The first proposed metric minimizes the total squared off-diagonal coupling, and the second minimizes the total squared change in off-diagonal coupling. In this procedure certain anharmonic modes tend to localize, for example X-H stretches. The proposed methods do not rely on prior fitting of the potential energy, vibrational structure computations, or localization metrics, so they are unique from previous vibrational coordinate generation algorithms and are generally applicable to polyatomic molecules. Fitting the potential to the approximate n-mode representation in the optimized bases for all-trans polyenes shows that off-diagonal anharmonic couplings are substantially reduced by the new choices of coordinate system. Convergence of vibrational energies is examined in detail for ethylene, and it is shown that coupling-optimized modes converge in vibrational configuration interaction computations to within 1 cm⁻¹ using only 3-mode couplings, where normal modes require 4-mode couplings for convergence. Comparison of the vibrational configuration interaction convergence with respect to excitation level for the two proposed metrics shows that minimization of the total off-diagonal coupling is most effective for low-cost vibrational structure computations.
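For a single pair of modes and a single symmetric coupling matrix, minimizing the squared off-diagonal coupling by an orthogonal rotation of the basis reduces to the classic Jacobi rotation, which the sketch below works out with made-up numbers (the paper's metrics average such couplings over a grid of Hessians, which this toy does not do).

```python
import math

def jacobi_angle(h00, h01, h11):
    """Rotation angle zeroing the off-diagonal of [[h00, h01], [h01, h11]]."""
    if h00 == h11:
        return math.pi / 4
    # condition (c^2 - s^2) h01 = c s (h00 - h11)  =>  tan(2t) = 2 h01 / (h00 - h11)
    return 0.5 * math.atan2(2 * h01, h00 - h11)

def rotate(h00, h01, h11, theta):
    """Return elements of H' = R^T H R with R = [[c, -s], [s, c]]."""
    c, s = math.cos(theta), math.sin(theta)
    new00 = c * c * h00 + 2 * c * s * h01 + s * s * h11
    new11 = s * s * h00 - 2 * c * s * h01 + c * c * h11
    new01 = (c * c - s * s) * h01 + c * s * (h11 - h00)
    return new00, new01, new11

theta = jacobi_angle(2.0, 0.8, 1.0)            # illustrative 2-mode coupling
_, off, _ = rotate(2.0, 0.8, 1.0, theta)
# the off-diagonal coupling is (numerically) zero after the optimal rotation
```

Sweeping such pairwise rotations over all mode pairs, and summing the metric over the Hessian grid, gives the iterative minimization described in the abstract.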

  10. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  11. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2013-12-23

While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  13. Optimization based automated curation of metabolic reconstructions

    Directory of Open Access Journals (Sweden)

    Maranas Costas D

    2007-06-01

Background: Currently, there exist tens of different microbial and eukaryotic metabolic reconstructions (e.g., Escherichia coli, Saccharomyces cerevisiae, Bacillus subtilis), with many more under development. All of these reconstructions are inherently incomplete, with some functionalities missing due to the lack of experimental and/or homology information. A key challenge in the automated generation of genome-scale reconstructions is the elucidation of these gaps and the subsequent generation of hypotheses to bridge them. Results: In this work, an optimization-based procedure is proposed to identify and eliminate network gaps in these reconstructions. First we identify the metabolites in the metabolic network reconstruction which cannot be produced under any uptake conditions, and subsequently we identify the reactions from a customized multi-organism database that restore the connectivity of these metabolites to the parent network. This connectivity restoration is hypothesized to take place through four mechanisms: (a) reversing the directionality of one or more reactions in the existing model, (b) adding reactions from another organism to provide functionality absent in the existing model, (c) adding external transport mechanisms to allow for importation of metabolites into the existing model, and (d) restoring flow by adding intracellular transport reactions in multi-compartment models. We demonstrate this procedure for the genome-scale reconstructions of Escherichia coli and Saccharomyces cerevisiae, wherein compartmentalization of intracellular reactions results in a more complex topology of the metabolic network. We determine that about 10% of metabolites in E. coli and 30% of metabolites in S. cerevisiae cannot carry any flux. Interestingly, the dominant flow restoration mechanism is directionality reversals of existing reactions in the respective models. Conclusion: We have proposed systematic methods to identify and
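The first step of such a gap-finding procedure, identifying metabolites that cannot be produced from the uptake set, can be sketched as a fixed-point producibility check (the paper's actual formulation is flux-based; this reachability version and the toy network are simplifying assumptions).

```python
def blocked_metabolites(reactions, uptake):
    """reactions: list of (substrate_set, product_set) pairs; uptake: seed set.
    A metabolite is producible if some reaction whose substrates are all
    producible yields it; everything left over is 'blocked'."""
    producible = set(uptake)
    changed = True
    while changed:
        changed = False
        for subs, prods in reactions:
            if subs <= producible and not prods <= producible:
                producible |= prods
                changed = True
    all_mets = set().union(*(s | p for s, p in reactions)) | set(uptake)
    return all_mets - producible

toy = [
    ({"glc"}, {"g6p"}),   # glucose -> G6P
    ({"g6p"}, {"pyr"}),   # G6P -> pyruvate
    ({"x"}, {"y"}),       # "x" is never produced, so "y" is blocked too
]
gaps = blocked_metabolites(toy, uptake={"glc"})
# gaps == {"x", "y"}: candidates for gap-filling, e.g. by adding a transport
# reaction for x or reversing a reaction elsewhere in the network
```

The optimization step of the paper then searches a multi-organism reaction database for the cheapest set of additions or reversals that reconnects each blocked metabolite.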

  14. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterprise networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state-of-the-art of managing firewalls systematically in both research and application domains. Chapters explore set-theory, managing firewall configuration globally and consistently, access control list with encryption, and authentication such as IPSec policies. The author

  15. Agent Technology Application in Automating the Coordination and Decision-Making in Supply Chain

    Institute of Scientific and Technical Information of China (English)

    JIE Hui; JI Jian-hua

    2005-01-01

Coordinating all the activities among all the parties involved in a supply chain can be a daunting task. This paper puts forth the viewpoint of applying agent technology to automate the coordination and decision-making tasks in a typical home PC industry supply chain. The main features of the proposed approach, which differentiate it from existing approaches, address the processes and issues faced by parties in the supply chain. A prototype and the overall process flow are also described.

  16. Automated global optimization of commercial SAGD operations

    International Nuclear Information System (INIS)

The economic optimization of steam assisted gravity drainage (SAGD) operations has been largely conducted through the use of simulations to identify optimal steam use approaches. In this study, the cumulative steam to oil ratio (CSOR) was optimized by altering the steam injection pressure throughout the evolution of the process in a detailed, 3-D reservoir model. A generic Athabasca simulation model was used along with a thermal reservoir simulator which used a corner point grid. A line heater was specified in the grid cells containing the well bores to mimic steam circulation. During heating, the injection and production locations were allowed to produce reservoir fluids from the reservoir to relieve pressure associated with the thermal expansion of oil sand. After steam circulation, the well bores were switched to an SAGD operation. At the producer well, the operating constraint imposed a maximum difference of 5 degrees C between the saturation temperature corresponding to the pressure of the fluids and the temperature in the wellbore. At the injection well, the steam injection pressure was specified according to the optimizer. A response surface was constructed by fitting the parameter sets and corresponding cost functions to a biquadratic function. After the minimum of the cost function was determined, a new set of parameters was selected to complete the iterations. Results indicated that optimization of SAGD is feasible with complex and detailed reservoir models by using parallel calculations. The general trend determined by the optimization algorithm developed in the research indicated that before the steam chamber contacts the overburden, the operating pressure should be relatively high. After contact is made, the injection pressure should be lowered to reduce heat losses. 17 refs., 1 tab., 5 figs
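The response-surface step described above (fit the sampled cost function to a low-order polynomial, jump to its minimum, resample) can be sketched in one dimension. The quadratic cost model standing in for CSOR-versus-pressure below is synthetic; a real study would run a reservoir simulation at each sample point.

```python
def true_cost(p):
    """Hypothetical CSOR as a function of injection pressure p (MPa)."""
    return 0.02 * (p - 2.5) ** 2 + 3.0

def quad_vertex(p0, p1, p2, f):
    """Fit a quadratic exactly through three sample points and return its vertex."""
    y0, y1, y2 = f(p0), f(p1), f(p2)
    # Lagrange interpolation -> coefficients a, b of a*p^2 + b*p + c
    a = (y0 / ((p0 - p1) * (p0 - p2))
         + y1 / ((p1 - p0) * (p1 - p2))
         + y2 / ((p2 - p0) * (p2 - p1)))
    b = (-y0 * (p1 + p2) / ((p0 - p1) * (p0 - p2))
         - y1 * (p0 + p2) / ((p1 - p0) * (p1 - p2))
         - y2 * (p0 + p1) / ((p2 - p0) * (p2 - p1)))
    return -b / (2 * a)          # vertex of the fitted quadratic

p_opt = quad_vertex(1.0, 2.0, 4.0, true_cost)
# for an exactly quadratic cost the fitted vertex is the true optimum, 2.5 MPa
```

With a noisy, non-quadratic simulator response, the fit-and-jump step would be iterated, resampling around each new candidate minimum, as the study describes for its biquadratic surface.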

  17. Controller Design Automation for Aeroservoelastic Design Optimization of Wind Turbines

    NARCIS (Netherlands)

    Ashuri, T.; Van Bussel, G.J.W.; Zaayer, M.B.; Van Kuik, G.A.M.

    2010-01-01

    The purpose of this paper is to integrate the controller design of wind turbines with structure and aerodynamic analysis and use the final product in the design optimization process (DOP) of wind turbines. To do that, the controller design is automated and integrated with an aeroelastic simulation t

  18. Coordination and Emergence in the Cellular Automated Fashion Game

    CERN Document Server

    Cao, Zhigang; Qu, Xinglong; Yang, Mingmin; Yang, Xiaoguang

    2012-01-01

We investigate a heterogeneous cellular automaton where there are two types of agents, conformists and rebels. Each agent has to choose between two actions, 0 and 1. A conformist likes to choose an action that most of her neighbors choose, while in contrast a rebel wants to differ from most of her neighbors. Theoretically, this model is equivalent to the matching pennies game on regular networks. We study the dynamical process by assuming that each agent follows a myopic updating rule. A uniform updating probability is also introduced for each agent to study the whole spectrum from synchronous updating to asynchronous updating. Our model characterizes the phenomenon of fashion very well and has great potential in the study of finance and stock markets. A large number of simulations show that in most cases agents can reach an extraordinarily high degree of coordination. This process is also quite fast and steady. Considering that these dynamics are really simple, agents are selfish, myopic, and have ve...
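A minimal version of these dynamics fits in a few lines: agents on a ring, each a conformist ("C") or rebel ("R"), synchronously best-responding to their two neighbors (ties keep the current action). The ring size, tie-breaking rule, and satisfaction measure below are illustrative choices, not the paper's exact protocol.

```python
import random

def step(actions, types):
    """One synchronous round of myopic best response on a ring."""
    n, new = len(actions), []
    for i in range(n):
        ones = actions[(i - 1) % n] + actions[(i + 1) % n]
        if ones == 1:                  # neighbors split: keep current action
            target = actions[i]
        else:
            target = 1 if ones == 2 else 0
        new.append(target if types[i] == "C" else 1 - target)
    return new

def satisfied_fraction(actions, types):
    """Fraction of agents matching (C) / differing from (R) >= half their neighbors."""
    n, ok = len(actions), 0
    for i in range(n):
        same = sum(actions[j % n] == actions[i] for j in (i - 1, i + 1))
        ok += (same if types[i] == "C" else 2 - same) >= 1
    return ok / n

random.seed(0)
n = 30
types = ["C"] * n                      # all-conformist ring for a clean demo
acts = [random.randint(0, 1) for _ in range(n)]
for _ in range(200):
    acts = step(acts, types)
frac = satisfied_fraction(acts, types)
# conformist-only dynamics settle into blocks where every agent is satisfied
```

Mixing in rebels (type "R") produces the frustrated, fashion-like cycling the paper analyzes, since a rebel and a conformist neighbor cannot both be satisfied at a matching-pennies-style boundary.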

  19. Optimal Protection Coordination for Microgrid under Different Operating Modes

    OpenAIRE

    Ming-Ta Yang; Li-Feng Chang

    2013-01-01

Significant consequences result when a microgrid is connected to a distribution system. This study discusses the impacts of bolted three-phase faults and bolted single line-to-ground faults on the protection coordination of a distribution system connected to a microgrid which operates in utility-only mode or in grid-connected mode. Power system simulation software is used to build the test system. The linear programming method is applied to optimize the coordination of relays, and the rel...

  20. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  1. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  2. Optimal deadlock avoidance Petri net supervisors for automated manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Keyi XING; Feng TIAN; Xiaojun YANG

    2007-01-01

Deadlock avoidance problems are investigated for automated manufacturing systems with flexible routings. Based on the Petri net models of the systems, this paper proposes, for the first time, the concept of perfect maximal resource-transition circuits and their saturated states. The concept facilitates the development of system liveness characterization and deadlock avoidance Petri net supervisors. Deadlock is characterized as some perfect maximal resource-transition circuits reaching their saturated states. For a large class of manufacturing systems, which do not contain center resources, the optimal deadlock avoidance Petri net supervisors are presented. For a general manufacturing system, a method is proposed for reducing the system Petri net model so that the reduced model does not contain center resources and, hence, has an optimal deadlock avoidance Petri net supervisor. The controlled reduced Petri net model can then be used as the liveness supervisor of the system.

  3. Optimal design of coordination control strategy for distributed generation system

    Institute of Scientific and Technical Information of China (English)

    WANG Ai-hua; Norapon Kanjanapadit

    2009-01-01

This paper presents a novel design procedure for optimizing the power distribution strategy in a distributed generation system. A coordinating controller, responsible for distributing the total load power request among multiple DG units, is suggested based on the concept of hierarchical control structure in dynamic systems. The optimal control problem was formulated as a nonlinear optimization problem subject to a set of constraints. The resulting problem was solved using the Kuhn-Tucker method. Computer simulation results demonstrate that the proposed method can provide better efficiency in terms of reducing total costs compared to existing methods. In addition, the proposed optimal load distribution strategy can be easily implemented in real time thanks to the simplicity of the closed-form solutions.
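The kind of closed-form solution the Kuhn-Tucker conditions yield can be sketched for the textbook case of quadratic generation costs C_i(P) = a_i P^2 + b_i P under a single power-balance constraint sum(P_i) = D, with unit limits ignored: at the optimum all units run at an equal marginal cost lambda. The cost coefficients and demand below are made up, and the paper's actual cost model may differ.

```python
def dispatch(a, b, demand):
    """Equal-incremental-cost dispatch from the KKT stationarity conditions.
    Stationarity: 2 a_i P_i + b_i = lambda  =>  P_i = (lambda - b_i) / (2 a_i);
    the balance constraint sum(P_i) = demand then fixes lambda in closed form."""
    inv = [1.0 / (2.0 * ai) for ai in a]
    lam = (demand + sum(bi * wi for bi, wi in zip(b, inv))) / sum(inv)
    return lam, [(lam - bi) / (2.0 * ai) for ai, bi in zip(a, b)]

a, b = [0.01, 0.02], [2.0, 1.0]               # hypothetical cost coefficients
lam, P = dispatch(a, b, demand=300.0)

balance = sum(P)                               # should equal the demand
marginals = [2 * ai * pi + bi for ai, pi, bi in zip(a, P, b)]  # all equal lambda
```

Inequality constraints (unit minimum/maximum output) add complementary-slackness cases to this picture but keep the solution piecewise closed-form, which is what makes real-time implementation cheap.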

  4. Block-Coordinate Frank-Wolfe Optimization for Structural SVMs

    OpenAIRE

    Lacoste-Julien, Simon; Jaggi, Martin; Schmidt, Mark; Pletscher, Patrick

    2012-01-01

    We propose a randomized block-coordinate variant of the classic Frank-Wolfe algorithm for convex optimization with block-separable constraints. Despite its lower iteration cost, we show that it achieves a similar convergence rate in duality gap as the full Frank-Wolfe algorithm. We also show that, when applied to the dual structural support vector machine (SVM) objective, this yields an online algorithm that has the same low iteration complexity as primal stochastic subgradient methods. Howev...
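The classic (full) Frank-Wolfe step that the block-coordinate variant applies per block is easy to sketch: over a probability simplex, the linear subproblem is solved exactly by the vertex with the smallest gradient coordinate, so each iterate is a convex combination of vertices. The toy quadratic objective below stands in for the structural SVM dual, which this sketch does not implement.

```python
def frank_wolfe(grad, dim, iters=200):
    """Minimize a smooth convex f over the probability simplex via Frank-Wolfe."""
    x = [1.0 / dim] * dim                    # start at the simplex center
    for k in range(iters):
        g = grad(x)
        s = min(range(dim), key=lambda i: g[i])  # LP over the simplex -> a vertex
        gamma = 2.0 / (k + 2.0)                  # standard diminishing step size
        x = [(1 - gamma) * xi for xi in x]       # move toward vertex e_s
        x[s] += gamma
    return x

target = [0.7, 0.2, 0.1]                     # lies in the simplex, so f* = 0
grad = lambda x: [2 * (xi - ti) for xi, ti in zip(x, target)]  # f(x) = ||x - t||^2
x = frank_wolfe(grad, 3)
err = sum((xi - ti) ** 2 for xi, ti in zip(x, target))
# err shrinks at the O(1/k) rate characteristic of Frank-Wolfe
```

The paper's block-coordinate variant performs this step on one randomly chosen block of coordinates at a time, keeping the per-iteration cost low while preserving the O(1/k) duality-gap rate.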

  5. Coordinated parallelizing compiler optimizations and high-level synthesis

    OpenAIRE

    S Gupta; Gupta, R. K.; Dutt, N D; Nicolau, A.

    2004-01-01

    We present a high-level synthesis methodology that applies a coordinated set of coarse-grain and fine-grain parallelizing transformations. The transformations are applied both during a presynthesis phase and during scheduling, with the objective of optimizing the results of synthesis and reducing the impact of control flow constructs on the quality of results. We first apply a set of source level presynthesis transformations that include common sub-expression elimination (CSE), copy propagati...

  6. Optimal Protection Coordination for Microgrid under Different Operating Modes

    Directory of Open Access Journals (Sweden)

    Ming-Ta Yang

    2013-01-01

    Full Text Available Significant consequences result when a microgrid is connected to a distribution system. This study discusses the impacts of bolted three-phase faults and bolted single line-to-ground faults on the protection coordination of a distribution system connected to a microgrid operating in utility-only mode or in grid-connected mode. Power system simulation software is used to build the test system. The linear programming method is applied to optimize the coordination of relays, and relay coordination simulation software is used to verify whether the coordination time intervals (CTIs) of the primary/backup relay pairs are adequate. In addition, this study also proposes a relay protection coordination strategy for when the microgrid operates in islanding mode during a utility power outage. Because conventional CO/LCO relays are not capable of detecting high impedance faults, an intelligent electronic device (IED) combining the wavelet transform and a neural network is proposed to accurately detect high impedance faults and identify the faulted phase.

  7. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  8. A New Hybrid Nelder-Mead Particle Swarm Optimization for Coordination Optimization of Directional Overcurrent Relays

    Directory of Open Access Journals (Sweden)

    An Liu

    2012-01-01

    Full Text Available Coordination optimization of directional overcurrent relays (DOCRs is an important part of an efficient distribution system. This optimization problem involves obtaining the time dial setting (TDS and pickup current (Ip values of each DOCR. The optimal results should have the shortest primary relay operating time for all fault lines. Recently, the particle swarm optimization (PSO algorithm has been considered an effective tool for linear/nonlinear optimization problems with application in the protection and coordination of power systems. With a limited runtime period, the conventional PSO considers the optimal solution as the final solution, and an early convergence of PSO results in decreased overall performance and an increase in the risk of mistaking local optima for global optima. Therefore, this study proposes a new hybrid Nelder-Mead simplex search method and particle swarm optimization (proposed NM-PSO algorithm to solve the DOCR coordination optimization problem. PSO is the main optimizer, and the Nelder-Mead simplex search method is used to improve the efficiency of PSO due to its potential for rapid convergence. To validate the proposal, this study compared the performance of the proposed algorithm with that of PSO and original NM-PSO. The findings demonstrate the outstanding performance of the proposed NM-PSO in terms of computation speed, rate of convergence, and feasibility.
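The DOCR objective the abstract describes can be illustrated with a toy PSO. The sketch below optimizes the TDS values of a single primary/backup relay pair under the standard IEC inverse-time characteristic, with a quadratic penalty for the coordination time interval (CTI) constraint. The fault-current multiples, CTI of 0.3 s, bounds and PSO settings are invented for illustration; this is plain PSO, not the paper's NM-PSO hybrid.

```python
import random

def op_time(tds, m):
    """IEC standard-inverse characteristic: t = TDS * 0.14 / (M**0.02 - 1)."""
    return tds * 0.14 / (m ** 0.02 - 1.0)

def pso_docr(m_primary=10.0, m_backup=5.0, cti=0.3, swarm=30, iters=200, seed=1):
    """Toy PSO over the two TDS values in [0.05, 1.0]."""
    rng = random.Random(seed)
    lo, hi = 0.05, 1.0

    def cost(x):
        tp, tb = op_time(x[0], m_primary), op_time(x[1], m_backup)
        gap = tb - tp - cti                       # CTI margin of the relay pair
        return tp + tb + (1000.0 * gap * gap if gap < 0 else 0.0)

    pos = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(swarm)]
    vel = [[0.0, 0.0] for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i in range(swarm):
            for j in range(2):
                vel[i][j] = (0.7 * vel[i][j]
                             + 1.5 * rng.random() * (pbest[i][j] - pos[i][j])
                             + 1.5 * rng.random() * (gbest[j] - pos[i][j]))
                pos[i][j] = min(hi, max(lo, pos[i][j] + vel[i][j]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest

best = pso_docr()
```

In a real coordination study the decision vector would also include the pickup currents and span every relay pair for every fault line, but the structure of the fitness function is the same.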

  9. Optimizing wireless LAN for longwall coal mine automation

    Energy Technology Data Exchange (ETDEWEB)

    Hargrave, C.O.; Ralston, J.C.; Hainsworth, D.W. [Exploration & Mining Commonwealth Science & Industrial Research Organisation, Pullenvale, Qld. (Australia)

    2007-01-15

    A significant development in underground longwall coal mining automation has been achieved with the successful implementation of wireless LAN (WLAN) technology for communication on a longwall shearer. Wireless Fidelity (Wi-Fi) was selected to meet the bandwidth requirements of the underground data network, and several configurations were installed on operating longwalls to evaluate their performance. Although these efforts demonstrated the feasibility of using WLAN technology in longwall operation, it was clear that new research and development was required in order to establish optimal full-face coverage. By undertaking an accurate characterization of the target environment, it has been possible to achieve great improvements in WLAN performance over a nominal Wi-Fi installation. This paper discusses the impact of Fresnel zone obstructions and multipath effects on radio frequency propagation and reports an optimal antenna and system configuration. Many of the lessons learned in the longwall case are immediately applicable to other underground mining operations, particularly wherever there is a high degree of obstruction from mining equipment.

  10. Knowledge Network Driven Coordination and Robust Optimization to Support Concurrent and Collaborative Parameter Design

    OpenAIRE

    Hu, Jie; Peng, Yinghong; Xiong, Guangleng

    2007-01-01

    This study presents a parameter coordination and robust optimization approach based on knowledge network modeling. The method allows multidisciplinary designers to coordinate and optimize parameters by synthesizing multidisciplinary knowledge. First, a knowledge network model is established, including design knowledge from assembly, manufacture, performance, and simulation. Second, the parameter coordination method is presented to solve the knowledge network model,...

  11. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    Science.gov (United States)

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
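A two-level fractional factorial of the kind used in such DOE studies can be generated mechanically: build a full factorial on the base factors and derive each extra factor from a generator. The factor count and the generator below are illustrative (a 2^(5-1) half fraction with E = ABCD), not the paper's assay variables.

```python
from itertools import product

def fractional_factorial(k, generators):
    """Two-level fractional factorial design sketch.

    Builds a 2**k full factorial on the base factors, then derives each
    extra factor as the product of the signed base columns named in its
    generator (e.g. "ABCD" means E = A*B*C*D). Levels are coded -1/+1.
    """
    base = list(product([-1, 1], repeat=k))
    runs = []
    for row in base:
        derived = []
        for gen in generators:
            level = 1
            for ch in gen:
                level *= row[ord(ch) - ord("A")]
            derived.append(level)
        runs.append(tuple(row) + tuple(derived))
    return runs

# 2**(5-1) half fraction: four base factors plus E = ABCD.
design = fractional_factorial(4, ["ABCD"])
```

Five factors are covered in 16 runs instead of 32, which is the time saving the abstract refers to; the trade-off is that some high-order interactions become aliased with main effects or two-factor interactions.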

  12. Application of Advanced Particle Swarm Optimization Techniques to Wind-thermal Coordination

    DEFF Research Database (Denmark)

    Singh, Sri Niwas; Østergaard, Jacob; Yadagiri, J.

    2009-01-01

    wind-thermal coordination algorithm is necessary to determine the optimal proportion of wind and thermal generator capacity that can be integrated into the system. In this paper, four versions of Particle Swarm Optimization (PSO) techniques are proposed for solving wind-thermal coordination problem...

  13. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    In recent years MEMS saw a very rapid development. Although many advances have been reached, due to the multiphysics nature of MEMS, their design is still a difficult task carried on mainly by hand calculation. In order to help to overcome such difficulties, attempts to automate MEMS design were carried out. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and the Process Level. The state of the art of automated MEMS design in each of these levels is investigated.

  14. Collective Tuning Initiative: automating and accelerating development and optimization of computing systems

    OpenAIRE

    Fursin, Grigori

    2009-01-01

    Computing systems rarely deliver best possible performance due to ever-increasing hardware and software complexity and limitations of the current optimization technology. Additional code and architecture optimizations are often required to improve execution time, size, power consumption, reliability and other important characteristics of computing systems. However, it is often a tedious, repetitive, isolated and time-consuming process. In order to automate, simplify ...

  15. A New View on Geometry Optimization: the Quasi-Independent Curvilinear Coordinate Approximation

    OpenAIRE

    Németh, Károly; Challacombe, Matt

    2004-01-01

    This article presents a new and efficient alternative to well established algorithms for molecular geometry optimization. The new approach exploits the approximate decoupling of molecular energetics in a curvilinear internal coordinate system, allowing separation of the 3N-dimensional optimization problem into an O(N) set of quasi-independent one-dimensional problems. Each uncoupled optimization is developed by a weighted least squares fit of energy gradients in the internal coordinate system...
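The idea of decoupling a high-dimensional optimization into quasi-independent one-dimensional problems can be sketched in its simplest form: a per-coordinate Newton step with finite-difference derivatives. This is a heavily simplified analogue for illustration only; the paper's actual method works in curvilinear internal coordinates with weighted least squares fits of the gradients.

```python
def coordinate_newton(f, x, h=1e-5, sweeps=20):
    """Minimize f by treating each coordinate as an independent 1-D problem:
    every coordinate takes a Newton step from finite-difference gradient
    and curvature, sweeping cyclically over the coordinates."""
    x = list(x)
    for _ in range(sweeps):
        for i in range(len(x)):
            xp, xm = x[:], x[:]
            xp[i] += h
            xm[i] -= h
            g = (f(xp) - f(xm)) / (2.0 * h)          # df/dx_i
            c = (f(xp) - 2.0 * f(x) + f(xm)) / (h * h)  # d2f/dx_i^2
            if c > 0:                                # only step on convex slices
                x[i] -= g / c
    return x

# A mildly coupled quadratic: the cross term is what per-coordinate
# decoupling has to iterate away.
def quad(p):
    return (p[0] - 1.0) ** 2 + 2.0 * (p[1] + 0.5) ** 2 + 0.5 * p[0] * p[1]

xmin = coordinate_newton(quad, [0.0, 0.0])
```

When the chosen coordinate system makes the couplings weak, as the abstract argues curvilinear internals do for molecular energetics, very few sweeps are needed and the cost per sweep is O(N).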

  16. Optimizing Patient-centered Communication and Multidisciplinary Care Coordination in Emergency Diagnostic Imaging: A Research Agenda.

    Science.gov (United States)

    Sabbatini, Amber K; Merck, Lisa H; Froemming, Adam T; Vaughan, William; Brown, Michael D; Hess, Erik P; Applegate, Kimberly E; Comfere, Nneka I

    2015-12-01

    Patient-centered emergency diagnostic imaging relies on efficient communication and multispecialty care coordination to ensure optimal imaging utilization. The construct of the emergency diagnostic imaging care coordination cycle with three main phases (pretest, test, and posttest) provides a useful framework to evaluate care coordination in patient-centered emergency diagnostic imaging. This article summarizes findings reached during the patient-centered outcomes session of the 2015 Academic Emergency Medicine consensus conference "Diagnostic Imaging in the Emergency Department: A Research Agenda to Optimize Utilization." The primary objective was to develop a research agenda focused on 1) defining component parts of the emergency diagnostic imaging care coordination process, 2) identifying gaps in communication that affect emergency diagnostic imaging, and 3) defining optimal methods of communication and multidisciplinary care coordination that ensure patient-centered emergency diagnostic imaging. Prioritized research questions provided the framework to define a research agenda for multidisciplinary care coordination in emergency diagnostic imaging.

  17. Toward an Integrated Framework for Automated Development and Optimization of Online Advertising Campaigns

    OpenAIRE

    Thomaidou, Stamatina; Vazirgiannis, Michalis; Liakopoulos, Kyriakos

    2012-01-01

    Creating and monitoring competitive and cost-effective pay-per-click advertisement campaigns through the web-search channel is a resource demanding task in terms of expertise and effort. Assisting or even automating the work of an advertising specialist will have an unrivaled commercial value. In this paper we propose a methodology, an architecture, and a fully functional framework for semi- and fully- automated creation, monitoring, and optimization of cost-efficient pay-per-click campaigns ...

  18. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

    Full Text Available This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy by minimizing the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy clearly saves procurement costs; that the strategy is more cooperative when the quality requirement is less strict; and that the coordination costs have a strong effect on the coordinated procurement strategy.
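A generic simulated-annealing loop of the kind used to solve such a single-objective model looks as follows. The supplier-assignment move and the tiny cost table in the usage example are invented stand-ins, not the paper's blending model.

```python
import math
import random

def anneal(cost, neighbor, x0, t0=10.0, cooling=0.995, steps=4000, seed=2):
    """Generic simulated-annealing skeleton with geometric cooling."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy instance: assign each of 4 subsidiaries one of 3 suppliers,
# minimizing a (hypothetical) procurement cost table.
costs = [[4, 2, 7], [3, 5, 1], [6, 2, 2], [5, 4, 3]]

def total(assign):
    return sum(costs[i][s] for i, s in enumerate(assign))

def move(assign, rng):
    a = list(assign)
    a[rng.randrange(4)] = rng.randrange(3)  # reassign one subsidiary
    return tuple(a)

best, f = anneal(total, move, (0, 0, 0, 0))
```

The weighted-sum transformation of the two objectives (quality satisfaction and logistics cost) would simply replace `total` here; the annealing loop itself is unchanged.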

  19. Optimal Coordination of Automatic Line Switches for Distribution Systems

    OpenAIRE

    Jyh-Cherng Gu; Ming-Ta Yang

    2012-01-01

    For the Taiwan Power Company (Taipower), the margins of coordination times between the lateral circuit breakers (LCB) of underground 4-way automatic line switches and the protection equipment of high voltage customers are often too small. This could lead to sympathy tripping by the feeder circuit breaker (FCB) of the distribution feeder and create difficulties in protection coordination between upstream and downstream protection equipment, identification of faults, and restoration operations....

  20. Optimal Coordination of Directional Overcurrent Relays Using PSO-TVAC Considering Series Compensation

    Directory of Open Access Journals (Sweden)

    Nabil Mancer

    2015-01-01

    Full Text Available The integration of system compensation such as a Series Compensator (SC) into the transmission line makes the coordination of directional overcurrent relays in a practical power system important and complex. This article presents an efficient variant of the Particle Swarm Optimization (PSO) algorithm based on Time-Varying Acceleration Coefficients (PSO-TVAC) for optimal coordination of directional overcurrent relays (DOCRs) considering the integration of series compensation. Simulation results are compared to other methods to confirm the efficiency of the proposed PSO variant in solving the optimal coordination of directional overcurrent relays in the presence of series compensation.

  1. Lyapunov-based Low-thrust Optimal Orbit Transfer: An approach in Cartesian coordinates

    CERN Document Server

    Zhang, Hantian; Cao, Qingjie

    2014-01-01

    This paper presents a simple approach to low-thrust optimal-fuel and optimal-time transfer problems between two elliptic orbits using the Cartesian coordinates system. In this case, an orbit is described by its specific angular momentum and Laplace vectors with a free injection point. Trajectory optimization with the pseudospectral method and nonlinear programming are supported by the initial guess generated from the Chang-Chichka-Marsden Lyapunov-based transfer controller. This approach successfully solves several low-thrust optimal problems. Numerical results show that the Lyapunov-based initial guess overcomes the difficulty in optimization caused by the strong oscillation of variables in the Cartesian coordinates system. Furthermore, a comparison of the results shows that obtaining the optimal transfer solution through the polynomial approximation by utilizing Cartesian coordinates is easier than using orbital elements, which normally produce strongly nonlinear equations of motion. In this paper, the Eart...

  2. A PLM-based automated inspection planning system for coordinate measuring machine

    Science.gov (United States)

    Zhao, Haibin; Wang, Junying; Wang, Boxiong; Wang, Jianmei; Chen, Huacheng

    2006-11-01

    With the rapid progress of Product Lifecycle Management (PLM) in the manufacturing industry, automatic generation of product inspection plans and their integration with other activities in the product lifecycle play important roles in quality control, but the techniques for these purposes lag behind those of CAD/CAM. Therefore, an automatic inspection planning system for Coordinate Measuring Machines (CMMs) was developed to improve the automation of measurement, based on the integration of the inspection system in PLM. Feature information representation is achieved based on a PLM central database; the measuring strategy is optimized through the integration of multiple sensors; a reasonable number and distribution of inspection points are calculated and designed under the guidance of statistical theory and a synthesis distribution algorithm; and a collision avoidance method is proposed to generate collision-free inspection paths with high efficiency. Information mapping is performed between Neutral Interchange Files (NIFs), such as STEP, DML, DMIS and XML, to realize information integration with other activities in the product lifecycle such as design, manufacturing and inspection execution. Simulation was carried out to demonstrate the feasibility of the proposed system. As a result, the inspection process becomes simpler and good results can be obtained based on the integration in PLM.

  3. TO THE QUESTION OF SOLVING OF THE PROBLEM OF OPTIMIZING PARAMETERS OF TRAFFIC FLOW COORDINATED CONTROL

    OpenAIRE

    L. Abramova; Chernobaev, N.

    2007-01-01

    A short review of the main methods of traffic flow control is presented, with particular attention paid to methods of coordinated control and the quality characteristics of traffic control. The problem of optimizing the parameters of coordinated traffic control, on the basis of minimizing vehicle delays at highway intersections, is defined.

  4. apsis - Framework for Automated Optimization of Machine Learning Hyper Parameters

    OpenAIRE

    Diehl, Frederik; Jauch, Andreas

    2015-01-01

    The apsis toolkit presented in this paper provides a flexible framework for hyperparameter optimization and includes both random search and a Bayesian optimizer. It is implemented in Python and its architecture features adaptability to any desired machine learning code. It can easily be used with common Python ML frameworks such as scikit-learn. Published under the MIT License, other researchers are heavily encouraged to check out the code, contribute or raise any suggestions. The code can be ...

  5. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    Science.gov (United States)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing-type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient-based nonlinear programming techniques to search for improved designs. For these techniques to be practical, a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization applications, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  6. COORDINATION MECHANISM COMBINING SUPPLY CHAIN OPTIMIZATION AND RULE IN EXCHANGE

    OpenAIRE

    JINSHI ZHAO; JIAZHEN HUO

    2013-01-01

    There are two kinds of option pricing. Option pricing in exchanges follows the Black–Scholes rule but does not consider supply chain optimization. The traditional supply chain option contract can optimize the supply chain but does not obey the Black–Scholes rule. We integrate the assumptions of these two kinds of option pricing and design a model that combines the Black–Scholes rule with the traditional optimizing option contract in a supplier-led supply chain. Our combined model can guide the...
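The exchange pricing rule the abstract refers to is the standard Black–Scholes formula for a European call, which any combined contract model must reproduce. The sketch below uses the textbook formula with illustrative parameters (it is not the paper's supply chain model).

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black–Scholes European call price.

    s: spot price, k: strike, t: time to expiry (years),
    r: risk-free rate, sigma: volatility.
    """
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

price = bs_call(s=100.0, k=100.0, t=1.0, r=0.05, sigma=0.2)
```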

  7. Optimization of composite carriage for a coordinate measurement machine

    OpenAIRE

    Lombardi, Marco

    1994-01-01

    The growing need for high quality and reliability of products requires the control of the accuracy of dimensions and shape of product components. Coordinate Measurement Machines (CMM) are now able to measure the dimensions and/or the shape of objects with submicron precision. The desire for high-speed measurement, has stimulated the interest of CMM manufacturers in the use of composite materials for the structure of their machines. Composites are lighter than conventional mater...

  8. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    OpenAIRE

    Suleimanov, Yury V.; Green, William H.

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation single- and double-ended transition-state optimization algorithms - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not on...

  9. Automated Optimization of Walking Parameters for the Nao Humanoid Robot

    NARCIS (Netherlands)

    N. Girardi; C. Kooijman; A.J. Wiggers; A. Visser

    2013-01-01

    This paper describes a framework for optimizing walking parameters for a Nao humanoid robot. In this case an omnidirectional walk is learned. The parameters are learned in simulation with an evolutionary approach. The best performance was obtained for a combination of a low mutation rate and a high crossover rate.

  10. Automated Optimization of Walking Parameters for the Nao Humanoid Robot

    OpenAIRE

    Girardi, N.; Kooijman, C.; Wiggers, A.J.; de Visser, A.

    2013-01-01

    This paper describes a framework for optimizing walking parameters for a Nao humanoid robot. In this case an omnidirectional walk is learned. The parameters are learned in simulation with an evolutionary approach. The best performance was obtained for a combination of a low mutation rate and a high crossover rate.

  11. A sensitivity-based coordination method for optimization of product families

    Science.gov (United States)

    Zou, Jun; Yao, Wei-Xing; Xia, Tian-Xiang

    2016-07-01

    This article provides an introduction to a decomposition-based method for the optimization of product families with predefined platforms. To improve the efficiency of the system coordinator, a new sensitivity-based coordination method (SCM) is proposed. The key idea in SCM is that the system level coordinates the shared variables by using sensitivity information to make trade-offs between the product subsystems. The coordinated shared variables are determined by minimizing the performance deviation with respect to the optimal design of the subproblems and the constraint violation incurred by sharing. Each subproblem has a significant degree of independence and can be solved in a simultaneous way. The numerical performance of SCM is investigated, and the results suggest that the new approach is robust and leads to a substantial reduction in computational effort compared with the analytical target cascading method. Then, the proposed methodology is applied to the structural optimization of a family of automotive body side-frames.

  12. Advanced Coordinating Control System for Power Plant

    Institute of Scientific and Technical Information of China (English)

    WU Peng; WEI Shuangying

    2006-01-01

    The coordinated control system is widely used in power plants. This paper describes advanced coordinated control in terms of control methods and optimal operation, and introduces their principles and features using examples from power plant operation. This is valuable for applying automation to optimal power plant operation.

  13. Optimal number of stimulation contacts for coordinated reset neuromodulation

    Directory of Open Access Journals (Sweden)

    Borys Lysyansky

    2013-07-01

    Full Text Available In this computational study we investigate coordinated reset (CR) neuromodulation designed for an effective control of synchronization by multi-site stimulation of neuronal target populations. This method was suggested to effectively counteract pathological neuronal synchrony characteristic of several neurological disorders. We study how many stimulation sites are required for optimal CR-induced desynchronization. We found that a moderate increase of the number of stimulation sites may significantly prolong the post-stimulation desynchronized transient after the stimulation is completely switched off. This can, in turn, reduce the amount of the administered stimulation current for the intermittent ON-OFF CR stimulation protocol, where time intervals with stimulation ON are recurrently followed by time intervals with stimulation OFF. In addition, we found that the optimal number of stimulation sites essentially depends on how strongly the administered current decays within the neuronal tissue with increasing distance from the stimulation site. In particular, for a broad spatial stimulation profile, i.e., for a weak spatial decay rate of the stimulation current, CR stimulation can optimally be delivered via a small number of stimulation sites. Our findings may contribute to an optimization of therapeutic applications of CR neuromodulation.

  14. Novel Particle Swarm Optimization and Its Application in Calibrating the Underwater Transponder Coordinates

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A novel improved particle swarm algorithm named competition particle swarm optimization (CPSO) is proposed to calibrate underwater transponder coordinates. To improve the performance of the algorithm, the TVAC scheme is introduced into CPSO to present an extended competition particle swarm optimization (ECPSO). The proposed method is tested on a set of 10 standard optimization benchmark problems and the results are compared with those obtained through existing PSO algorithms: basic particle swarm optimization (BPSO), linear decreasing inertia weight particle swarm optimization (LWPSO), exponential inertia weight particle swarm optimization (EPSO), and time-varying acceleration coefficient (TVAC) PSO. The results demonstrate that CPSO and ECPSO manifest faster searching speed, accuracy, and stability. The searching performance of ECPSO on multimodal functions is superior to that of CPSO. Finally, calibration of the underwater transponder coordinates is presented using the particle swarm algorithm, and the novel improved particle swarm algorithm shows better performance than the other algorithms.

  15. Multi-objective intelligent coordinating optimization blending system based on qualitative and quantitative synthetic model

    Institute of Scientific and Technical Information of China (English)

    WANG Ya-lin; MA Jie; GUI Wei-hua; YANG Chun-hua; ZHANG Chuan-fu

    2006-01-01

    A multi-objective intelligent coordinating optimization strategy based on a qualitative and quantitative synthetic model for the Pb-Zn sintering blending process was proposed to obtain the optimal mixture ratio. Mechanism and neural network quantitative models for predicting compositions, and rule models for expert reasoning, were constructed based on statistical data and empirical knowledge. An expert reasoning method based on these models was proposed to solve the blending optimization problem, including multi-objective optimization for the first blending process and area optimization for the second blending process, and to determine the optimal mixture ratio meeting the requirement of intelligent coordination. The results show that the qualified rates of agglomerate Pb, Zn and S compositions are increased by 7.1%, 6.5% and 6.9%, respectively, and the fluctuation of sintering permeability is reduced by 7.0%, which effectively stabilizes the agglomerate compositions and the permeability.

  16. Axon Membrane Skeleton Structure is Optimized for Coordinated Sodium Propagation

    CERN Document Server

    Zhang, Yihao; Li, He; Tzingounis, Anastasios V; Lykotrafitis, George

    2016-01-01

    Axons transmit action potentials with high fidelity and minimal jitter. This unique capability is likely the result of the spatiotemporal arrangement of sodium channels along the axon. Super-resolution microscopy recently revealed that the axon membrane skeleton is structured as a series of actin rings connected by spectrin filaments that are held under entropic tension. Sodium channels also exhibit a periodic distribution pattern, as they bind to ankyrin G, which associates with spectrin. Here, we elucidate the relationship between the axon membrane skeleton structure and the function of the axon. By combining cytoskeletal dynamics and continuum diffusion modeling, we show that spectrin filaments under tension minimize the thermal fluctuations of sodium channels and prevent overlap of neighboring channel trajectories. Importantly, this axon skeletal arrangement allows for a highly reproducible band-like activation of sodium channels leading to coordinated sodium propagation along the axon.

  17. Temporal mammogram image registration using optimized curvilinear coordinates.

    Science.gov (United States)

    Abdel-Nasser, Mohamed; Moreno, Antonio; Puig, Domenec

    2016-04-01

    Registration of mammograms plays an important role in breast cancer computer-aided diagnosis systems. Radiologists usually compare mammogram images in order to detect abnormalities. The comparison of mammograms requires a registration between them. A temporal mammogram registration method is proposed in this paper. It is based on the curvilinear coordinates, which are utilized to cope both with global and local deformations in the breast area. Temporal mammogram pairs are used to validate the proposed method. After registration, the similarity between the mammograms is maximized, and the distance between manually defined landmarks is decreased. In addition, a thorough comparison with the state-of-the-art mammogram registration methods is performed to show its effectiveness.

  18. Hybrid optimal online-overnight charging coordination of plug-in electric vehicles in smart grid

    Science.gov (United States)

    Masoum, Mohammad A. S.; Nabavi, Seyed M. H.

    2016-10-01

    Optimal coordinated charging of plug-in electric vehicles (PEVs) in a smart grid (SG) can be beneficial for both consumers and utilities. This paper proposes a hybrid optimal online followed by overnight charging coordination of high- and low-priority PEVs using discrete particle swarm optimization (DPSO) that considers the benefits of both consumers and electric utilities. The objective functions are online minimization of total cost (associated with grid losses and energy generation) and overnight valley filling through minimization of total load levels. The constraints include substation transformer loading, node voltage regulation and the requested final battery state of charge levels (SOCreq). The main challenge is optimal selection of the overnight starting time (t_optimal-overnight,start) to guarantee charging of all vehicle batteries to the SOCreq levels before the requested plug-out times (t_req), which is done by simultaneously solving the online and overnight objective functions. The online-overnight PEV coordination approach is implemented on a 449-node SG; results are compared for uncoordinated and coordinated battery charging, as well as a modified strategy using cost minimization for both online and overnight coordination. The impact of t_optimal-overnight,start on the performance of the proposed PEV coordination is investigated.
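The overnight valley-filling objective can be illustrated with a simple greedy heuristic: place each unit of requested charging energy in the hour with the lowest current total load. This is an illustrative stand-in for the DPSO coordination in the paper; the load profile and energy request below are invented.

```python
import heapq

def valley_fill(base_load, energy_slots):
    """Greedy valley-filling for overnight PEV charging.

    base_load: per-hour baseline load; energy_slots: total units of
    charging energy to place. Returns per-hour charging allocations.
    """
    # Min-heap keyed by current total load (ties broken by hour index).
    heap = [(load, h) for h, load in enumerate(base_load)]
    heapq.heapify(heap)
    schedule = [0] * len(base_load)
    for _ in range(energy_slots):
        load, h = heapq.heappop(heap)   # hour with the lowest load so far
        schedule[h] += 1
        heapq.heappush(heap, (load + 1, h))
    return schedule

plan = valley_fill([5, 3, 2, 2, 4], energy_slots=6)
```

The greedy rule flattens the load curve one unit at a time; the full coordination problem additionally enforces transformer, voltage and SOCreq deadline constraints, which is why a metaheuristic like DPSO is used instead.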

  19. Micro-simulation Modeling of Coordination of Automated Guided Vehicles at Intersection

    OpenAIRE

    Makarem, Laleh; Pham, Minh Hai; Dumont, André-Gilles; Gillet, Denis

    2012-01-01

    One of the challenging problems with autonomous vehicles is their performance at intersections. This paper shows an alternative control method for the coordination of autonomous vehicles at intersections. The proposed approach is grounded in multi-robot coordination and it also takes into account vehicle dynamics as well as realistic communication constraints. The existing concept of decentralized navigation functions is combined with a sensing model and a crossing strategy is developed. It i...

  20. Automation for pattern library creation and in-design optimization

    Science.gov (United States)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.

  1. Path optimization by a variational reaction coordinate method. II. Improved computational efficiency through internal coordinates and surface interpolation.

    Science.gov (United States)

    Birkholz, Adam B; Schlegel, H Bernhard

    2016-05-14

    Reaction path optimization is being used more frequently as an alternative to the standard practice of locating a transition state and following the path downhill. The Variational Reaction Coordinate (VRC) method was proposed as an alternative to chain-of-states methods like nudged elastic band and string method. The VRC method represents the path using a linear expansion of continuous basis functions, allowing the path to be optimized variationally by updating the expansion coefficients to minimize the line integral of the potential energy gradient norm, referred to as the Variational Reaction Energy (VRE) of the path. When constraints are used to control the spacing of basis functions and to couple the minimization of the VRE with the optimization of one or more individual points along the path (representing transition states and intermediates), an approximate path as well as the converged geometries of transition states and intermediates along the path are determined in only a few iterations. This algorithmic efficiency comes at a high per-iteration cost due to numerical integration of the VRE derivatives. In the present work, methods for incorporating redundant internal coordinates and potential energy surface interpolation into the VRC method are described. With these methods, the per-iteration cost, in terms of the number of potential energy surface evaluations, of the VRC method is reduced while the high algorithmic efficiency is maintained. PMID:27179465

  3. Automated gamma knife radiosurgery treatment planning with image registration, data-mining, and Nelder-Mead simplex optimization

    International Nuclear Information System (INIS)

    Gamma knife treatments are usually planned manually, requiring much expertise and time. We describe a new, fully automatic method of treatment planning. The treatment volume to be planned is first compared with a database of past treatments to find volumes closely matching in size and shape. The treatment parameters of the closest matches are used as starting points for the new treatment plan. Further optimization is performed with the Nelder-Mead simplex method: the coordinates and weights of the isocenters are allowed to vary until a maximally conformal plan specific to the new treatment volume is found. The method was tested on a randomly selected set of 10 acoustic neuromas and 10 meningiomas. Typically, matching a new volume took under 30 seconds. The time for simplex optimization, on a 3 GHz Xeon processor, ranged from under a minute for small volumes to substantially longer for large volumes (>30 000 cubic mm, >20 isocenters). In 8/10 acoustic neuromas and 8/10 meningiomas, the automatic method found plans with a conformation number equal to or better than that of the manual plan. In 4/10 acoustic neuromas and 5/10 meningiomas, both overtreatment and undertreatment ratios were equal to or better in the automated plans. In conclusion, data-mining of past treatments can be used to derive starting parameters for treatment planning. These parameters can then be computer-optimized to give good plans automatically.
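
    The simplex refinement step described above can be sketched with SciPy's Nelder-Mead implementation. The cost function and starting point below are toy stand-ins (a real planner would score dose conformity over the target volume), so all names and numbers are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for a conformity cost: how far a set of weighted isocenters
# deviates from ideal positions and unit weights (all values hypothetical).
TARGET_CENTERS = np.array([[0.0, 0.0], [6.0, 0.0]])  # ideal isocenter sites

def conformity_cost(params):
    # params packs (x, y, weight) for two isocenters
    p = params.reshape(2, 3)
    centers, weights = p[:, :2], p[:, 2]
    # penalize displaced centers and weights far from 1.0
    return np.sum((centers - TARGET_CENTERS) ** 2) + np.sum((weights - 1.0) ** 2)

# Start from a database-matched plan (a perturbed guess), then refine by simplex:
start = np.array([0.5, -0.4, 0.8, 5.3, 0.6, 1.3])
res = minimize(conformity_cost, start, method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-9, "maxiter": 2000})
print(res.x.round(3))  # converges toward the ideal centers and unit weights
```

    The derivative-free simplex search only needs cost evaluations, which is why it suits plan scores that are awkward to differentiate.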

  4. Optimizing Supply Chain Performance in China with Country-Specific Supply Chain Coordination

    OpenAIRE

    Herczeg, András; Vastag, Gyula

    2012-01-01

    The implementation of country-specific supply chain coordination techniques ensures optimal global supply chain performance. This paper looks at the success factors of a supply chain coordination strategy within the global supply chain of a successful, medium-sized, privately-owned company with locations in North America (USA), Europe (Hungary) and Asia (China). Through the example of GSL's Chinese plant, we will endeavor to argue that increased collaboration in the supply network wi...

  5. Integrated Coordinated Optimization Control of Automatic Generation Control and Automatic Voltage Control in Regional Power Grids

    OpenAIRE

    Qiu-Yu Lu; Wei Hu; Le Zheng; Yong Min; Miao Li; Xiao-Ping Li; Wei-Chun Ge; Zhi-Ming Wang

    2012-01-01

    Automatic Generation Control (AGC) and Automatic Voltage Control (AVC) are key approaches to frequency and voltage regulation in power systems. However, based on the assumption of decoupling of active and reactive power control, the existing AGC and AVC systems work independently without any coordination. In this paper, a concept and method of hybrid control is introduced to set up an Integrated Coordinated Optimization Control (ICOC) system for AGC and AVC. Concerning the diversity of contro...

  6. A new comprehensive genetic algorithm method for optimal overcurrent relays coordination

    Energy Technology Data Exchange (ETDEWEB)

    Razavi, Farzad; Abyaneh, Hossein Askarian; Mohammadi, Reza [Department of Electrical Engineering, Amirkabir University of Technology (Iran); Al-Dabbagh, Majid [Hydro Tasmania Consulting (Australia); Torkaman, Hossein [Department of Electrical Engineering, Shahid Beheshti University (Iran)

    2008-04-15

    For optimal coordination of overcurrent relays, linear programming techniques such as simplex, two-phase simplex and dual simplex are used. Another approach to the optimal coordination problem is to use artificial intelligence techniques such as the genetic algorithm (GA). In this paper, a powerful GA-based optimal coordination method is introduced. The objective function (OF) is developed to solve the problems of miscoordination and of continuous or discrete time setting multiplier (TSM) or time dial setting (TDS). In other words, the novelty of the paper is the modification of the existing GA objective function, by introducing a new parameter and adding a new term to the OF, to handle miscoordination problems for both continuous and discrete TSM or TDS. The method is applied to two different power system networks, and the results reveal that the new method is efficient, accurate and flexible. (author)
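
    The modified-objective idea, relay operating times plus a penalty term for miscoordination, searched over a discrete TDS grid, can be illustrated with a minimal GA sketch. The relay constants, CTI and penalty weight below are invented for illustration, not taken from the paper:

```python
import random

random.seed(1)

# Hypothetical primary/backup relay pair (all constants invented).
M = [2.0, 3.0]        # characteristic multipliers at the common fault current
CTI = 0.3             # coordination time interval, seconds
STEP = 0.05           # discrete TDS grid, echoing the discrete-TDS handling
TDS_MIN, TDS_MAX = 0.05, 1.0
ALPHA = 100.0         # miscoordination penalty weight

def op_time(tds, m):
    return tds * m    # simplified inverse-time characteristic

def objective(tds):
    t1, t2 = op_time(tds[0], M[0]), op_time(tds[1], M[1])
    violation = max(0.0, CTI - (t2 - t1))   # backup must lag primary by CTI
    return t1 + t2 + ALPHA * violation ** 2

def random_tds():
    n = int((TDS_MAX - TDS_MIN) / STEP)
    return TDS_MIN + STEP * random.randint(0, n)

def ga(pop_size=40, gens=60):
    pop = [[random_tds(), random_tds()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective)                 # rank by fitness
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            if random.random() < 0.2:                            # grid-step mutation
                i = random.randrange(2)
                child[i] = min(TDS_MAX, max(TDS_MIN,
                               child[i] + random.choice([-STEP, STEP])))
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

best = ga()
print(best, objective(best))
```

    The penalty term plays the role of the added OF term: candidate settings that break the coordination margin are priced out rather than forbidden outright.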

  7. Analytical study on coordinative optimization of convection in tubes with variable heat flux

    Institute of Scientific and Technical Information of China (English)

    YUAN Zhongxian; ZHANG Jianguo; JIANG Mingjian

    2004-01-01

    The laminar heat transfer in the thermal entrance region in round tubes, which has a variable surface heat flux boundary condition, is analytically studied. The results show that the heat transfer coefficient is closely related to the wall temperature gradient along the tube axis. The greater the gradient, the higher the heat transfer rate. Furthermore, the coordination of the velocity and the temperature gradient fields is also analysed under different surface heat fluxes. The validity of the field coordination principle is verified by checking the correlation of heat transfer coefficient and the coordination degree. The results also demonstrate that optimizing the thermal boundary condition is a way to enhance heat transfer.

  8. Constraints Adjustment and Objectives Coordination of Satisfying Optimal Control Applied to Heavy Oil Fractionators

    Institute of Scientific and Technical Information of China (English)

    邹涛; 李少远

    2005-01-01

    In this paper, the feasibility and objectives coordination of real-time optimization (RTO) are systematically investigated under soft constraints. Soft constraint adjustment and objective relaxation are required simultaneously because the result is unsatisfactory when the feasible region is far from the desired working point or when the optimization problem is infeasible. The mixed logic method is introduced to describe the priority of the constraints and objectives, so that soft constraint adjustment and objectives coordination are solved together in RTO. A case study on the Shell heavy oil fractionator benchmark problem illustrating the method is finally presented.

  9. Geometry Optimization of Crystals by the Quasi-Independent Curvilinear Coordinate Approximation

    CERN Document Server

    Németh, K

    2005-01-01

    The quasi-independent curvilinear coordinate approximation (QUICCA) method [K. Németh and M. Challacombe, J. Chem. Phys. 121, 2877 (2004)] is extended to the optimization of crystal structures. We demonstrate that QUICCA is valid under periodic boundary conditions, enabling simultaneous relaxation of the lattice and atomic coordinates, as illustrated by tight optimization of polyethylene, hexagonal boron-nitride, a (10,0) carbon-nanotube, hexagonal ice, quartz and sulfur at the Γ-point RPBE/STO-3G level of theory.

  10. Dynamic Coordinated Shifting Control of Automated Mechanical Transmissions without a Clutch in a Plug-In Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Xinlei Liu

    2012-08-01

    On the basis of the shifting process of automated mechanical transmissions (AMTs) in traditional hybrid electric vehicles (HEVs), and by exploiting the fast response of electric machines, the dynamic model of the hybrid electric AMT vehicle powertrain is built, the dynamic characteristics of each phase of the shifting process are analyzed, and a control strategy is proposed in which the torque and speed of the engine and electric machine are coordinately controlled to achieve clutchless AMT shifting for a plug-in hybrid electric vehicle (PHEV). In the shifting process, the engine and electric machine are well controlled, and the shift jerk, power interruption and restoration time are reduced. Simulation and real car test results show that the proposed control strategy can efficiently improve the shift quality of PHEVs equipped with AMTs.

  11. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation must be considered together to determine the appropriate level of automation, yet existing estimation concepts are limited in that they do not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested that accounts for both the positive and negative effects of automation, including its effects on human operators.

  12. Optimization of Fuse-Recloser Coordination and Dispersed Generation Capacity in Distribution Systems

    OpenAIRE

    Morteza Nojavan; Heresh Seyedi,; Kazem Zare; Arash Mahari

    2014-01-01

    In this paper, a novel protection coordination optimization algorithm is proposed. The targets are to maximize dispersed generation (DG) penetration while minimizing fuse operating times. A novel optimization technique, the Imperialistic Competition Algorithm (ICA), is applied to solve the problem. The results of simulations confirm that the proposed method leads to lower operating times of protective devices and higher possible DG penetration, compared with the tra...

  13. Applying Hybrid PSO to Optimize Directional Overcurrent Relay Coordination in Variable Network Topologies

    OpenAIRE

    Ming-Ta Yang; An Liu

    2013-01-01

    In power systems, determining the values of time dial setting (TDS) and the plug setting (PS) for directional overcurrent relays (DOCRs) is an extremely constrained optimization problem that has been previously described and solved as a nonlinear programming problem. Optimization coordination problems of near-end faults and far-end faults occurring simultaneously in circuits with various topologies, including fixed and variable network topologies, are considered in this study. The aim of thi...

  14. Optimal Stochastic Coordinated Beamforming for Wireless Cooperative Networks with CSI Uncertainty

    OpenAIRE

    Shi, Yuanming; Zhang, Jun; Letaief, Khaled B.

    2013-01-01

    Transmit optimization and resource allocation for wireless cooperative networks with channel state information (CSI) uncertainty are important but challenging problems in terms of both the uncertainty modeling and performance optimization. In this paper, we establish a generic stochastic coordinated beamforming (SCB) framework that provides flexibility in the channel uncertainty modeling, while guaranteeing optimality in the transmission strategies. We adopt a general stochastic model for...

  15. Growing string method with interpolation and optimization in internal coordinates: method and examples.

    Science.gov (United States)

    Zimmerman, Paul M

    2013-05-14

    The growing string method (GSM) has proven especially useful for locating chemical reaction paths at low computational cost. While many string methods use Cartesian coordinates, these methods can be substantially improved by changes in the coordinate system used for interpolation and optimization steps. The quality of the interpolation scheme is especially important because it determines how close the initial path is to the optimized reaction path, and this strongly affects the rate of convergence. In this article, a detailed description of the generation of internal coordinates (ICs) suitable for use in GSM as reactive tangents and in string optimization is given. Convergence of reaction paths is smooth because the IC tangent and orthogonal directions are better representations of chemical bonding compared to Cartesian coordinates. This is not only important quantitatively for reducing computational cost but also allows reaction paths to be described with smoothly varying chemically relevant coordinates. Benchmark computations with challenging reactions are compared to previous versions of GSM and show significant speedups. Finally, a climbing image scheme is included to improve the quality of the transition state approximation, ensuring high reliability of the method.
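
    The advantage of interpolating in internal rather than Cartesian coordinates can be seen in a tiny triatomic example: averaging two bent geometries in Cartesians artificially shortens a bond, while averaging the bond lengths and bend angle does not. The geometry and values below are hypothetical, chosen only to make the effect visible:

```python
import numpy as np

def from_internals(r1, r2, theta):
    """Place atoms A, B, C from internals: B at the origin, A along +x at
    distance r1, C at distance r2 and angle theta from the B->A direction."""
    return np.array([[r1, 0.0],
                     [0.0, 0.0],
                     [r2 * np.cos(theta), r2 * np.sin(theta)]])

g0 = (1.0, 1.0, np.pi / 3)       # endpoint geometries (hypothetical bonds)
g1 = (1.0, 1.0, 2 * np.pi / 3)

# Midpoint in internal coordinates: both bond lengths stay exactly 1.0
ic_mid = from_internals(*[0.5 * (a + b) for a, b in zip(g0, g1)])

# Midpoint in Cartesian coordinates: the B-C bond artificially contracts
cart_mid = 0.5 * (from_internals(*g0) + from_internals(*g1))

bc_ic = np.linalg.norm(ic_mid[2] - ic_mid[1])        # bond preserved
bc_cart = np.linalg.norm(cart_mid[2] - cart_mid[1])  # bond shortened
print(bc_ic, bc_cart)
```

    The contracted Cartesian bond is exactly the chord effect the text describes: paths interpolated in internals start much closer to chemically sensible geometries.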

  16. A coordinated dispatch model for electricity and heat in a Microgrid via particle swarm optimization

    DEFF Research Database (Denmark)

    Xu, Lizhong; Yang, Guangya; Xu, Zhao;

    2013-01-01

    Particle swarm optimization (PSO) is employed to solve this model for an operation schedule that minimizes the total operational cost of the Microgrid by coordinating the CHP, electric heater, boiler and heat storage. The efficacy of the model and methodology is verified under different operation scenarios.

  17. Comparison and Application of Metaheuristic Population-Based Optimization Algorithms in Manufacturing Automation

    Directory of Open Access Journals (Sweden)

    Rhythm Suren Wadhwa

    2011-11-01

    The paper presents a comparison and application of metaheuristic population-based optimization algorithms in a flexible manufacturing automation scenario in a metal casting foundry. It presents a novel application and comparison of the Bee Colony Algorithm (BCA) with variations of Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) for an object recognition problem in a robot material handling system. To enable robust pick-and-place activity of metal-cast parts by a six-axis industrial robot manipulator, the correct orientation of the parts must be input to the manipulator via the digital image captured by the vision system. This information is then used to orient the robot gripper to grip the part from a moving conveyor belt. The objective is to find the reference templates of the manufactured parts in the target landscape picture, which may contain noise. The normalized cross-correlation (NCC) function is used as the objective function in the optimization procedure. The ultimate goal is to test improved algorithms that could prove useful in practical manufacturing automation scenarios.
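
    The NCC objective that the swarm algorithms maximize can be sketched directly; the brute-force score map below stands in for the metaheuristic search, and the image, template and noise level are all synthetic:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between equally sized arrays, in [-1, 1]."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def ncc_map(image, template):
    """Score every template-sized window of the image (exhaustive stand-in
    for the PSO/BCA/ACO search over candidate positions)."""
    th, tw = template.shape
    h, w = image.shape
    out = np.full((h - th + 1, w - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = ncc(image[i:i + th, j:j + tw], template)
    return out

rng = np.random.default_rng(0)
template = rng.random((8, 8))
image = rng.random((40, 40))
image[12:20, 25:33] = template          # embed the reference pattern
image += 0.05 * rng.random((40, 40))    # noise, as in the target landscape

scores = ncc_map(image, template)
peak = np.unravel_index(np.argmax(scores), scores.shape)
print(peak)  # peak should coincide with the embedded location (12, 25)
```

    Because NCC is normalized, it tolerates the brightness offsets and additive noise that raw correlation would not, which is why it makes a robust fitness function for the swarm search.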

  18. Coordinated Optimization of Distributed Energy Resources and Smart Loads in Distribution Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui; Zhang, Yingchen

    2016-08-01

    Distributed energy resources (DERs) and smart loads have the potential to provide flexibility to the distribution system operation. A coordinated optimization approach is proposed in this paper to actively manage DERs and smart loads in distribution systems to achieve the optimal operation status. A three-phase unbalanced Optimal Power Flow (OPF) problem is developed to determine the output from DERs and smart loads with respect to the system operator's control objective. This paper focuses on coordinating PV systems and smart loads to improve the overall voltage profile in distribution systems. Simulations have been carried out in a 12-bus distribution feeder and results illustrate the superior control performance of the proposed approach.

  19. The optimization of total laboratory automation by simulation of a pull-strategy.

    Science.gov (United States)

    Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo

    2015-01-01

    Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.

  20. Automated Software Testing Using Metahurestic Technique Based on An Ant Colony Optimization

    CERN Document Server

    Srivastava, Praveen Ranjan

    2011-01-01

    Software testing is an important and valuable part of the software development life cycle. Due to time, cost and other constraints, exhaustive testing is not feasible, so the software testing process needs to be automated. Testing effectiveness can be achieved by State Transition Testing (STT), which is commonly used in real-time, embedded and web-based software systems. The aim of the current paper is to present an algorithm that applies an ant colony optimization technique to generate optimal and minimal test sequences from the behavior specification of software. The proposed approach generates test sequences so as to obtain complete software coverage. This paper also compares two metaheuristic techniques (Genetic Algorithm and Ant Colony Optimization) for transition-based testing.
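
    A minimal sketch of the idea, ants walking a state transition graph, biased toward uncovered transitions, with pheromone reinforcing short covering sequences, is given below. The state machine and all parameters are hypothetical, not taken from the paper:

```python
import random

random.seed(7)

# Hypothetical state machine: state -> list of (event, next_state)
FSM = {
    "idle": [("start", "run")],
    "run":  [("pause", "held"), ("stop", "idle")],
    "held": [("resume", "run"), ("stop", "idle")],
}
ALL = {(s, e) for s, outs in FSM.items() for e, _ in outs}

pheromone = {t: 1.0 for t in ALL}

def walk(max_len=30):
    """One ant builds an event sequence from 'idle', biased by pheromone
    and by a bonus for not-yet-covered transitions."""
    state, seq, covered = "idle", [], set()
    while covered != ALL and len(seq) < max_len:
        outs = FSM[state]
        weights = [pheromone[(state, e)]
                   + (2.0 if (state, e) not in covered else 0.0)
                   for e, _ in outs]
        e, nxt = random.choices(outs, weights=weights)[0]
        covered.add((state, e))
        seq.append(e)
        state = nxt
    return seq, covered

best = None
for _ in range(200):                      # colony iterations
    seq, covered = walk()
    if covered == ALL and (best is None or len(seq) < len(best)):
        best = seq
        state = "idle"                    # reinforce the best covering tour
        for e in seq:
            pheromone[(state, e)] += 1.0 / len(seq)
            state = next(n for ev, n in FSM[state] if ev == e)
    for t in pheromone:                   # evaporation
        pheromone[t] *= 0.95

print(best)
```

    Reinforcement plus evaporation gradually concentrates the ants on the shortest sequence that exercises every transition, which is the coverage goal STT aims for.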

  1. Effect of coordination of optimal reclosing and fuzzy controlled braking resistor on transient stability during unsuccessful reclosing

    OpenAIRE

    Ali, Mohd.Hasan; Murata, Toshiaki; Tamura, Junji

    2006-01-01

    This paper analyzes the effect of the coordination of optimal reclosing and fuzzy logic-controlled braking resistor on the transient stability of a multimachine power system in case of an unsuccessful reclosing of circuit breakers. The transient stability performance of the coordinated operation of optimal reclosing and fuzzy controlled braking resistor is compared to that of the coordinated operation of conventional auto-reclosing and fuzzy controlled braking resistor. The effectiveness of t...

  2. Optimal Coordinated Control of Power Extraction in LES of a Wind Farm with Entrance Effects

    Directory of Open Access Journals (Sweden)

    Jay P. Goit

    2016-01-01

    We investigate the use of optimal coordinated control techniques in large eddy simulations of wind farm boundary layer interaction with the aim of increasing the total energy extraction in wind farms. The individual wind turbines are considered as flow actuators, and their energy extraction is dynamically regulated in time so as to optimally influence the flow field. We extend earlier work on wind farm optimal control in the fully-developed regime (Goit and Meyers 2015, J. Fluid Mech. 768, 5–50) to a 'finite' wind farm case, in which entrance effects play an important role. For the optimal control, a receding horizon framework is employed in which turbine thrust coefficients are optimized in time and per turbine. Optimization is performed with a conjugate gradient method, where gradients of the cost functional are obtained using adjoint large eddy simulations. Overall, the energy extraction is increased by 7% by the optimal control. This increase in energy extraction is related to faster wake recovery throughout the farm. For the first row of turbines, the optimal control increases turbulence levels and Reynolds stresses in the wake, leading to better wake mixing and an inflow velocity for the second row that is significantly higher than in the uncontrolled case. For downstream rows, the optimal control mainly enhances the sideways mean transport of momentum. This is different from earlier observations by Goit and Meyers (2015) in the fully-developed regime, where mainly vertical transport was enhanced.

  3. RootGraph: a graphic optimization tool for automated image analysis of plant roots.

    Science.gov (United States)

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J

    2015-11-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions.

  4. Optimal combined overcurrent and distance relays co-ordination using a new genetic algorithm method

    Energy Technology Data Exchange (ETDEWEB)

    Kamangar, S.S.H.; Abyaneh, H.A.; Chabanloo, R.M. [Amirkabir Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Electrical Engineering; Razavi, F. [Tafresh Univ. (Iran, Islamic Republic of). Dept. of Electrical Engineering

    2010-04-15

    This paper introduced a new method to optimize the coordination of overcurrent (OC) relays using a genetic algorithm (GA). GA is an intelligent optimization technique that can adjust relay settings without relying on an initial guess or becoming trapped in local minima, which are the disadvantages of linear programming techniques such as simplex, two-phase simplex, and dual simplex. The objective function (OF) of the GA is modified by adding a new term to fulfill the coordination of both OC and distance relays. Two power network systems were analyzed using the new computer program, and the results show that the method is both efficient and accurate. Transmission and subtransmission protection systems commonly use OC and distance relays. 12 refs., 6 tabs., 5 figs.

  5. OPTIMAL SUBSTRUCTURE OF SET-VALUED SOLUTIONS OF NORMAL-FORM GAMES AND COORDINATION

    Institute of Scientific and Technical Information of China (English)

    Norimasa KOBAYASHI; Kyoichi KIJIMA

    2009-01-01

    A number of solution concepts of normal-form games have been proposed in the literature on subspaces of action profiles that have Nash-type stability. While the literature mainly focuses on the minimal such stable subspaces, this paper argues that non-minimal stable subspaces represent well the multi-agent situations to which neither Nash equilibrium nor rationalizability can be satisfactorily applied. As theoretical support, the authors prove the optimal substructure of stable subspaces regarding the restriction of a game. It is further argued that the optimal substructure characterizes hierarchical diversity of coordination and interim phases in learning.

  6. Energy Coordinative Optimization of Wind-Storage-Load Microgrids Based on Short-Term Prediction

    OpenAIRE

    Changbin Hu; Shanna Luo; Zhengxi Li; Xin Wang; Li Sun

    2015-01-01

    According to the topological structure of wind-storage-load complementation microgrids, this paper proposes a method for energy coordinative optimization which focuses on improving the economic benefits of microgrids within the prediction framework. First of all, the external characteristic mathematical models of distributed generation (DG) units, including wind turbines and storage batteries, are established according to the requirements of the actual constraints. Meanwhile, using the minimum ...

  7. Distributed convex optimization via continuous-time coordination algorithms with discrete-time communication

    OpenAIRE

    Kia, Solmaz S.; Cortes, Jorge; Martinez, Sonia

    2014-01-01

    This paper proposes a novel class of distributed continuous-time coordination algorithms to solve network optimization problems whose cost function is a sum of local cost functions associated to the individual agents. We establish the exponential convergence of the proposed algorithm under (i) strongly connected and weight-balanced digraph topologies when the local costs are strongly convex with globally Lipschitz gradients, and (ii) connected graph topologies when the local costs are strongl...
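
    The flavor of such continuous-time coordination algorithms can be sketched with an Euler-discretized proportional-integral consensus flow on quadratic local costs. This is a toy stand-in, using a symmetric Laplacian rather than the paper's weight-balanced digraphs, and all constants are made up:

```python
import numpy as np

# Three agents, each with a private quadratic cost f_i(x) = 0.5*a_i*(x - c_i)^2;
# the network should agree on the minimizer of the summed cost.
a = np.array([1.0, 2.0, 3.0])
c = np.array([0.0, 2.0, 4.0])
x_star = (a * c).sum() / a.sum()       # global minimizer of sum_i f_i

# Symmetric Laplacian of the complete 3-node graph
L = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])

x = np.array([5.0, -3.0, 1.0])         # local estimates
v = np.zeros(3)                        # integral states correcting gradient mismatch
dt = 0.01
for _ in range(20000):                 # Euler discretization of the coordination flow
    grad = a * (x - c)
    x_dot = -grad - L @ x - L @ v      # local gradient descent + consensus coupling
    v_dot = L @ x                      # integral action drives disagreement to zero
    x, v = x + dt * x_dot, v + dt * v_dot

print(x.round(3), x_star)              # all estimates agree on the global minimizer
```

    Each agent only uses its own gradient and its neighbors' states (through L), which is the defining feature of these distributed algorithms; the integral term is what lets the agreed value be the global rather than the average-of-local minimizer.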

  8. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using double- and single-ended transition-state optimization algorithms in cooperation--the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes. PMID:26575920

  9. Automation of Optimized Gabor Filter Parameter Selection for Road Cracks Detection

    Directory of Open Access Journals (Sweden)

    Haris Ahmad Khan

    2016-03-01

    Full Text Available Automated systems for road crack detection are extremely important in road maintenance for vehicle safety and traveler comfort. Emerging cracks in roads need to be detected and repaired as early as possible to avoid further damage, thus reducing rehabilitation costs. In this paper, a robust method for optimizing Gabor filter parameters for automatic road crack detection is discussed. Gabor filters have been used in the previous literature for similar applications. However, automatic selection of optimized Gabor filter parameters is needed due to variation in the texture of roads and cracks. The problem of changing background, which is in fact the road texture, is addressed through a learning process that uses synthetic road crack generation for Gabor filter parameter tuning. The tuned parameters are then tested on real cracks, and a thorough quantitative analysis is performed for performance evaluation.
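
The parameter tuning described above can be illustrated with a small, self-contained sketch (not the authors' method): a real Gabor kernel is built from its standard parameters (orientation, envelope width, wavelength), and the orientation is selected on a synthetic crack image, mirroring the paper's idea of tuning on generated cracks. All sizes and values here are illustrative assumptions.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Real Gabor kernel; theta is the orientation, lam the wavelength."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr ** 2 + gamma ** 2 * yr ** 2) / (2 * sigma ** 2))
            * np.cos(2 * np.pi * xr / lam + psi))

# synthetic "road": flat texture with a dark vertical crack down the middle
img = np.ones((64, 64))
img[:, 31:33] = 0.0

def response(theta):
    k = gabor_kernel(15, 3.0, theta, 8.0)
    # valid convolution via an explicit sliding window (no SciPy dependency)
    out = np.array([[np.sum(img[i:i + 15, j:j + 15] * k)
                     for j in range(img.shape[1] - 14)]
                    for i in range(img.shape[0] - 14)])
    return np.abs(out).max()

# crude parameter search over orientation: a vertical crack is best matched
# by a filter whose carrier varies horizontally (theta = 0)
thetas = [0.0, np.pi / 4, np.pi / 2]
best = max(thetas, key=response)
print(best)
```

In a full system the same search would run over wavelength and envelope width as well, with the synthetic crack generator supplying training images.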

  10. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    CERN Document Server

    Suleimanov, Yury V

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation single- and double-ended transition-state optimization algorithms - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in the previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the possibility of discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  11. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation double- and single-ended transition-state optimization algorithms--the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in the previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  12. Automated Portfolio Optimization Based on a New Test for Structural Breaks

    Directory of Open Access Journals (Sweden)

    Tobias Berens

    2014-04-01

    Full Text Available We present a completely automated optimization strategy which combines the classical Markowitz mean-variance portfolio theory with a recently proposed test for structural breaks in covariance matrices. With respect to equity portfolios, global minimum-variance optimizations, which are based solely on the covariance matrix, yielded considerable results in previous studies. However, financial assets cannot be assumed to have a constant covariance matrix over longer periods of time. Hence, we estimate the covariance matrix of the assets by respecting potential change points. The resulting approach resolves the issue of determining a sample for parameter estimation. Moreover, we investigate whether this approach is also appropriate for timing the reoptimizations. Finally, we apply the approach to two datasets and compare the results to relevant benchmark techniques by means of an out-of-sample study. It is shown that the new approach outperforms equally weighted portfolios and plain minimum-variance portfolios on average.
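
The global minimum-variance optimization mentioned above has a closed-form solution once a covariance matrix is estimated; the sketch below (illustrative only, without the structural break test that is the paper's contribution) computes the weights w = S⁻¹1 / (1ᵀS⁻¹1) on synthetic returns.

```python
import numpy as np

def min_variance_weights(returns):
    """Global minimum-variance weights w = S^-1 1 / (1' S^-1 1),
    estimated from a window of asset returns (rows = days)."""
    cov = np.cov(returns, rowvar=False)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# toy example: 3 assets, 250 days of synthetic independent returns
rng = np.random.default_rng(0)
rets = rng.normal(0.0, [0.01, 0.02, 0.03], size=(250, 3))
w = min_variance_weights(rets)
print(w)  # weights sum to 1; the lowest-variance asset gets the most weight
```

In the paper's setting, the estimation window feeding `np.cov` would start at the most recent detected change point rather than being fixed.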

  13. Applying Hybrid PSO to Optimize Directional Overcurrent Relay Coordination in Variable Network Topologies

    Directory of Open Access Journals (Sweden)

    Ming-Ta Yang

    2013-01-01

    Full Text Available In power systems, determining the values of the time dial setting (TDS) and the plug setting (PS) for directional overcurrent relays (DOCRs) is an extremely constrained optimization problem that has previously been described and solved as a nonlinear programming problem. Coordination optimization problems of near-end faults and far-end faults occurring simultaneously in circuits with various topologies, including fixed and variable network topologies, are considered in this study. The aim of this study was to apply the Nelder-Mead (NM) simplex search method and particle swarm optimization (PSO) to solve this optimization problem. The proposed NM-PSO method has the advantage of the NM algorithm, quicker movement toward the optimal solution, as well as the advantage of the PSO algorithm, the ability to obtain a globally optimal solution. Neither a conventional PSO nor the proposed NM-PSO method is capable of dealing with constrained optimization problems on its own. Therefore, we use the gradient-based repair method embedded in a conventional PSO and in the proposed NM-PSO. This study used an IEEE 8-bus test system as a case study to compare the convergence performance of the proposed NM-PSO method and a conventional PSO approach. The results demonstrate that a robust and optimal solution can be obtained efficiently by implementing the proposed method.
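
The flavor of this constrained relay-coordination problem can be shown with a toy sketch: a plain PSO (not the authors' NM-PSO, and with a quadratic penalty standing in for their gradient-based repair) tunes the TDS of a primary/backup relay pair under the IEC standard-inverse characteristic. All numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def op_time(tds, ps, i_fault):
    # IEC standard-inverse overcurrent characteristic
    return tds * 0.14 / ((i_fault / ps) ** 0.02 - 1.0)

def cost(x):
    # x = [TDS_primary, TDS_backup]; PS fixed at 1.0, fault current 10 pu
    t1 = op_time(x[0], 1.0, 10.0)
    t2 = op_time(x[1], 1.0, 10.0)
    cti = 0.3  # required coordination time interval: backup trips after primary
    penalty = 1e3 * max(0.0, cti - (t2 - t1)) ** 2
    return t1 + t2 + penalty

# minimal particle swarm over TDS in [0.05, 1.0]
n, dim, lo, hi = 30, 2, 0.05, 1.0
x = rng.uniform(lo, hi, (n, dim))
v = np.zeros((n, dim))
pbest = x.copy()
pcost = np.array([cost(p) for p in x])
g = pbest[pcost.argmin()].copy()
for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = np.clip(x + v, lo, hi)
    c = np.array([cost(p) for p in x])
    better = c < pcost
    pbest[better], pcost[better] = x[better], c[better]
    g = pbest[pcost.argmin()].copy()
print(g, cost(g))  # primary TDS driven to its lower bound, backup set to keep CTI
```

The real problem adds many relay pairs, both near-end and far-end fault constraints, and the repair step instead of a penalty.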

  14. Energy Coordinative Optimization of Wind-Storage-Load Microgrids Based on Short-Term Prediction

    Directory of Open Access Journals (Sweden)

    Changbin Hu

    2015-02-01

    Full Text Available According to the topological structure of wind-storage-load complementation microgrids, this paper proposes a method for energy coordinative optimization which focuses on improving the economic benefits of microgrids in a prediction framework. First of all, the external characteristic mathematical models of distributed generation (DG) units, including wind turbines and storage batteries, are established according to the requirements of the actual constraints. Meanwhile, using the minimum consumption costs from the external grid as the objective function, a grey prediction model with residual modification is introduced to predict wind turbine power and load at specific periods. Second, based on the basic framework of receding horizon optimization, an intelligent genetic algorithm (GA) is applied to find the optimum solution over the predictive horizon for the complex non-linear coordination control model of microgrids. The optimum results of the GA are compared with the receding solution of mixed integer linear programming (MILP). The obtained results show that the method is a viable approach for energy coordinative optimization of microgrid systems, yielding reasonable energy flow and scheduling. The effectiveness and feasibility of the proposed method are verified by examples.
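
A grey prediction model of the classic GM(1,1) type can be sketched in a few lines; the version below omits the residual modification step mentioned in the abstract and uses an invented toy series standing in for wind power.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey model: fit on series x0, forecast `steps` values ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                      # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])           # adjacent means of the x1 series
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
    return x0_hat[n:]                       # out-of-sample forecasts

# toy wind-power series with mild, near-geometric growth
series = [120.0, 130.1, 141.2, 153.3, 166.5]
print(gm11_forecast(series, steps=2))
```

In the paper's framework, forecasts like these feed the receding horizon optimizer at each step; the residual modification would correct the raw GM(1,1) output with a model of its recent errors.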

  15. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area Az under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal number of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost surface.
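
A simulated annealing search over a small discrete architecture space, in the spirit of the comparison above, can be sketched as follows. The cost function is a synthetic stand-in for 1 − Az (training a real CNN per evaluation is far too expensive for a snippet), and the 144-point space, neighbor move, and cooling schedule are illustrative assumptions analogous to the paper's 432-architecture space.

```python
import math
import random

random.seed(42)

# hypothetical 4-parameter space: node groups in two hidden layers
# and two filter kernel sizes
GROUPS, KERNELS = (2, 4, 8), (3, 5, 7, 9)

def cost(arch):
    # stand-in for 1 - Az from actually training and testing a CNN:
    # a toy surface whose unique global minimum is at (4, 4, 5, 5)
    g1, g2, k1, k2 = arch
    return ((g1 - 4) ** 2 + (g2 - 4) ** 2 + (k1 - 5) ** 2 + (k2 - 5) ** 2
            + 0.3 * math.sin(7.0 * g1 * k2))

def neighbor(arch):
    # change one architecture parameter to a different allowed value
    i = random.randrange(4)
    new = list(arch)
    pool = GROUPS if i < 2 else KERNELS
    new[i] = random.choice([c for c in pool if c != new[i]])
    return tuple(new)

cur = (random.choice(GROUPS), random.choice(GROUPS),
       random.choice(KERNELS), random.choice(KERNELS))
cur_c = cost(cur)
best, best_c = cur, cur_c
T = 5.0
for _ in range(2000):
    cand = neighbor(cur)
    cand_c = cost(cand)
    # Metropolis acceptance: always take improvements, sometimes take uphill moves
    if cand_c < cur_c or random.random() < math.exp((cur_c - cand_c) / T):
        cur, cur_c = cand, cand_c
        if cur_c < best_c:
            best, best_c = cur, cur_c
    T *= 0.995  # geometric cooling schedule
print(best, round(best_c, 3))
```

The paper's point is precisely that the choice of schedule (Boltzmann, very fast reannealing, etc.) changes both the number of cost evaluations and the risk of being trapped, which this toy surface cannot capture.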

  16. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  17. Leveraging information storage to select forecast-optimal parameters for delay-coordinate reconstructions

    Science.gov (United States)

    Garland, Joshua; James, Ryan G.; Bradley, Elizabeth

    2016-02-01

    Delay-coordinate reconstruction is a proven modeling strategy for building effective forecasts of nonlinear time series. The first step in this process is the estimation of good values for two parameters, the time delay and the embedding dimension. Many heuristics and strategies have been proposed in the literature for estimating these values. Few, if any, of these methods were developed with forecasting in mind, however, and their results are not optimal for that purpose. Even so, these heuristics—intended for other applications—are routinely used when building delay coordinate reconstruction-based forecast models. In this paper, we propose an alternate strategy for choosing optimal parameter values for forecast methods that are based on delay-coordinate reconstructions. The basic calculation involves maximizing the shared information between each delay vector and the future state of the system. We illustrate the effectiveness of this method on several synthetic and experimental systems, showing that this metric can be calculated quickly and reliably from a relatively short time series, and that it provides a direct indication of how well a near-neighbor based forecasting method will work on a given delay reconstruction of that time series. This allows a practitioner to choose reconstruction parameters that avoid any pathologies, regardless of the underlying mechanism, and maximize the predictive information contained in the reconstruction.
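
The paper's criterion maximizes the information shared between each delay vector and the future state; a simpler, related quantity that is easy to sketch is the histogram-based time-delayed mutual information between x_t and x_{t+τ}, a common heuristic for picking the time delay. The snippet below is a generic illustration on an invented AR(1) signal, not the authors' estimator.

```python
import numpy as np

def delayed_mutual_info(x, tau, bins=32):
    """Histogram estimate of the mutual information I(x_t ; x_{t+tau}), in nats."""
    a, b = x[:-tau], x[tau:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# toy series: an AR(1) process, whose shared information decays with lag
rng = np.random.default_rng(3)
x = np.empty(8000)
x[0] = 0.0
for i in range(1, x.size):
    x[i] = 0.9 * x[i - 1] + rng.normal()
mi = [delayed_mutual_info(x, tau) for tau in (1, 5, 40)]
print([round(m, 3) for m in mi])  # information about the future decays with lag
```

For forecasting, the authors' metric conditions on the whole delay vector rather than a single lagged sample, which is what lets it score the embedding dimension as well as the delay.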

  18. Leveraging information storage to select forecast-optimal parameters for delay-coordinate reconstructions.

    Science.gov (United States)

    Garland, Joshua; James, Ryan G; Bradley, Elizabeth

    2016-02-01

    Delay-coordinate reconstruction is a proven modeling strategy for building effective forecasts of nonlinear time series. The first step in this process is the estimation of good values for two parameters, the time delay and the embedding dimension. Many heuristics and strategies have been proposed in the literature for estimating these values. Few, if any, of these methods were developed with forecasting in mind, however, and their results are not optimal for that purpose. Even so, these heuristics-intended for other applications-are routinely used when building delay coordinate reconstruction-based forecast models. In this paper, we propose an alternate strategy for choosing optimal parameter values for forecast methods that are based on delay-coordinate reconstructions. The basic calculation involves maximizing the shared information between each delay vector and the future state of the system. We illustrate the effectiveness of this method on several synthetic and experimental systems, showing that this metric can be calculated quickly and reliably from a relatively short time series, and that it provides a direct indication of how well a near-neighbor based forecasting method will work on a given delay reconstruction of that time series. This allows a practitioner to choose reconstruction parameters that avoid any pathologies, regardless of the underlying mechanism, and maximize the predictive information contained in the reconstruction. PMID:26986345

  19. Optimal training dataset composition for SVM-based, age-independent, automated epileptic seizure detection.

    Science.gov (United States)

    Bogaarts, J G; Gommer, E D; Hilkman, D M W; van Kranen-Mastenbroek, V H J M; Reulen, J P H

    2016-08-01

    Automated seizure detection is a valuable asset to health professionals, which makes adequate treatment possible in order to minimize brain damage. Most research focuses on two separate aspects of automated seizure detection: EEG feature computation and classification methods. Little research has been published regarding optimal training dataset composition for patient-independent seizure detection. This paper evaluates the performance of classifiers trained on different datasets in order to determine the optimal dataset for use in classifier training for automated, age-independent seizure detection. Three datasets are used to train a support vector machine (SVM) classifier: (1) EEG from neonatal patients, (2) EEG from adult patients and (3) EEG from both neonates and adults. To correct for baseline EEG feature differences among patients, feature normalization is essential. Usually, dedicated detection systems are developed for either neonatal or adult patients. Normalization might allow for the development of a single seizure detection system for patients irrespective of their age. Two classifier versions are trained on all three datasets: one with feature normalization and one without. This gives us six different classifiers to evaluate using both the neonatal and adult test sets. As a performance measure, the area under the receiver operating characteristics curve (AUC) is used. Application of FBC resulted in performance values of 0.90 and 0.93 for neonatal and adult seizure detection, respectively. For neonatal seizure detection, the classifier trained on EEG from adult patients performed significantly worse compared to both the classifier trained on EEG data from neonatal patients and the classifier trained on both neonatal and adult EEG data. For adult seizure detection, optimal performance was achieved by either the classifier trained on adult EEG data or the classifier trained on both neonatal and adult EEG data. Our results show that age
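
The role of feature normalization can be illustrated with a toy sketch: a single hypothetical EEG feature is given very different baselines in a "neonatal" and an "adult" group, and per-recording z-scoring lets one pooled threshold classifier work across both. This is schematic (a threshold stands in for the SVM) with invented numbers.

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical EEG feature (e.g. a line-length-like statistic): the same
# seizure/non-seizure contrast sits on very different baselines per group
def make_group(baseline, scale, n=500):
    labels = rng.integers(0, 2, n)                    # 1 = seizure epoch
    feat = baseline + scale * (labels + rng.normal(0, 0.3, n))
    return feat, labels

neo_x, neo_y = make_group(baseline=5.0, scale=1.0)    # "neonatal" recordings
adu_x, adu_y = make_group(baseline=40.0, scale=8.0)   # "adult" recordings

def normalize(x):
    # per-recording baseline correction: z-score against its own statistics
    return (x - x.mean()) / x.std()

# one threshold classifier trained on the pooled, normalized feature
train = np.concatenate([normalize(neo_x), normalize(adu_x)])
labels = np.concatenate([neo_y, adu_y])
thr = train.mean()
acc = ((train > thr).astype(int) == labels).mean()
print(round(acc, 3))  # pooled accuracy stays high despite the baseline mismatch
```

Without the z-scoring step, a single threshold on the raw pooled feature would separate the age groups instead of the seizure classes, which is the failure mode the paper's normalization addresses.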

  20. An automated approach to magnetic divertor configuration design, using an efficient optimization methodology

    Energy Technology Data Exchange (ETDEWEB)

    Blommaert, Maarten; Reiter, Detlev [Institute of Energy and Climate Research (IEK-4), FZ Juelich GmbH, D-52425 Juelich (Germany); Heumann, Holger [Centre de Recherche INRIA Sophia Antipolis, BP 93 06902 Sophia Antipolis (France); Baelmans, Martine [KU Leuven, Department of Mechanical Engineering, 3001 Leuven (Belgium); Gauger, Nicolas Ralph [TU Kaiserslautern, Chair for Scientific Computing, 67663 Kaiserslautern (Germany)

    2015-05-01

    At present, several plasma boundary codes exist that attempt to describe the complex interactions in the divertor SOL (Scrape-Off Layer). The predictive capability of these edge codes is still very limited. Yet, in parallel to major efforts to mature edge codes, we face the design challenges for next step fusion devices. One of them is the design of the helium and heat exhaust system. In past automated design studies, results indicated large potential reductions in peak heat load by an increased magnetic flux divergence towards the target structures. In the present study, a free boundary magnetic equilibrium solver is included into the simulation chain to verify these tendencies. Additionally, we expanded the applicability of the automated design method by introducing advanced "adjoint" sensitivity computations. This method, inherited from airfoil shape optimization in aerodynamics, allows for a large number of design variables at no additional computational cost. Results are shown for a design application of the new WEST divertor.

  1. Modeling and performance optimization of automated antenna alignment for telecommunication transceivers

    Directory of Open Access Journals (Sweden)

    Md. Ahsanul Hoque

    2015-09-01

    Full Text Available Antenna alignment is very cumbersome in the telecommunication industry, and it especially affects MW links due to environmental anomalies or physical degradation over a period of time. While in recent years the more conventional approach of redundancy has been employed, novel automation techniques are needed to ensure LOS link stability. The basic principle is to capture the desired Received Signal Level (RSL) by means of an outdoor unit installed on the tower top and to analyze the RSL in an indoor unit by means of a GUI interface. We propose a new smart antenna system in which automation is initiated when the transceivers receive low signal strength and report the finding to a processing comparator unit. A series architecture is used that includes a loop antenna, RCX Robonics, and a LabVIEW interface coupled with a tunable external controller. Denavit–Hartenberg parameters are used in analytical modeling, and numerous control techniques have been investigated to overcome imminent overshoot problems for the transport link. With this novel approach, a solution has been put forward for the communication industry whereby any antenna can achieve optimal directivity for the desired RSL with low overshoot and a fast steady-state response.

  2. Integrated Coordinated Optimization Control of Automatic Generation Control and Automatic Voltage Control in Regional Power Grids

    Directory of Open Access Journals (Sweden)

    Qiu-Yu Lu

    2012-09-01

    Full Text Available Automatic Generation Control (AGC) and Automatic Voltage Control (AVC) are key approaches to frequency and voltage regulation in power systems. However, based on the assumption of decoupled active and reactive power control, the existing AGC and AVC systems work independently without any coordination. In this paper, a concept and method of hybrid control is introduced to set up an Integrated Coordinated Optimization Control (ICOC) system for AGC and AVC. Concerning the diversity of control devices and the characteristics of discrete control interacting with a continuously operating power system, the ICOC system is designed in a hierarchical structure and driven by security, quality and economic events, consequently reducing optimization complexity and realizing multi-target quasi-optimization. In addition, an innovative model of Loss Minimization Control (LMC) taking into consideration both active and reactive power regulation is proposed to achieve a substantial reduction in network losses, and a cross-iterative method for AGC and AVC instructions is also presented to decrease negative interference between control systems. The ICOC system has already been put into practice in some provincial regional power grids in China. Open-loop operation tests have proved the validity of the presented control strategies.

  3. An Automated Tool for Optimization of FMS Scheduling With Meta Heuristic Approach

    Directory of Open Access Journals (Sweden)

    A. V. S. Sreedhar Kumar

    2014-03-01

    Full Text Available The evolution of manufacturing systems has reflected the needs and requirements of the market, which vary from time to time. Flexible manufacturing systems have contributed a lot to the development of efficient manufacturing processes and the production of a variety of customized limited-volume products as per market demand based on customer needs. Scheduling of FMS is a crucial operation in maximizing throughput, reducing wastage and increasing the overall efficiency of the manufacturing process. The dynamic nature of flexible manufacturing systems makes them unique, and hence a generalized solution for scheduling is difficult to abstract. Any solution for optimizing the scheduling should take into account a multitude of parameters before proposing any solution. The primary objective of the proposed research is to design a tool to automate the optimization of the scheduling process by searching for solutions in the search space using metaheuristic approaches. The research also validates the use of reward as a means of optimizing the scheduling by including it as one of the parameters in the Combined Objective Function.

  4. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    Science.gov (United States)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and proved that the optimized ducted propeller improves hydrodynamic performance as predicted.

  5. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    Directory of Open Access Journals (Sweden)

    Zhi-song CHEN

    2012-12-01

    Full Text Available The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project, considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordination mechanism via a revenue-sharing contract. The results suggest the following: (1) owing to water shortage supplements and the excess water sale policy provided by the FPM, the optimal ordering quantity of water resources is less than that without the FPM, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without the FPM; (2) wholesale pricing and supplementary wholesale pricing with SCB are higher than those without SCB, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without SCB; and (3) considering SCB and introducing the FPM help increase the optimal profits of the whole supply chain, supplier, and external distributor, and improve the efficiency of water resources usage.
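
As a minimal illustration of coordination via a revenue-sharing contract (a textbook newsvendor sketch, far simpler than the SNWD model with its floating pricing mechanism and strategic customers), note that a distributor keeping a share φ of revenue and paying wholesale price w = φc faces the same critical fractile as the centralized system, so both order the same quantity.

```python
from statistics import NormalDist

p, c = 10.0, 4.0                  # retail price, unit supply cost (invented)
demand = NormalDist(mu=100, sigma=20)

# centralized system: order to the critical fractile (p - c) / p
q_central = demand.inv_cdf((p - c) / p)

# revenue sharing: distributor keeps a share phi of revenue and pays a
# wholesale price w = phi * c, so its fractile (phi*p - w) / (phi*p)
# equals the centralized one for any phi in (0, 1]
phi = 0.7
w = phi * c
q_retail = demand.inv_cdf((phi * p - w) / (phi * p))
print(round(q_central, 2), round(q_retail, 2))  # identical order quantities
```

The share φ then only splits the coordinated profit between the parties, which is the basic lever the paper's revenue-sharing mechanism uses.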

  6. Understanding Innovation Engines: Automated Creativity and Improved Stochastic Optimization via Deep Learning.

    Science.gov (United States)

    Nguyen, A; Yosinski, J; Clune, J

    2016-01-01

    The Achilles Heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search mitigates this problem by encouraging exploration in all interesting directions by replacing the performance objective with a reward for novel behaviors. This reward for novel behaviors has traditionally required a human-crafted, behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and heroes instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, wherein novelty means interesting novelty. For example, a DNN-based novelty search in the image space does not explore in the low-level pixel space, but instead creates a pressure to create new types of images (e.g., churches, mosques, obelisks, etc.). Here, we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm's key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: for example, producing intelligent software, robot controllers, optimized physical components, and art. PMID:27367139

  7. Understanding Innovation Engines: Automated Creativity and Improved Stochastic Optimization via Deep Learning.

    Science.gov (United States)

    Nguyen, A; Yosinski, J; Clune, J

    2016-01-01

    The Achilles Heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search mitigates this problem by encouraging exploration in all interesting directions by replacing the performance objective with a reward for novel behaviors. This reward for novel behaviors has traditionally required a human-crafted, behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and heroes instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, wherein novelty means interesting novelty. For example, a DNN-based novelty search in the image space does not explore in the low-level pixel space, but instead creates a pressure to create new types of images (e.g., churches, mosques, obelisks, etc.). Here, we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm's key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: for example, producing intelligent software, robot controllers, optimized physical components, and art.

  8. Optimal coordination of maximal-effort horizontal and vertical jump motions – a computer simulation study

    Directory of Open Access Journals (Sweden)

    Komura Taku

    2007-06-01

    Full Text Available Abstract Background The purpose of this study was to investigate the coordination strategy of maximal-effort horizontal jumping in comparison with vertical jumping, using the methodology of computer simulation. Methods A skeletal model that has nine rigid body segments and twenty degrees of freedom was developed. Thirty-two Hill-type lower limb muscles were attached to the model. The excitation-contraction dynamics of the contractile element, the tissues around the joints to limit the joint range of motion, as well as the foot-ground interaction were implemented. Simulations were initiated from an identical standing posture for both motions. The optimal pattern of the activation input signal was searched through numerical optimization. For the horizontal jumping, the goal was to maximize the horizontal distance traveled by the body's center of mass. For the vertical jumping, the goal was to maximize the height reached by the body's center of mass. Results As a result, it was found that the hip joint was utilized more vigorously in the horizontal jumping than in the vertical jumping. The muscles that have a function of joint flexion such as the m. iliopsoas, m. rectus femoris and m. tibialis anterior were activated to a greater level during the countermovement in the horizontal jumping, with an effect of moving the body's center of mass in the forward direction. Muscular work was transferred to the mechanical energy of the body's center of mass more effectively in the horizontal jump, which resulted in a greater energy gain of the body's center of mass throughout the motion. Conclusion These differences in the optimal coordination strategy seem to be caused by the requirement that the body's center of mass needs to be located above the feet in vertical jumping, whereas this requirement is not so strict in horizontal jumping.

  9. Path optimization by a variational reaction coordinate method. I. Development of formalism and algorithms.

    Science.gov (United States)

    Birkholz, Adam B; Schlegel, H Bernhard

    2015-12-28

    The development of algorithms to optimize reaction pathways between reactants and products is an active area of study. Existing algorithms typically describe the path as a discrete series of images (chain of states) which are moved downhill toward the path, using various reparameterization schemes, constraints, or fictitious forces to maintain a uniform description of the reaction path. The Variational Reaction Coordinate (VRC) method is a novel approach that finds the reaction path by minimizing the variational reaction energy (VRE) of Quapp and Bofill. The VRE is the line integral of the gradient norm along a path between reactants and products and minimization of VRE has been shown to yield the steepest descent reaction path. In the VRC method, we represent the reaction path by a linear expansion in a set of continuous basis functions and find the optimized path by minimizing the VRE with respect to the linear expansion coefficients. Improved convergence is obtained by applying constraints to the spacing of the basis functions and coupling the minimization of the VRE to the minimization of one or more points along the path that correspond to intermediates and transition states. The VRC method is demonstrated by optimizing the reaction path for the Müller-Brown surface and by finding a reaction path passing through 5 transition states and 4 intermediates for a 10 atom Lennard-Jones cluster. PMID:26723645
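
The variational idea above, minimizing the line integral of the gradient norm over a basis-function expansion of the path, can be sketched on a toy 2-D potential (not the Müller-Brown surface) with a single sine basis function; the trapezoidal discretization and one-coefficient grid search below are illustrative simplifications of the actual VRC optimization.

```python
import numpy as np

# toy potential with a curved valley: minima at (-1, 0) and (1, 0),
# saddle region near (0, 0.5); V = (x^2-1)^2 + 5*(y - 0.5*(1-x^2))^2
def grad(p):
    x, y = p
    d = y - 0.5 * (1 - x ** 2)
    return np.array([4 * x * (x ** 2 - 1) + 10 * d * x, 10 * d])

def vre(c, n=200):
    # path: straight line in x, one sine basis function in y with coefficient c
    t = np.linspace(0, 1, n)
    path = np.column_stack([-1 + 2 * t, c * np.sin(np.pi * t)])
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    gnorm = np.linalg.norm([grad(p) for p in path], axis=1)
    # trapezoidal estimate of the line integral of |grad V| along the path
    return float(np.sum(0.5 * (gnorm[1:] + gnorm[:-1]) * seg))

# minimize the variational reaction energy over the single coefficient
cs = np.linspace(-1, 1, 201)
best_c = cs[np.argmin([vre(c) for c in cs])]
print(round(best_c, 2), round(vre(best_c), 3), round(vre(0.0), 3))
```

The optimized coefficient bends the path into the curved valley, lowering the VRE well below that of the straight-line path; the actual method uses many basis functions, spacing constraints, and coupled minimization of intermediates and transition states.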

  10. An Arterial Signal Coordination Optimization Model for Trams Based on Modified AM-BAND

    Directory of Open Access Journals (Sweden)

    Yangfan Zhou

    2016-01-01

    Full Text Available Modern trams are developing fast because of characteristics such as medium capacity and energy saving. An exclusive right-of-way is usually provided in practice to avoid interruption from general vehicles, yet trams still have to stop frequently at intersections because of the signal rules in the road network. Signal optimization therefore has a great effect on the operational efficiency of a tram system. In this paper, an arterial signal coordination optimization model is proposed for tram progression based on the Asymmetrical Multi-BAND (AM-BAND) method. The AM-BAND is modified in the following ways. Firstly, BAM-BAND is developed by supplementing active bandwidth constraints to AM-BAND. Assisted by the IBM ILOG CPLEX Optimization Studio, two arterial signal plans with eight intersections are obtained from AM-BAND and BAM-BAND for comparison. Secondly, based on the modified BAM-BAND, a BAM-TRAMBAND model is presented, which incorporates three constraints regarding tram operations: dwell time at stations, active signal priority, and a minimum bandwidth value. The case study and VISSIM simulation results show that tram travel times decrease with the signal plan from BAM-TRAMBAND compared with the original signal plan. Moreover, traffic performance indicators such as stops and delay are improved significantly.
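    A much-simplified picture of what such a coordination plan encodes: if a tram ran at constant speed with known station dwell times, an ideal "green wave" would offset each downstream signal's green start by the tram's cumulative travel time modulo the common cycle. All numbers below are invented; the actual BAM-TRAMBAND model solves a mixed-integer bandwidth-maximization program rather than this closed-form rule.

```python
# Ideal "green wave" offsets for a tram on an arterial: each signal's green
# start is shifted by the tram's cumulative travel time (running time plus
# station dwell), modulo the common cycle length. Illustrative values only.
def green_wave_offsets(spacings_m, speed_mps, dwell_s, cycle_s):
    offsets, t = [], 0.0
    for gap, dwell in zip(spacings_m, dwell_s):
        t += gap / speed_mps + dwell          # arrival time at this signal
        offsets.append(round(t % cycle_s, 1))
    return offsets

offsets = green_wave_offsets(
    spacings_m=[400, 350, 500],   # distance to each downstream signal (m)
    speed_mps=10.0,               # 36 km/h running speed
    dwell_s=[0, 20, 0],           # 20 s station dwell before the 2nd signal
    cycle_s=90.0)
```

The dwell-time term is exactly why the paper must add tram-specific constraints: a plan tuned for free-flowing cars ignores the station stop that shifts the tram's arrival by a third of the cycle.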

  12. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.
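    The advantage of implicit differentiation over explicit finite differencing can be seen on a toy linear system standing in for the discretized thermal problem: if K(p) u = f, differentiating gives du/dp = -K(p)^{-1} (dK/dp) u, which reuses the already-solved thermal state. The 2x2 "conductance" matrix and load below are invented for illustration, not taken from the dissertation.

```python
import numpy as np

# Differentiating K(p) u = f implicitly: (dK/dp) u + K (du/dp) = 0, so
# du/dp = -K^{-1} (dK/dp) u, reusing the solved thermal state u.
f = np.array([1.0, 0.5])

def K(p):
    return np.array([[2.0 + p, -1.0],
                     [-1.0, 3.0 * p]])

dK_dp = np.array([[1.0, 0.0],
                  [0.0, 3.0]])      # dK/dp (constant in this toy case)

p = 2.0
u = np.linalg.solve(K(p), f)                       # "thermal" solution
du_implicit = np.linalg.solve(K(p), -dK_dp @ u)    # implicit sensitivity

# Explicit central finite difference for comparison: needs two extra solves
# per design parameter and a step-size choice; here the two agree closely.
h = 1e-6
du_fd = (np.linalg.solve(K(p + h), f) - np.linalg.solve(K(p - h), f)) / (2 * h)
```

With many design parameters the implicit route amortizes one factorization across all sensitivities, which is the source of the reported factor-of-three to factor-of-twenty savings.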

  13. Optimized Multi Agent Coordination using Evolutionary Algorithm: Special Impact in Online Education

    Directory of Open Access Journals (Sweden)

    Subrat P Pattanayak

    2012-08-01

    Full Text Available Intelligent multi-agent systems are a contemporary direction of artificial intelligence, built up from research in information processing, distributed systems, and network technologies for problem solving. Multi-agent coordination is a vital area in which agents coordinate among themselves to achieve a particular goal that either cannot be solved by a single agent or is not time-effective for a single agent. The role of agents in the education field is rapidly increasing. Information retrieval, student information processing systems, learning information systems, and pedagogical agents are varied work done by different agent technologies. Novice users in particular are the most likely beneficiaries of an e-tutoring system, and a multi-agent system plays a vital role in this type of e-tutoring. Online education is an emerging field in the education system. To improve the interaction between learners and tutors with personalized communication, we propose an Optimized Multi Agent System (OMAS) by which a learner can get sufficient information to achieve their objective. This conceptual framework is based on the idea that adaptiveness is the best match between a particular learner's profile and the course contents. We also optimize the procedure using an evolutionary process so that the learner's style and the learning methods are matched with a high fitness value. Agent technology has been applied in varied types of applications for education, but this system may work as a user-friendly conceptual system which can be integrated with any e-learning software. Use of a GUI user interface can make the system more enriched. When a particular request comes from the learner, the agents coordinate among themselves to get the best possible solution. The solution can be presented in an animated way to the learner, so that novice users who are new to the system can adopt it easily.

  14. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources

    Directory of Open Access Journals (Sweden)

    Luis Marenco

    2014-05-01

    Full Text Available This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  15. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  16. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way
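    For readers unfamiliar with the machinery, the following minimal zero-order Sugeno step shows the kind of fuzzy inference an ANFIS fits from planners' trial-and-error data: fuzzify an input, weight each rule's output by its firing strength, and defuzzify by the weighted average. The membership shapes, the "dose violation" input, and the consequents are invented for illustration and are not the paper's rule base.

```python
# Two-rule, zero-order Sugeno step: "if violation is low, leave the
# constraint; if violation is high, tighten it." Shapes and consequents
# are invented; valid for violation values in (-0.5, 1.5).
def tri(x, a, b, c):
    # Triangular membership rising on [a, b], falling on [b, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_constraint(violation):
    w_low  = tri(violation, -0.5, 0.0, 0.6)   # firing strength of "low"
    w_high = tri(violation,  0.4, 1.0, 1.5)   # firing strength of "high"
    z_low, z_high = 0.0, -0.2                 # rule consequents: change in limit
    return (w_low * z_low + w_high * z_high) / (w_low + w_high)

delta = adjust_constraint(0.8)   # clearly "high" -> near-full tightening step
```

ANFIS training adjusts exactly these membership parameters and consequents against observed planner actions, which is how one FIS's "behavior" can be propagated to another.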

  17. Dynamic strategy based fast decomposed GA coordinated with FACTS devices to enhance the optimal power flow

    Energy Technology Data Exchange (ETDEWEB)

    Mahdad, Belkacem, E-mail: bemahdad@yahoo.f [University of Biskra, Department of Electrical Engineering, Biskra 07000 (Algeria); Bouktir, T. [Oum El Bouaghi, Department of Electrical Engineering, Oum El Bouaghi 04000 (Algeria); Srairi, K. [University of Biskra, Department of Electrical Engineering, Biskra 07000 (Algeria); EL Benbouzid, M. [Laboratoire Brestois de Mecanique et des Systemes, University of Brest (France)

    2010-07-15

    Under critical situations, the main preoccupation of expert engineers is to assure power system security and to deliver power to consumers within the desired power quality indices; the total generation cost is taken as a secondary objective. This paper presents an efficient decomposed GA to enhance the solution of the optimal power flow (OPF) with non-smooth cost functions and under severe loading conditions. At the decomposition stage, the length of the original chromosome is reduced successively and adapted to the topology of the new partition. Two subproblems are proposed to coordinate the OPF problem under different loading conditions: the first subproblem concerns active power planning under different loading factors to minimize the total fuel cost, and the second is a reactive power planning problem designed on practical rules to make fine corrections to voltage deviations and reactive power violations using a specified number of shunt dynamic compensators named Static Var Compensators (SVCs). To validate its robustness, the proposed algorithm was tested on the IEEE 30-bus, 26-bus and IEEE 118-bus systems under different loading conditions and compared with global optimization methods (GA, EGA, FGA, PSO, MTS, MDE and ACO) and with two robust simulation packages: PSAT and MATPOWER. The results show that the proposed approach can converge to a near-optimal solution and obtain a competitive solution in critical situations within a reasonable time.
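    The non-smooth cost that defeats gradient-based OPF solvers is typically the valve-point-loading fuel cost, a quadratic plus a rectified sine. The toy real-coded GA below minimizes such a cost under a power-balance penalty; the coefficients and the three-generator setup are illustrative, not the paper's decomposed variant or its IEEE test data.

```python
import math
import random

random.seed(1)
PMIN, PMAX, DEMAND, NGEN = 10.0, 100.0, 150.0, 3

def cost(P):
    # Quadratic fuel cost plus a rectified-sine valve-point term (non-smooth),
    # with a heavy penalty enforcing the power balance sum(P) = DEMAND.
    a, b, c, e, f = 100.0, 2.0, 0.01, 50.0, 0.063
    fuel = sum(a + b*p + c*p*p + abs(e * math.sin(f * (PMIN - p))) for p in P)
    return fuel + 1000.0 * abs(sum(P) - DEMAND)

def ga(pop_size=40, gens=80):
    pop = [[random.uniform(PMIN, PMAX) for _ in range(NGEN)]
           for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(gens):
        nxt = [best[:]]                                  # elitism
        while len(nxt) < pop_size:
            p1 = min(random.sample(pop, 3), key=cost)    # tournament selection
            p2 = min(random.sample(pop, 3), key=cost)
            child = [(x + y) / 2 + random.gauss(0.0, 2.0)
                     for x, y in zip(p1, p2)]            # blend + mutation
            nxt.append([min(PMAX, max(PMIN, x)) for x in child])
        pop = nxt
        best = min(pop, key=cost)
    return best

sol = ga()
```

The paper's decomposition shortens the chromosome per partition; the sketch keeps a single full-length chromosome, which is exactly what becomes expensive as the system grows.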

  18. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject, and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparison with the traditional registration strategy and a population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy in comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  19. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.
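    The Delaunay candidate network at the heart of SimCCS-style routing is straightforward to sketch with SciPy; the site coordinates below are hypothetical, and a real run would attach topographic and social costs to each edge before routing pipelines over the graph.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical CO2 source/sink coordinates (km). The Delaunay triangulation
# supplies the sparse candidate-edge graph over which a SimCCS-style model
# would minimize pipeline construction costs.
sites = np.array([[0, 0], [4, 1], [2, 3], [5, 4], [1, 5], [6, 0]], dtype=float)
tri = Delaunay(sites)

edges = set()
for simplex in tri.simplices:                 # each simplex is a triangle
    for i in range(3):
        u, v = simplex[i], simplex[(i + 1) % 3]
        edges.add((min(u, v), max(u, v)))     # undirected, deduplicated

lengths = {e: float(np.linalg.norm(sites[e[0]] - sites[e[1]])) for e in edges}
```

The non-uniqueness issue the project addresses arises because near-degenerate point sets admit more than one valid triangulation; adding the extra candidate triangles keeps the cost minimization from depending on which one the library happens to return.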

  20. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    International Nuclear Information System (INIS)

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject, and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparison with the traditional registration strategy and a population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy in comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  1. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA.

    Science.gov (United States)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account. PMID:27421397

  2. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    Science.gov (United States)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
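    The finite-field extraction that the ELG-FF engine performs can be illustrated with central-difference stencils applied to a synthetic quartic energy E(F) whose mu, alpha, beta, and gamma are known in advance (all values invented); in the real method the energies come from ab initio elongation runs at each field strength.

```python
# Synthetic quartic energy with known response coefficients, so the
# central-difference finite-field stencils can be checked directly:
# E(F) = E0 - mu*F - alpha*F^2/2 - beta*F^3/6 - gamma*F^4/24.
E0, mu, alpha, beta, gamma = -10.0, 1.0, 2.0, 30.0, 500.0

def E(F):
    return E0 - mu*F - alpha*F**2/2 - beta*F**3/6 - gamma*F**4/24

h = 0.01
mu_ff    = -(E(h) - E(-h)) / (2*h)
alpha_ff = -(E(h) - 2*E(0.0) + E(-h)) / h**2
beta_ff  = -(E(2*h) - 2*E(h) + 2*E(-h) - E(-2*h)) / (2*h**3)
gamma_ff = -(E(2*h) - 4*E(h) + 6*E(0.0) - 4*E(-h) + E(-2*h)) / h**4
# The gamma stencil divides tiny energy differences by h**4, which is why
# small gamma values are dominated by numerical noise, as the abstract notes.
```

This also makes the abstract's "choose-maximum vs. choose-minimum" ambiguity for gamma concrete: when the true gamma is small, the h**-4 noise amplification can exceed the property difference being optimized.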

  3. Dynamic Coordinated Shifting Control of Automated Mechanical Transmissions without a Clutch in a Plug-In Hybrid Electric Vehicle

    OpenAIRE

    Xinlei Liu; Zhentong Liu; Liming Zhu; Hongwen He

    2012-01-01

    On the basis of the shifting process of automated mechanical transmissions (AMTs) for traditional hybrid electric vehicles (HEVs), and by combining the features of electric machines with fast response speed, the dynamic model of the hybrid electric AMT vehicle powertrain is built up, the dynamic characteristics of each phase of shifting process are analyzed, and a control strategy in which torque and speed of the engine and electric ma...

  4. Optimizing Electric Vehicle Coordination Over a Heterogeneous Mesh Network in a Scaled-Down Smart Grid Testbed

    DEFF Research Database (Denmark)

    Bhattarai, Bishnu Prasad; Lévesque, Martin; Maier, Martin;

    2015-01-01

    High penetration of renewable energy sources and electric vehicles (EVs) creates power imbalance and congestion in the existing power network, and hence causes significant problems in control and operation. Despite huge efforts invested by electric utilities, governments, and researchers...... by optimizing EV charging coordination realized through the synchronized exchange of monitoring and control packets via a heterogeneous Ethernet-based mesh network....

  5. A new module for constrained multi-fragment geometry optimization in internal coordinates implemented in the MOLCAS package.

    Science.gov (United States)

    Vysotskiy, Victor P; Boström, Jonas; Veryazov, Valera

    2013-11-15

    A parallel procedure for an effective optimization of relative position and orientation between two or more fragments has been implemented in the MOLCAS program package. By design, the procedure does not perturb the electronic structure of a system under the study. The original composite system is divided into frozen fragments and internal coordinates linking those fragments are the only optimized parameters. The procedure is capable to handle fully independent (no border atoms) fragments as well as fragments connected by covalent bonds. In the framework of the procedure, the optimization of relative position and orientation of the fragments are carried out in the internal "Z-matrix" coordinates using numerical derivatives. The total number of required single points energy evaluations scales with the number of fragments rather than with the total number of atoms in the system. The accuracy and the performance of the procedure have been studied by test calculations for a representative set of two- and three-fragment molecules with artificially distorted structures. The developed approach exhibits robust and smooth convergence to the reference optimal structures. As only a few internal coordinates are varied during the procedure, the proposed constrained fragment geometry optimization can be afforded even for high level ab initio methods like CCSD(T) and CASPT2. This capability has been demonstrated by applying the method to two larger cases, CCSD(T) and CASPT2 calculations on a positively charged benzene lithium complex and on the oxygen molecule interacting to iron porphyrin molecule, respectively.
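    A stripped-down analogue of the constrained fragment optimization (a 2D toy with a Lennard-Jones potential standing in for the ab initio energy, not the MOLCAS implementation): each fragment's internal geometry stays frozen, and only the inter-fragment shift and rotation are relaxed with a derivative-free optimizer, so the number of energy evaluations tracks the handful of inter-fragment coordinates rather than the atom count.

```python
import numpy as np
from scipy.optimize import minimize

# Fragment internals are frozen; only the inter-fragment coordinates
# (here 2D: shift dx, dy and rotation theta) are optimized.
fragA = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])   # frozen fragment A
fragB = np.array([[0.0, 0.0], [1.0, 0.0]])               # frozen fragment B

def place(params):
    dx, dy, theta = params
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return fragB @ R.T + np.array([dx, dy])              # rigid-body placement

def energy(params):
    # Pairwise Lennard-Jones (sigma = epsilon = 1) as a toy inter-fragment energy.
    e = 0.0
    for p in fragA:
        for q in place(params):
            r = np.linalg.norm(p - q)
            e += 4.0 * ((1.0 / r)**12 - (1.0 / r)**6)
    return e

x_start = np.array([3.0, 0.5, 0.3])                      # distorted start
res = minimize(energy, x_start, method="Nelder-Mead")    # numerical derivatives
```

Because only three parameters vary, each "energy call" could just as well be an expensive CCSD(T) or CASPT2 single point, which is the capability the module demonstrates.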

  7. Electric Vehicle Charging and Discharging Coordination on Distribution Network Using Multi-Objective Particle Swarm Optimization and Fuzzy Decision Making

    Directory of Open Access Journals (Sweden)

    Dongqi Liu

    2016-03-01

    Full Text Available This paper proposes an optimal strategy for the coordinated charging and discharging of electric vehicles (EVs) with a wind-thermal system. By aggregating a large number of EVs, the huge total battery capacity is sufficient to stabilize disturbances on the transmission grid. Hence, a dynamic environmental dispatch model is proposed that coordinates a cluster of controllable charging and discharging EV units with wind farms and thermal plants. A multi-objective particle swarm optimization (MOPSO) algorithm and a fuzzy decision maker are put forward for the simultaneous optimization of grid operating cost, CO2 emissions, wind curtailment, and EV users’ cost. Simulations are carried out on a 30-node system containing three traditional thermal plants, two carbon capture and storage (CCS) thermal plants, two wind farms, and six EV aggregations. Strategies under different EV charging/discharging prices are also compared. The results demonstrate the effectiveness of the proposed strategy.
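The fuzzy decision-making step that selects one compromise solution from a MOPSO Pareto front is commonly implemented with linear membership functions; the sketch below assumes that standard form (the paper's exact membership functions are not reproduced here):

```python
import numpy as np

def fuzzy_best_compromise(pareto_costs):
    """Pick a compromise solution from a Pareto front using the standard
    linear fuzzy membership function (all objectives minimized)."""
    costs = np.asarray(pareto_costs, dtype=float)   # shape: (solutions, objectives)
    f_min, f_max = costs.min(axis=0), costs.max(axis=0)
    # Membership is 1 at the per-objective best and 0 at the worst
    mu = (f_max - costs) / np.where(f_max > f_min, f_max - f_min, 1.0)
    score = mu.sum(axis=1) / mu.sum()               # normalized membership per solution
    return int(np.argmax(score))

# Illustrative two-objective front, e.g. [operating cost, CO2 emissions]
front = [[10.0, 5.0], [6.0, 6.0], [2.0, 12.0]]
best = fuzzy_best_compromise(front)   # picks the balanced middle solution
```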

  8. Computer-aided method for automated selection of optimal imaging plane for measurement of total cerebral blood flow by MRI

    Science.gov (United States)

    Teng, Pang-yu; Bagci, Ahmet Murat; Alperin, Noam

    2009-02-01

    A computer-aided method is presented for finding an optimal imaging plane for simultaneous measurement of the arterial blood inflow through the 4 vessels carrying blood to the brain by phase-contrast magnetic resonance imaging. The method's performance is compared with manual selection by two observers. First, the skeletons of the 4 vessels are extracted and their centerlines generated. Then, a global direction of the relatively less curved internal carotid arteries is calculated to determine the main flow direction. This serves as a reference direction for identifying segments of the vertebral arteries that strongly deviate from the main flow direction. These segments are then used to identify anatomical landmarks for improved consistency of the imaging plane selection. An optimal imaging plane is identified by finding the plane with the smallest error value, defined as the sum of the angles between the plane's normal and the vessel centerlines' directions at the locations of the intersections. Error values obtained using the automated and manual methods were compared using 9 magnetic resonance angiography (MRA) data sets. The automated method considerably outperformed manual selection: the mean error value with the automated method was significantly lower than with the manual method, 0.09+/-0.07 vs. 0.53+/-0.45, respectively (p<.0001, Student's t-test). Reproducibility of repeated measurements was analyzed using Bland and Altman's test; the mean 95% limits of agreement for the automated and manual methods were 0.01~0.02 and 0.43~0.55, respectively.
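The error criterion itself, the sum of angles between a candidate plane's normal and each vessel centerline direction at the intersections, is easy to express directly. The sketch below uses unsigned angles and toy direction vectors rather than MRA data:

```python
import numpy as np

def plane_error(normal, directions):
    """Error value as defined in the abstract: the sum of the angles
    (radians) between the imaging plane's normal and the vessel
    centerline directions at the plane intersections."""
    n = normal / np.linalg.norm(normal)
    total = 0.0
    for d in directions:
        d = d / np.linalg.norm(d)
        total += np.arccos(np.clip(abs(n @ d), 0.0, 1.0))
    return total

# Four near-parallel toy vessel directions (illustrative, not MRA data)
dirs = [np.array([0.05, 0.0, 1.0]), np.array([0.0, -0.04, 1.0]),
        np.array([0.02, 0.03, 1.0]), np.array([-0.03, 0.01, 1.0])]
aligned = plane_error(np.array([0.0, 0.0, 1.0]), dirs)   # well-aligned plane
tilted = plane_error(np.array([1.0, 0.0, 1.0]), dirs)    # 45-degree tilt
```

An optimizer (or exhaustive search over plane orientations) would then minimize this error value to find the imaging plane.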

  9. Optimization of RNA Purification and Analysis for Automated, Pre-Symptomatic Disease Diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, A; Nasarabadi, S; Milanovich, F

    2005-06-28

    When diagnosing disease, time is often a more formidable enemy than the pathogen itself. Current detection methods rely primarily on post-symptomatic protein production (i.e. antibodies), which does not occur at noticeable levels until several weeks after infection. As such, a major goal among researchers today is to expedite pre-symptomatic disease recognition and treatment. Since most pathogens are known to leave a unique signature on the genetic expression of the host, one potential diagnostic tool is host mRNA. In my experiments, I examined several methods of isolating RNA and reading its genetic sequence. I first used two types of reverse transcriptase polymerase chain reactions (using commercial RNA) and examined the resultant complementary DNA through gel electrophoresis. I then proceeded to isolate and purify whole RNA from actual human monocytes and THP-1 cells using several published methods, and examined gene expression on the RNA itself. I compared the two RT-PCR methods and concluded that two-step RT-PCR is superior to the one-step method. I also compared the various techniques of RNA isolation by examining the yield and purity of the resultant RNA. Finally, I studied the level of cellular IL-8 and IL-1 gene expression, two genes involved in the human immune response, which can serve as a baseline for future genetic comparison with LPS-exposed cells. Based on the results, I have determined which conditions and procedures are optimal for RNA isolation, RT-PCR, and RNA yield assessment. The overall goal of my research is to develop a flow-through system of RNA analysis, whereby blood samples can be collected and analyzed for disease prior to the onset of symptoms. The Pathomics group hopes to automate this process by removing the human labor factor, thereby decreasing the procedure's cost and increasing its availability to the general population. Eventually, our aim is to have an autonomous diagnostic system based on RNA analysis that would

  10. Optimal Attitude Estimation and Filtering Without Using Local Coordinates Part I: Uncontrolled and Deterministic Attitude Dynamics

    OpenAIRE

    Sanyal, Amit K.

    2005-01-01

    There are several attitude estimation algorithms in existence, all of which use local coordinate representations for the group of rigid body orientations. All local coordinate representations of the group of orientations have associated problems. While minimal coordinate representations exhibit kinematic singularities for large rotations, the quaternion representation requires satisfaction of an extra constraint. This paper treats the attitude estimation and filtering problem as an optimizati...

  11. Patient Dose Optimization in Fluoroscopically Guided Interventional Procedures. Final Report of a Coordinated Research Project

    International Nuclear Information System (INIS)

    In recent years, many surgical procedures have increasingly been replaced by interventional procedures that guide catheters into the arteries under X ray fluoroscopic guidance to perform a variety of operations such as ballooning, embolization, and implantation of stents. The radiation exposure to patients and staff in such procedures is much higher than in simple radiographic examinations such as X rays of the chest or abdomen, to the extent that radiation-induced skin injuries to patients and eye lens opacities among workers have been reported since the 1990s. Interventional procedures have grown both in frequency and importance during the last decade. This Coordinated Research Project (CRP) and TECDOC were developed within the International Atomic Energy Agency's (IAEA) framework of statutory responsibility to provide for the worldwide application of the standards for the protection of people against exposure to ionizing radiation. The CRP took place between 2003 and 2005 in six countries, with a view to optimizing the radiation protection of patients undergoing interventional procedures. The Fundamental Safety Principles and the International Basic Safety Standards for Protection against Ionizing Radiation (BSS) issued by the IAEA and co-sponsored by the Food and Agriculture Organization of the United Nations (FAO), the International Labour Organization (ILO), the World Health Organization (WHO), the Pan American Health Organization (PAHO) and the Nuclear Energy Agency (NEA), among others, require the radiation protection of patients undergoing medical exposures through justification of the procedures involved and through optimization. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients encourages the reduction of patient doses. To facilitate this, it has issued specific advice on the application of the BSS in the field of radiology in Safety Reports Series No. 39 and the three volumes on Radiation

  12. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases {sup 15}N–{sup 1}H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  13. Push-through direct injection NMR: an optimized automation method applied to metabolomics

    Science.gov (United States)

    There is a pressing need to increase the throughput of NMR analysis in fields such as metabolomics and drug discovery. Direct injection (DI) NMR automation is recognized to have the potential to meet this need due to its suitability for integration with the 96-well plate format. ...

  14. Vibrational quasi-degenerate perturbation theory with optimized coordinates: applications to ethylene and trans-1,3-butadiene.

    Science.gov (United States)

    Yagi, Kiyoshi; Otaki, Hiroki

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O-H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P space configuration (p) and the complementary Q space configuration (q) based on a difference in their quantum numbers (λpq = ∑s|ps - qs|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix from others that may remain unchanged based on the magnitude of harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields the coordinates similar in character to the conventional ones such that the final vibrational energy is affected very little while gaining an order of magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and coupled-cluster with singles

  15. Optimal Ordering Policy and Coordination Mechanism of a Supply Chain with Controllable Lead-Time-Dependent Demand Forecast

    Directory of Open Access Journals (Sweden)

    Hua-Ming Song

    2011-01-01

    Full Text Available This paper investigates the ordering decisions and coordination mechanism for a distributed short-life-cycle supply chain. The objective is to maximize the whole supply chain's expected profit while making the supply chain participants achieve a Pareto improvement. Lead time is treated as a controllable variable, so the demand forecast depends on it: the shorter the lead time, the better the forecast. Optimal decision-making models for lead time and order quantity are formulated and compared in the decentralized and centralized cases, and a three-parameter contract is proposed to coordinate the supply chain and alleviate the double margin in the decentralized scenario. Based on the analysis of the models, an algorithmic procedure is developed to find the optimal ordering decisions. Finally, a numerical example is presented to illustrate the results.
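As a hedged illustration of lead-time-dependent forecasting, the sketch below combines a newsvendor-style critical fractile with a forecast standard deviation that grows with lead time. The linear sigma model and all parameter names are assumptions for illustration, not the paper's formulation:

```python
from statistics import NormalDist

def optimal_order(mu, sigma0, lead_time, alpha, price, cost, salvage):
    """Newsvendor-style order quantity under a lead-time-dependent
    demand forecast (illustrative model: sigma grows linearly with
    lead time, i.e. shorter lead time -> better forecast)."""
    sigma = sigma0 * (1 + alpha * lead_time)
    fractile = (price - cost) / (price - salvage)   # critical fractile
    return NormalDist(mu, sigma).inv_cdf(fractile)

q_short = optimal_order(100, 10, lead_time=1, alpha=0.5, price=12, cost=6, salvage=2)
q_long = optimal_order(100, 10, lead_time=4, alpha=0.5, price=12, cost=6, salvage=2)
# With a fractile above 0.5, the noisier long-lead-time forecast forces
# a larger safety stock, so q_long exceeds q_short.
```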

  16. GENPLAT: an automated platform for biomass enzyme discovery and cocktail optimization.

    Science.gov (United States)

    Walton, Jonathan; Banerjee, Goutami; Car, Suzana

    2011-01-01

    The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high-throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex-lattice (fractional factorial) mixture models are designed using commercial design-of-experiments statistical software. Enzyme mixtures of high complexity are constructed by robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations. GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such
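The simplex-lattice mixture designs that GENPLAT relies on can be enumerated directly; a minimal sketch of a {n, m} lattice over enzyme proportions, independent of any particular DOE software:

```python
from itertools import product

def simplex_lattice(n_components, m):
    """Enumerate a {n, m} simplex-lattice design: every mixture whose
    component proportions are multiples of 1/m and sum to 1, the kind
    of design used to plan enzyme-cocktail experiments."""
    return [tuple(c / m for c in combo)
            for combo in product(range(m + 1), repeat=n_components)
            if sum(combo) == m]

design = simplex_lattice(3, 2)   # 3 enzymes, proportions in steps of 1/2
# A {3,2} lattice has C(3 + 2 - 1, 2) = 6 points, e.g. (1, 0, 0), (0.5, 0.5, 0), ...
```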

  17. OPTIMIZATION OF THE PREPARATION PROCESS IN MINIBASCHETBALL BY OPERATIONAL STRUCTURES OF COORDINATIVE TYPE

    Directory of Open Access Journals (Sweden)

    CĂTĂNESCU A.

    2015-12-01

    Full Text Available Object of research: to improve the training process of mini-basketball players by introducing structures of a coordinative type focused on ball control when passing, reaction time, single-leg and two-leg coordination, balance, court orientation, and ambidexterity. Hypothesis: if, in the learning stage of basketball, we use operational structures of a coordinative type that involve most components of coordinative capacity (single-leg and two-leg coordination, balance, ball control, court orientation, ambidexterity), then the speed of assimilation of the technical procedures will increase significantly. Material and methods: study of the specialty literature, direct and indirect observation, testing methods, mathematical and tabular statistics. Results: comparing the results obtained at the final evaluation by the control group and the experimental group revealed statistically significant differences (p<0.001) between the two groups for all the evaluation tests administered. These results allowed us to reject the null hypothesis and to state that the preparation programs we used improved the performance of the subjects in the experimental group. Conclusions: following the proposed experiment and the administration of the control tests, the subjects of the experimental group obtained better results in all tests, the differences between evaluations being statistically confirmed. The proposed evaluation tests are also characterized by a progressive increase in the number of elements tested and in the interaction between the components of coordinative capacity and the technical-tactical elements specific to basketball.

  18. Optimal Coordinated Management of a Plug-In Electric Vehicle Charging Station under a Flexible Penalty Contract for Voltage Security

    Directory of Open Access Journals (Sweden)

    Jip Kim

    2016-07-01

    Full Text Available The increasing penetration of plug-in electric vehicles (PEVs may cause a low-voltage problem in the distribution network. In particular, the introduction of charging stations where multiple PEVs are simultaneously charged at the same bus can aggravate the low-voltage problem. Unlike a distribution network operator (DNO who has the overall responsibility for stable and reliable network operation, a charging station operator (CSO may schedule PEV charging without consideration for the resulting severe voltage drop. Therefore, there is a need for the DNO to impose a coordination measure to induce the CSO to adjust its charging schedule to help mitigate the voltage problem. Although the current time-of-use (TOU tariff is an indirect coordination measure that can motivate the CSO to shift its charging demand to off-peak time by imposing a high rate at the peak time, it is limited by its rigidity in that the network voltage condition cannot be flexibly reflected in the tariff. Therefore, a flexible penalty contract (FPC for voltage security to be used as a direct coordination measure is proposed. In addition, the optimal coordinated management is formulated. Using the Pacific Gas and Electric Company (PG&E 69-bus test distribution network, the effectiveness of the coordination was verified by comparison with the current TOU tariff.

  19. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits. PMID:26227212
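Of the optimization protocols mentioned, simulated annealing is the easiest to sketch generically: accept worse moves with probability exp(-delta/T) while the temperature decays. The loop below is a standard textbook version with an illustrative one-dimensional objective, not the authors' implementation:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=0):
    """Generic simulated-annealing loop of the kind used for automated
    protocol optimization: downhill moves are always accepted, uphill
    moves with probability exp(-delta/T) as T decays geometrically."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = cost(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy objective with a known minimum at x = 3
best_x, best_f = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```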


  1. Automated optimal glycaemic control using a physiology based pharmacokinetic, pharmacodynamic model

    OpenAIRE

    Schaller, Stephan

    2015-01-01

    After decades of research, Automated Glucose Control (AGC) is still out of reach for everyday control of blood glucose. The inter- and intra-individual variability of glucose dynamics largely arising from variability in insulin absorption, distribution, and action, and related physiological lag-times remain a core problem in the development of suitable control algorithms. Over the years, model predictive control (MPC) has established itself as the gold standard in AGC systems in research. Mod...

  2. Civil Engineering and Building Service Topographic Permanent Landmarks Network. Spatial Coordinate Optimization

    Directory of Open Access Journals (Sweden)

    Lepadatu Daniel

    2016-06-01

    Full Text Available Sustainable development is a modern concept of adapting conditions to achieve objectives that respond simultaneously to at least three major requirements: economic, social, and environmental. Sustainable development cannot be achieved without a change in people's mentality and without communities able to use resources rationally and efficiently. For the efficient delivery of practical applications in the surveying and topography discipline, the students designed and created a network of local permanent topographic landmarks required for reporting the rectangular coordinates of applications. In order to obtain more accurate values of these coordinates, we made several types of measurements that are presented in detail in this work.

  3. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Vasanthan Maruthapillai

    Full Text Available In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance from each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.

  4. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Science.gov (United States)

    Maruthapillai, Vasanthan; Murugappan, Murugappan

    2016-01-01

    In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance between each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network. PMID:26859884
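The feature-extraction and classification pipeline described, three statistics per marker-distance series fed to a K-nearest-neighbor classifier, can be sketched as follows; the synthetic two-class data are purely illustrative:

```python
import numpy as np

def statistical_features(marker_distances):
    """Mean, variance and root mean square of one marker-distance
    series: the three statistical features extracted per marker."""
    d = np.asarray(marker_distances, dtype=float)
    return np.array([d.mean(), d.var(), np.sqrt((d ** 2).mean())])

def knn_predict(train_X, train_y, x, k=3):
    """Minimal K-nearest-neighbor classifier, one of the two simple
    non-linear classifiers used in the paper."""
    dists = np.linalg.norm(np.asarray(train_X) - x, axis=1)
    votes = [train_y[i] for i in np.argsort(dists)[:k]]
    return max(set(votes), key=votes.count)

# Synthetic stand-in data: two "emotions" with different marker distances
rng = np.random.default_rng(0)
X = ([statistical_features(1.0 + 0.1 * rng.standard_normal(30)) for _ in range(5)]
     + [statistical_features(3.0 + 0.1 * rng.standard_normal(30)) for _ in range(5)])
y = ["happy"] * 5 + ["surprise"] * 5
pred = knn_predict(X, y, statistical_features(3.0 + 0.1 * rng.standard_normal(30)))
```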

  5. Digital Piracy: An Assessment of Consumer Piracy Risk and Optimal Supply Chain Coordination Strategies

    Science.gov (United States)

    Jeong, Bong-Keun

    2010-01-01

    Digital piracy and the emergence of new distribution channels have changed the dynamics of supply chain coordination and created many interesting problems. There has been increased attention to understanding the phenomenon of consumer piracy behavior and its impact on supply chain profitability. The purpose of this dissertation is to better…

  6. Research on ISFLA-Based Optimal Control Strategy for the Coordinated Charging of EV Battery Swap Station

    Directory of Open Access Journals (Sweden)

    Xueliang Huang

    2013-01-01

    Full Text Available As an important component of the smart grid, electric vehicles (EVs) could be a good measure against energy shortages and environmental pollution. One main way of supplying energy to EVs is to swap batteries at a swap station. Based on the characteristics of an EV battery swap station, a coordinated-charging optimal control strategy is investigated to smooth load fluctuation. The shuffled frog leaping algorithm (SFLA) is an optimization method inspired by the memetic evolution of a group of frogs seeking food. An improved shuffled frog leaping algorithm (ISFLA), with a reflecting method to handle boundary constraints, is proposed to solve the optimal control strategy for coordinated charging. Based on the daily load of a certain area, numerical simulations, including a comparison of PSO and ISFLA, are carried out; the results show that the presented ISFLA can effectively lower the peak-valley difference and smooth the load profile with a faster convergence rate and higher convergence precision.
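The reflecting method for boundary constraints, the paper's modification to SFLA, mirrors an out-of-bounds coordinate back across the violated bound. A minimal sketch, with an illustrative frog-leap step (the overshoot factor is an assumption, not the paper's tuning):

```python
import random

def reflect(x, lo, hi):
    """Reflecting method for boundary constraints used in ISFLA: a
    coordinate that leaves [lo, hi] is mirrored back across the
    violated bound until it lies inside the feasible interval."""
    while x < lo or x > hi:
        if x < lo:
            x = lo + (lo - x)
        else:
            x = hi - (x - hi)
    return x

def leap(worst, best, lo, hi, rng):
    """One illustrative frog-leap step: the worst frog jumps toward the
    best frog; reflection keeps the move feasible. The 2.0 overshoot
    factor is a hypothetical choice to make boundary violations occur."""
    new = worst + rng.random() * (best - worst) * 2.0
    return reflect(new, lo, hi)

# A jump past the upper bound at 10 is mirrored back inside
mirrored = reflect(12.0, 0.0, 10.0)
```

Compared with simple clipping, reflection keeps candidate solutions in the interior of the feasible region instead of piling them up on the bounds.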

  7. Automated Identification of the Heart Wall Throughout the Entire Cardiac Cycle Using Optimal Cardiac Phase for Extracted Features

    Science.gov (United States)

    Takahashi, Hiroki; Hasegawa, Hideyuki; Kanai, Hiroshi

    2011-07-01

    In most methods for evaluation of cardiac function based on echocardiography, the heart wall is currently identified manually by an operator. However, this task is very time-consuming and suffers from inter- and intraobserver variability. The present paper proposes a method that uses multiple features of ultrasonic echo signals for automated identification of the heart wall region throughout an entire cardiac cycle. In addition, the optimal cardiac phase to select a frame of interest, i.e., the frame for the initiation of tracking, was determined. The heart wall region at the frame of interest in this cardiac phase was identified by the expectation-maximization (EM) algorithm, and heart wall regions in the following frames were identified by tracking each point classified in the initial frame as the heart wall region using the phased tracking method. The results for two subjects indicate the feasibility of the proposed method in the longitudinal axis view of the heart.
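The EM step used to classify the initial frame can be illustrated with a two-component one-dimensional Gaussian mixture over echo amplitudes; the paper works with richer multi-feature data, so this is only a sketch of the algorithm:

```python
import numpy as np

def em_two_gaussians(x, iters=50):
    """Two-component 1-D Gaussian-mixture EM, sketching how echo
    amplitudes could be split into heart-wall vs. lumen classes in
    the frame of interest."""
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])              # crude initialization
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each component for each sample
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = pi * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and spreads
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
        pi = nk / len(x)
    return mu, sigma, pi, r

# Synthetic amplitudes: a low-amplitude (lumen-like) and a
# high-amplitude (wall-like) population
rng = np.random.default_rng(1)
amps = np.concatenate([rng.normal(0.2, 0.05, 200), rng.normal(0.8, 0.05, 200)])
mu, sigma, pi, resp = em_two_gaussians(amps)
```

Each sample's responsibility row in `resp` then labels it as wall or lumen; in the paper, the labeled wall points seed the phased-tracking step in subsequent frames.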

  8. Statistical Learning in Automated Troubleshooting: Application to LTE Interference Mitigation

    CERN Document Server

    Tiwana, Moazzam Islam; Altman, Zwi

    2010-01-01

    This paper presents a method for automated healing as part of off-line automated troubleshooting. The method combines statistical learning with constraint optimization. The automated healing aims at locally optimizing radio resource management (RRM) or system parameters of cells with poor performance in an iterative manner. The statistical learning processes the data using Logistic Regression (LR) to extract closed form (functional) relations between Key Performance Indicators (KPIs) and Radio Resource Management (RRM) parameters. These functional relations are then processed by an optimization engine which proposes new parameter values. The advantage of the proposed formulation is the small number of iterations required by the automated healing method to converge, making it suitable for off-line implementation. The proposed method is applied to heal an Inter-Cell Interference Coordination (ICIC) process in a 3G Long Term Evolution (LTE) network which is based on soft-frequency reuse scheme. Numerical simulat...
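The two-stage idea, statistical learning of a KPI-parameter relation followed by an optimization engine that proposes new parameter values, can be sketched as follows. This is a hedged toy example, not the paper's LTE setup: the one-dimensional RRM parameter, the training data, and the grid search are invented for illustration.

```python
import math

def fit_logistic(xs, ys, lr=0.5, n_iter=2000):
    """1-D logistic regression via gradient descent: models P(poor KPI | x)."""
    w, b = 0.0, 0.0
    for _ in range(n_iter):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / len(xs)
        b -= lr * gb / len(xs)
    return w, b

# Hypothetical history: the KPI degrades (y = 1) at high values of some
# normalized interference-coordination parameter in [0, 1].
xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(xs, ys)

# "Optimization engine": propose the grid value minimizing the predicted
# probability of poor performance under the learned closed-form relation.
grid = [i / 20 for i in range(21)]
proposed = min(grid, key=lambda x: 1.0 / (1.0 + math.exp(-(w * x + b))))
```

Because the learned relation is in closed form, each healing iteration only needs one cheap optimization over it, which is what keeps the iteration count low in the off-line setting.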

  9. GPRS and Bluetooth Based Devices/Mobile Connectivity Shifting From Manual To Automation For Performance Optimization

    Directory of Open Access Journals (Sweden)

    Nazia Bibi

    2011-09-01

    Full Text Available Many companies and organizations are trying to move towards automation and provide their workers with internet access on their mobile phones so that routine tasks can be carried out while saving time and resources. The proposed system, based on GPRS technology, aims to solve the problems faced in carrying out routine tasks while mobile. The system is designed to help workers and field staff get updates on their mobile phones regarding the tasks at hand. It is beneficial in the sense that it saves resources in terms of time and human effort and cuts down on paperwork. The proposed system has been developed in view of a research study conducted in the software development and telecom industry, and provides a high-end solution to customers and field workers who use GPRS technology for transaction updates of databases.

  10. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    Science.gov (United States)

    Borot de Battisti, M.; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-10-01

    Focal high-dose-rate (HDR) brachytherapy for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single divergent-needle implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for the treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm3 to 23.3 cm3) using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions.

  11. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    OpenAIRE

    Chen, Zhi-Song; Wang, Hui-Min

    2012-01-01

    The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project, considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordin...

  12. IMPORTANCE OF THE COORDINATIVE ABILITIES DEVELOPMENT IN OPTIMIZING THE SELECTION PROCESS FOR THE ELITE ATHLETES

    Directory of Open Access Journals (Sweden)

    Juravle I.

    2013-02-01

    Full Text Available The study related in this paper reflects the opinions of various experts regarding the importance of developing the coordinative abilities level to improve the selection system for elite athletes. These coordinative abilities can be viewed as the ability of a person to perform actions with a high degree of difficulty, adjusting the movements in time and space and taking into account new situations that occur. The main research method used in this paper is based on studies of the literature in this area of interest, i.e., articles and publications, manuals, tutorials, etc. Initially, the study began from the hypothesis that significant improvements can be observed in the selection process of young athletes when the development of the coordinative abilities is taken into account. Analyzing the related work in this field, the selection process is the decisive factor in creating the assumptions for achieving high performance in sport. These studies also provide criteria, samples and standards, features and models for the initial and primary selection process, as well as for the selection of Olympic or national athlete groups. The conclusion of this study shows that one of the most important criteria for the athlete selection process is the level of development of coordination abilities. The research included in this paper also argues the importance of developing an athlete's coordination abilities for the selection process in different types of team sport games.

  13. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Science.gov (United States)

    Churchill, Nathan W; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.
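One common way to turn a prediction metric and a reproducibility metric into a single pipeline ranking is to pick the pipeline closest to the ideal point (prediction, reproducibility) = (1, 1). The sketch below assumes hypothetical preprocessing steps and a stand-in scoring function; a real implementation would compute (P, R) by resampling the fMRI data for each candidate pipeline.

```python
from itertools import product

# Hypothetical preprocessing options; a pipeline is one choice per step.
STEPS = {
    "motion_correction": [False, True],
    "smoothing_mm": [0, 4, 8],
    "detrend_order": [0, 1, 2],
}

def score(pipeline):
    """Stand-in for the data-driven (prediction P, reproducibility R)
    metrics; invented numbers, used only to make the sketch runnable."""
    P = 0.6 + 0.2 * pipeline["motion_correction"] \
        + 0.01 * pipeline["detrend_order"]
    R = 0.5 + 0.03 * pipeline["smoothing_mm"] \
        - 0.02 * pipeline["detrend_order"]
    return P, R

def best_pipeline():
    """Enumerate all pipelines, keep the one closest to (P, R) = (1, 1)."""
    names = list(STEPS)
    candidates = [dict(zip(names, combo))
                  for combo in product(*STEPS.values())]
    def dist(pl):
        P, R = score(pl)
        return ((1 - P) ** 2 + (1 - R) ** 2) ** 0.5
    return min(candidates, key=dist)

best = best_pipeline()
```

Exhaustive enumeration is feasible only for small option sets; the adaptive framework in the paper additionally handles the resampling needed to estimate P and R from limited data.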

  14. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    Full Text Available BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  15. Economic Load Dispatch - A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, KY

    2009-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. Moreover, the population size is increased adaptively, the number of search intervals for the particles is selected adaptively, and the particles search the decision space with accuracy of up to two decimal digits, resulting in improved convergence of the process. The ICA-PSO algorithm is tested on a number of power systems, including systems with 6, 13, 15, and 40 generating units, and the island power system of Crete in Greece...
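The core ICA-PSO idea, that each particle is attracted only by particles with better achievements while the best particle moves randomly, can be sketched on a toy three-unit dispatch problem. The cost coefficients, penalty weight, and update rules below are illustrative assumptions, not the paper's tuned algorithm.

```python
import random

# Hypothetical quadratic fuel-cost coefficients (a + b*P + c*P^2) and
# generation limits (Pmin, Pmax) for a 3-unit system; demand in MW.
UNITS = [(100, 2.0, 0.010, 50, 200),
         (120, 1.8, 0.012, 40, 180),
         (80,  2.2, 0.008, 30, 150)]
DEMAND = 300.0

def cost(P):
    fuel = sum(a + b * p + c * p * p
               for (a, b, c, _, _), p in zip(UNITS, P))
    return fuel + 1e3 * abs(sum(P) - DEMAND)  # penalty for power imbalance

def ica_pso(n=25, iters=300, seed=1):
    """Toy coordinated-aggregation PSO: each particle moves toward the
    personal best of a randomly chosen better achiever; the current best
    particle takes a small random step instead."""
    rng = random.Random(seed)
    lows = [u[3] for u in UNITS]
    highs = [u[4] for u in UNITS]
    X = [[rng.uniform(l, h) for l, h in zip(lows, highs)] for _ in range(n)]
    pbest = [x[:] for x in X]
    for _ in range(iters):
        order = sorted(range(n), key=lambda i: cost(pbest[i]))
        for rank, i in enumerate(order):
            if rank == 0:                       # leader moves randomly
                step = [rng.uniform(-1, 1) for _ in UNITS]
            else:                               # follow a better achiever
                j = order[rng.randrange(rank)]
                step = [rng.random() * (pb - x)
                        for pb, x in zip(pbest[j], X[i])]
            X[i] = [min(max(x + s, l), h)
                    for x, s, l, h in zip(X[i], step, lows, highs)]
            if cost(X[i]) < cost(pbest[i]):
                pbest[i] = X[i][:]
    return min(pbest, key=cost)

sol = ica_pso()
```

A production ELD solver would also model valve-point effects, prohibited operating zones, and network losses; the penalty formulation above only enforces the power balance.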

  16. Fully automated molecular biology routines on a plasmid-based functional proteomic workcell: Evaluation and Characterization of Yeast Strains Optimized for Growth on Xylose Expressing "Stealth" Insecticidal Peptides.

    Science.gov (United States)

    Optimization of genes important to production of fuel ethanol from hemicellulosic biomass for use in developing improved commercial yeast strains is necessary to meet the rapidly expanding need for ethanol. The United States Department of Agriculture has developed a fully automated platform for mol...

  17. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dengwang; Wang, Jie [College of Physics and Electronics, Shandong Normal University, Jinan, Shandong (China); Kapp, Daniel S.; Xing, Lei [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems posed by fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for the shape dictionary and parameter estimation. The remaining 50 datasets were used as test images. In our study liver segmentation was formulated as an optimization process over an implicit function. The liver region was optimized via local and global optimization during iterations. Our method consists of five steps: 1) The livers from the panel data were segmented manually by physicians, and we estimated the parameters of a GMM (Gaussian mixture model) and MRF (Markov random field); a shape dictionary was built by utilizing the 3D liver shapes. 2) The outlines of the chest and abdomen were located according to rib structure in the input images, and the liver region was initialized based on the GMM. 3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge, for local optimization. 4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary, for global optimization; furthermore, H-PSO (hybrid particle swarm optimization) was employed to solve the SSR equation. 5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iteration was repeated within the local and global optimization until the stopping conditions (maximum iterations and changing rate) were satisfied. Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy on the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  18. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    International Nuclear Information System (INIS)

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems posed by fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for the shape dictionary and parameter estimation. The remaining 50 datasets were used as test images. In our study liver segmentation was formulated as an optimization process over an implicit function. The liver region was optimized via local and global optimization during iterations. Our method consists of five steps: 1) The livers from the panel data were segmented manually by physicians, and we estimated the parameters of a GMM (Gaussian mixture model) and MRF (Markov random field); a shape dictionary was built by utilizing the 3D liver shapes. 2) The outlines of the chest and abdomen were located according to rib structure in the input images, and the liver region was initialized based on the GMM. 3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge, for local optimization. 4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary, for global optimization; furthermore, H-PSO (hybrid particle swarm optimization) was employed to solve the SSR equation. 5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iteration was repeated within the local and global optimization until the stopping conditions (maximum iterations and changing rate) were satisfied. Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy on the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  19. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons. PMID:27106692
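The use of Latin hypercube sampling to balance error-function weights can be sketched as follows: sample the parameter space with one point per stratum in each dimension, measure each error function's typical magnitude over the samples, and weight by the inverse of that magnitude so every function influences the search to a similar degree. The error functions and parameter bounds below are hypothetical stand-ins for electrophysiological error measures.

```python
import random
import statistics

def latin_hypercube(n_samples, bounds, seed=0):
    """LHS: one stratified draw per interval slice in each dimension,
    then shuffle so strata pair up randomly across dimensions."""
    rng = random.Random(seed)
    dims = []
    for lo, hi in bounds:
        cells = [lo + (hi - lo) * (i + rng.random()) / n_samples
                 for i in range(n_samples)]
        rng.shuffle(cells)
        dims.append(cells)
    return [list(point) for point in zip(*dims)]

# Hypothetical error functions with very different natural scales.
errors = [
    lambda p: (p[0] - 1.0) ** 2,           # roughly O(1)
    lambda p: 1e3 * abs(p[1]),             # roughly O(1000)
    lambda p: 0.01 * (p[0] + p[1]) ** 2,   # roughly O(0.01)
]

samples = latin_hypercube(50, bounds=[(-2.0, 2.0), (-2.0, 2.0)])

# Weight each error function by the inverse of its median magnitude over
# the samples, so each one pulls on the search with comparable strength.
weights = []
for err in errors:
    spread = statistics.median(abs(err(s)) for s in samples)
    weights.append(1.0 / spread)
```

After this normalization, the weighted sum of errors (or each weighted objective in the multi-objective case) no longer lets one large-scale error term dominate the evolutionary search.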

  20. Fully automated tracking of cardiac structures using radiopaque markers and high-frequency videofluoroscopy in an in vivo ovine model: from three-dimensional marker coordinates to quantitative analyses

    OpenAIRE

    Bothe, Wolfgang; Schubert, Harald; Diab, Mahmoud; Faerber, Gloria; Bettag, Christoph; Jiang, Xiaoyan; Fischer, Martin S; Denzler, Joachim; Doenst, Torsten

    2016-01-01

    Purpose: Recently, algorithms were developed to track radiopaque markers in the heart fully automatically. However, the methodology did not allow the exact anatomical location to be assigned to each marker. In this case study we describe the steps from the generation of three-dimensional marker coordinates to quantitative data analyses in an in vivo ovine model. Methods: In one adult sheep, twenty silver balls were sutured to the right side of the heart: 10 to the tricuspid annulus, one to the anterior ...

  1. An Optimized Clustering Approach for Automated Detection of White Matter Lesions in MRI Brain Images

    Directory of Open Access Journals (Sweden)

    M. Anitha

    2012-04-01

    Full Text Available White Matter Lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. The WML detection process includes the following stages: 1. image preprocessing, 2. clustering (Fuzzy c-means (FCM) clustering, Geostatistical Possibilistic Clustering (GPC) and Geostatistical Fuzzy Clustering (GFCM)) and 3. optimization using Particle Swarm Optimization (PSO). The proposed system is tested on a database of 208 MRI images. GFCM yields a high sensitivity of 89%, specificity of 94% and overall accuracy of 93% over FCM and GPC. The clustered brain images are then subjected to Particle Swarm Optimization (PSO). The optimized result obtained from GFCM-PSO provides sensitivity of 90%, specificity of 94% and accuracy of 95%. The detection results reveal that GFCM and GFCM-PSO better localize the large regions of lesions and give a lower false positive rate compared to GPC and GPC-PSO, which capture the largest loads of WMLs only in the upper ventral horns of the brain.
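The fuzzy c-means step at the heart of the clustering stage can be illustrated with a minimal 1-D sketch. The voxel intensities and cluster count are invented, and the geostatistical variants add spatial terms not shown here.

```python
def fcm(data, c=2, m=2.0, n_iter=50):
    """Fuzzy c-means on scalar intensities: alternate membership and
    centroid updates, starting the centers at the data extremes."""
    centers = [min(data), max(data)] if c == 2 else data[:c]
    U = [[0.0] * c for _ in data]
    for _ in range(n_iter):
        for i, x in enumerate(data):          # membership update
            d = [abs(x - v) or 1e-12 for v in centers]
            for j in range(c):
                U[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
        for j in range(c):                    # fuzzy centroid update
            num = sum((U[i][j] ** m) * x for i, x in enumerate(data))
            den = sum(U[i][j] ** m for i in range(len(data)))
            centers[j] = num / den
    return centers, U

# Hypothetical voxel intensities: dark normal tissue vs. bright lesion.
data = [0.10, 0.15, 0.20, 0.12, 0.18, 0.80, 0.85, 0.90, 0.82]
centers, U = fcm(data)
```

Each voxel ends up with a membership grade per cluster rather than a hard label, which is what makes the subsequent PSO refinement of cluster assignments possible.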

  2. SWANS: A Prototypic SCALE Criticality Sequence for Automated Optimization Using the SWAN Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Greenspan, E.

    2001-01-11

    SWANS is a new prototypic analysis sequence that provides an intelligent, semi-automatic search for the maximum k{sub eff} of a given amount of specified fissile material, or of the minimum critical mass. It combines the optimization strategy of the SWAN code with the composition-dependent resonance self-shielded cross sections of the SCALE package. For a given system composition arrived at during the iterative optimization process, the value of k{sub eff} is as accurate and reliable as obtained using the CSAS1X Sequence of SCALE-4.4. This report describes how SWAN is integrated within the SCALE system to form the new prototypic optimization sequence, describes the optimization procedure, provides a user guide for SWANS, and illustrates its application to five different types of problems. In addition, the report illustrates that resonance self-shielding might have a significant effect on the maximum k{sub eff} value a given fissile material mass can have.

  3. Automated Response Surface Methodology for Stochastic Optimization Models with Unknown Variance

    NARCIS (Netherlands)

    R.P. Nicolai (Robin); R. Dekker (Rommert)

    2005-01-01

    Response Surface Methodology (RSM) is a tool that was introduced in the early 1950s by Box and Wilson (1951). It is a collection of mathematical and statistical techniques useful for the approximation and optimization of stochastic models. Applications of RSM can be found in e.g. chemical
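The core RSM loop, fitting a low-order polynomial to noisy responses and then moving toward its stationary point, can be sketched in one factor. The quadratic response surface and noise level below are illustrative assumptions.

```python
import random

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via normal equations."""
    # X^T X and X^T y for the design matrix with columns [1, x, x^2].
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting (3x3 system).
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for k in range(col, 3):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        coef[r] = (b[r] - sum(A[r][k] * coef[k]
                              for k in range(r + 1, 3))) / A[r][r]
    return coef

# Hypothetical noisy response with a true optimum at x = 2.
rng = random.Random(0)
xs = [i * 0.5 for i in range(9)]              # design points 0..4
ys = [5.0 - (x - 2.0) ** 2 + rng.gauss(0, 0.05) for x in xs]
b0, b1, b2 = fit_quadratic(xs, ys)
x_opt = -b1 / (2 * b2)                        # stationary point of the fit
```

In a full RSM study one would first screen with a linear model and steepest ascent, then switch to a second-order design like this one near the optimum.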

  4. Automated tracing of open-field coronal structures for an optimized large-scale magnetic field reconstruction

    Science.gov (United States)

    Uritsky, V. M.; Davila, J. M.; Jones, S. I.

    2014-12-01

    Solar Probe Plus and Solar Orbiter will provide detailed measurements in the inner heliosphere magnetically connected with the topologically complex and eruptive solar corona. Interpretation of these measurements will require accurate reconstruction of the large-scale coronal magnetic field. In a related presentation by S. Jones et al., we argue that such reconstruction can be performed using photospheric extrapolation methods constrained by white-light coronagraph images. Here, we present the image-processing component of this project dealing with an automated segmentation of fan-like coronal loop structures. In contrast to the existing segmentation codes designed for detecting small-scale closed loops in the vicinity of active regions, we focus on the large-scale geometry of the open-field coronal features observed at significant radial distances from the solar surface. The coronagraph images used for the loop segmentation are transformed into a polar coordinate system and undergo radial detrending and initial noise reduction. The preprocessed images are subject to an adaptive second order differentiation combining radial and azimuthal directions. An adjustable thresholding technique is applied to identify candidate coronagraph features associated with the large-scale coronal field. A blob detection algorithm is used to extract valid features and discard noisy data pixels. The obtained features are interpolated using higher-order polynomials which are used to derive empirical directional constraints for magnetic field extrapolation procedures based on photospheric magnetograms.
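The first stages of the preprocessing chain (polar resampling, radial detrending, thresholding) can be sketched on a synthetic frame. Everything below, the image size, the radial falloff, the single bright "ray", and the 3-sigma threshold, is an invented stand-in for real coronagraph data; the paper's adaptive second-order differentiation and blob detection are omitted.

```python
import math

W = H = 64
cx = cy = (W - 1) / 2.0

# Synthetic frame: radial brightness falloff plus one bright "ray"
# (a stand-in for an open-field structure) centered at 45 degrees.
def pixel(x, y):
    r = math.hypot(x - cx, y - cy) + 1.0
    theta = math.atan2(y - cy, x - cx)
    ray = 0.5 if abs(theta - math.pi / 4) < 0.1 else 0.0
    return (1.0 + ray) / r

img = [[pixel(x, y) for x in range(W)] for y in range(H)]

# 1) Resample onto a polar (r, theta) grid.
NR, NT = 20, 90
polar = [[0.0] * NT for _ in range(NR)]
for ri in range(NR):
    for ti in range(NT):
        r = 6.0 + 0.9 * ri
        t = 2.0 * math.pi * ti / NT
        x = int(round(cx + r * math.cos(t)))
        y = int(round(cy + r * math.sin(t)))
        polar[ri][ti] = img[y][x]

# 2) Radial detrending: subtract each ring's mean brightness so the
#    large-scale falloff disappears and the ray stands out.
for ri in range(NR):
    ring_mean = sum(polar[ri]) / NT
    polar[ri] = [v - ring_mean for v in polar[ri]]

# 3) Adjustable threshold: flag polar pixels well above the residual level.
flat = [v for row in polar for v in row]
rms = (sum(v * v for v in flat) / len(flat)) ** 0.5
hits = [(ri, ti) for ri in range(NR) for ti in range(NT)
        if polar[ri][ti] > 3.0 * rms]
```

The flagged (r, theta) pixels cluster along the ray's azimuth; in the full method they would then be grouped by blob detection and fitted with polynomials to yield directional constraints for the field extrapolation.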

  5. Deployable reflector antenna performance optimization using automated surface correction and array-feed compensation

    Science.gov (United States)

    Schroeder, Lyle C.; Bailey, M. C.; Mitchell, John L.

    1992-01-01

    Methods for increasing the electromagnetic (EM) performance of reflectors with rough surfaces were tested and evaluated. First, one quadrant of the 15-meter hoop-column antenna was retrofitted with computer-driven and controlled motors to allow automated adjustment of the reflector surface. The surface errors, measured with metric photogrammetry, were used in a previously verified computer code to calculate control motor adjustments. With this system, a rough antenna surface (rms of approximately 0.180 inch) was corrected in two iterations to approximately the structural surface smoothness limit of 0.060 inch rms. The antenna pattern and gain improved significantly as a result of these surface adjustments. The EM performance was evaluated with a computer program for distorted reflector antennas which had been previously verified with experimental data. Next, the effects of the surface distortions were compensated for in computer simulations by superimposing excitation from an array feed to maximize antenna performance relative to an undistorted reflector. Results showed that a 61-element array could produce EM performance improvements equal to surface adjustments. When both mechanical surface adjustment and feed compensation techniques were applied, the equivalent operating frequency increased from approximately 6 to 18 GHz.

  6. Process optimization and biocompatibility of cell carriers suitable for automated magnetic manipulation.

    Science.gov (United States)

    Krejci, I; Piana, C; Howitz, S; Wegener, T; Fiedler, S; Zwanzig, M; Schmitt, D; Daum, N; Meier, K; Lehr, C M; Batista, U; Zemljic, S; Messerschmidt, J; Franzke, J; Wirth, M; Gabor, F

    2012-03-01

    There is increasing demand for automated cell reprogramming in the fields of cell biology, biotechnology and the biomedical sciences. Microfluidic-based platforms that provide unattended manipulation of adherent cells promise to be an appropriate basis for cell manipulation. In this study we developed a magnetically driven cell carrier to serve as a vehicle within an in vitro environment. To elucidate the impact of the carrier on cells, biocompatibility was estimated using the human adenocarcinoma cell line Caco-2. Besides evaluation of the quality of the magnetic carriers by field emission scanning electron microscopy, the rate of adherence, proliferation and differentiation of Caco-2 cells grown on the carriers was quantified. Moreover, the morphology of the cells was monitored by immunofluorescent staining. Early generations of the cell carrier suffered from release of cytotoxic nickel from the magnetic cushion. Biocompatibility was achieved by complete encapsulation of the nickel bulk within galvanic gold. The insulation process had to be developed stepwise and was controlled by parallel monitoring of the cell viability. The final carrier generation proved to be a proper support for cell manipulation, allowing proliferation of Caco-2 cells equal to that on glass or polystyrene as a reference for up to 10 days. Functional differentiation was enhanced by more than 30% compared with the reference. A flat, ferromagnetic and fully biocompatible carrier for cell manipulation was developed for application in microfluidic systems. Beyond that, this study offers advice for the development of magnetic cell carriers and the estimation of their biocompatibility.

  7. A Wolf Pack Algorithm for Active and Reactive Power Coordinated Optimization in Active Distribution Network

    Science.gov (United States)

    Zhuang, H. M.; Jiang, X. J.

    2016-08-01

    This paper presents an active and reactive power dynamic optimization model for an active distribution network (ADN), whose control variables include the output of distributed generations (DGs), the charge or discharge power of an energy storage system (ESS) and the reactive power from capacitor banks. To solve the high-dimension nonlinear optimization model, a new heuristic swarm intelligence method, namely the wolf pack algorithm (WPA), with better global convergence and computational robustness, is adopted so that network loss minimization can be achieved. In this paper, the IEEE 33-bus system is used to show the effectiveness of the WPA technique compared with other techniques. Numerical tests on the modified IEEE 33-bus system show that WPA for active and reactive multi-period optimization of ADN is exact and effective.
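A wolf-pack-style search over the three kinds of control variables can be sketched with a stand-in loss function. In reality the objective would come from a power-flow calculation on the 33-bus feeder, and the scouting, rushing and sieging step sizes below are illustrative choices rather than the published algorithm.

```python
import random

# Stand-in "network loss" as a smooth function of three per-unit controls:
# DG active power, ESS charge power, capacitor-bank reactive power.
def loss(v):
    dg, ess, cap = v
    return (dg - 0.6) ** 2 + (ess + 0.2) ** 2 + (cap - 0.4) ** 2

BOUNDS = [(0.0, 1.0), (-0.5, 0.5), (0.0, 0.8)]

def clamp(v):
    return [min(max(x, lo), hi) for x, (lo, hi) in zip(v, BOUNDS)]

def wpa(n=20, iters=200, seed=3):
    """Toy wolf pack search: scouting probes, a rush toward the lead
    wolf, then a tight siege step around the current position."""
    rng = random.Random(seed)
    wolves = [clamp([rng.uniform(lo, hi) for lo, hi in BOUNDS])
              for _ in range(n)]
    for _ in range(iters):
        lead = min(wolves, key=loss)
        for i, w in enumerate(wolves):
            if w is lead:
                continue
            probe = clamp([x + rng.uniform(-0.1, 0.1) for x in w])  # scout
            if loss(probe) < loss(w):
                w = probe
            w = clamp([x + rng.random() * (l - x)                   # rush
                       for x, l in zip(w, lead)])
            w = clamp([x + rng.uniform(-0.02, 0.02) for x in w])    # siege
            wolves[i] = w
    return min(wolves, key=loss)

sol = wpa()
```

For the multi-period problem in the paper, each wolf would encode the control trajectory over all time steps, which is what makes the model high-dimensional.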

  8. Coordination and Optimization: The Integrated Supply Chain Analysis with Non-Linear Price-Sensitive Demand

    Directory of Open Access Journals (Sweden)

    Mohammed Forhad UDDIN

    2012-01-01

    Full Text Available In this paper, a supply chain with a coordination mechanism consisting of a single vendor and buyer is considered. Further, instead of a price-sensitive linear or deterministic demand function, a price-sensitive non-linear demand function is introduced. To find the inventory cost, penalty cost and transportation cost, it is assumed that the production and shipping functions of the vendor are continuously harmonized and occur at the same rate. In this integrated supply chain, the buyer's Linear Program (LP), the vendor's Integer Program (IP) and the coordinated Mixed Integer Program (MIP) models are formulated. A numerical example is presented which includes the sensitivity of the key parameters to illustrate the models. The solution procedures demonstrate that the individual profits as well as the joint profit can be increased by a coordination mechanism even though the demand function is non-linear. In addition, the results illustrate that the buyer's selling price, along with the consumers' purchasing price, could be decreased, which may increase demand in the end market. Finally, a conclusion
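The double-marginalization effect that a coordination mechanism removes can be shown numerically with an iso-elastic (non-linear) price-sensitive demand. All parameter values below, including the wholesale price, are hypothetical, and the paper's LP/IP/MIP formulations with inventory and transportation costs are not reproduced; the sketch only contrasts decentralized and coordinated pricing.

```python
# Iso-elastic (non-linear) price-sensitive demand: d(p) = A * p**(-EPS).
A, EPS = 100.0, 2.5     # hypothetical demand scale and elasticity (> 1)
C, W = 1.0, 2.0         # vendor unit cost and hypothetical wholesale price

def demand(p):
    return A * p ** (-EPS)

prices = [1.0 + 0.01 * i for i in range(1500)]   # retail price grid

# Decentralized: the buyer maximizes its own margin given wholesale W.
p_dec = max(prices, key=lambda p: (p - W) * demand(p))
buyer_profit = (p_dec - W) * demand(p_dec)
vendor_profit = (W - C) * demand(p_dec)

# Coordinated: the chain prices jointly against the true unit cost C.
p_coord = max(prices, key=lambda p: (p - C) * demand(p))
joint_profit = (p_coord - C) * demand(p_coord)
```

Under coordination the retail price falls (which raises end-market demand) while the joint profit exceeds the sum of the decentralized profits, mirroring the abstract's two claims; a contract such as the floating pricing mechanism would then split the surplus between vendor and buyer.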

  9. Optimal coordinated control of energy extraction in LES of wind farms: effect of turbine arrangement patterns

    Science.gov (United States)

    Meyers, Johan; Munters, Wim; Goit, Jay

    2015-11-01

    We investigate optimal control of wind-farm boundary layers, considering the individual wind turbines as flow actuators. By controlling the thrust coefficients of the turbines as a function of time, the energy extraction can be dynamically regulated with the aim to optimally influence the flow field and the vertical energy transport. To this end, we use Large-Eddy Simulations (LES) of wind-farm boundary layers in a receding-horizon optimal control framework. Recently, the approach was applied to fully developed wind-farm boundary layers in a 7D by 6D aligned wind-turbine arrangement. For this case, energy extraction increased up to 16%, related to improved wake mixing obtained by slightly anti-correlating the turbine thrust coefficient with the local wind speed at the turbine level. Here we discuss optimal control results for finite wind farms that are characterized by entrance effects and a developing internal boundary layer above the wind farm. Both aligned and staggered arrangement patterns are considered, and a range of different constraints on the controls is included. The authors acknowledge support from the European Research Council (FP7-Ideas, grant no. 306471). Simulations were performed on the infrastructure of the Flemish Supercomputer Center, funded by the Hercules Foundation and the Flemish Government.

  10. Closure to Discussion on "Economic Load Dispatch-A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO"

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, K. Y.

    2010-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13...

  11. Waste Characterization Using Gamma Ray Spectrometry with Automated Efficiency Optimization - 13404

    Energy Technology Data Exchange (ETDEWEB)

    Bosko, A.; Venkataraman, R.; Bronson, F.L.; Ilie, G.; Russ, W.R. [Canberra Industries, 800 Research Parkway, Meriden, CT 06450 (United States)

    2013-07-01

    Gamma ray spectrometry using High Purity Germanium (HPGe) detectors is commonly employed in assaying radioactive waste streams from a variety of sources: nuclear power plants, Department of Energy (DOE) laboratories, medical facilities, decontamination and decommissioning activities, etc. The radioactive material is typically packaged in boxes or drums (e.g., B-25 boxes or 208-liter drums) and assayed to identify and quantify radionuclides. Depending on the origin of the waste stream, the radionuclides could be special nuclear materials (SNM), fission products, or activation products. Efficiency calibration of the measurement geometry is a critical step in achieving accurate quantification of radionuclide content. Due to the large size of the waste items, it is impractical and expensive to manufacture gamma ray standard sources for performing a measurement-based calibration. For well over a decade, mathematical efficiency methods such as those in Canberra's In Situ Object Counting System (ISOCS) have been successfully employed in the efficiency calibration of gamma-based waste assay systems. In traditional ISOCS-based calibrations, the user provides input data such as the dimensions of the waste item, the average density and fill height of the matrix, and the matrix composition. As in measurement-based calibrations, the user typically defines a homogeneous matrix with a uniform distribution of radioactivity. Actual waste containers can be quite nonuniform, however. Such simplifying assumptions in the efficiency calibration could lead to a large Total Measurement Uncertainty (TMU), thus limiting the amount of waste that can be disposed of as intermediate- or low-activity-level waste. To improve the accuracy of radionuclide quantification and reduce the TMU, Canberra has developed the capability to optimize the efficiency calibration using the ISOCS method. The optimization is based on benchmarking the efficiency shape and magnitude to the data available

  12. Long-term evaluation of TiO2-based 68Ge/68Ga generators and optimized automation of [68Ga]DOTATOC radiosynthesis.

    Science.gov (United States)

    Lin, Mai; Ranganathan, David; Mori, Tetsuya; Hagooly, Aviv; Rossin, Raffaella; Welch, Michael J; Lapi, Suzanne E

    2012-10-01

    Interest in using (68)Ga is rapidly increasing for clinical PET applications due to its favorable imaging characteristics and increased accessibility. The focus of this study was to provide our long-term evaluations of the two TiO(2)-based (68)Ge/(68)Ga generators and develop an optimized automation strategy to synthesize [(68)Ga]DOTATOC by using HEPES as a buffer system. This data will be useful in standardizing the evaluation of (68)Ge/(68)Ga generators and automation strategies to comply with regulatory issues for clinical use. PMID:22897970

  13. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, the engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min to 2 min for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process - the error tolerance. At the same time the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of the digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  14. Optimized Energy Management of a Single-House Residential Micro-Grid With Automated Demand Response

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Monsef, Hassan; Rahimi-Kian, Ashkan;

    2015-01-01

    In this paper, an intelligent multi-objective energy management system (MOEMS) is proposed for applications in residential LVAC micro-grids where households are equipped with smart appliances, such as washing machines, dishwashers, tumble dryers and electric heating, and they have the capability to take part in demand response (DR) programs. The superior performance and efficiency of the proposed system is studied through several scenarios and case studies and validated in comparison with the conventional models. The simulation results demonstrate that the proposed MOEMS has the capability to reduce residential energy use and improve the user's satisfaction degree by optimal management of the demand/generation sides.

  15. A Unified Approach towards Decomposition and Coordination for Multi-level Optimization

    OpenAIRE

    de Wit, A. J.

    2009-01-01

    Complex systems, such as those encountered in aerospace engineering, can typically be considered as a hierarchy of individual coupled elements. This hierarchy is reflected in the analysis techniques that are used to analyze the physical characteristics of the system. Consequently, a hierarchy of coupled models is to be used, accounting for different physical scales, components and/or disciplines. Numerical optimization of complex systems with embedded hierarchy is accomplished via multi-level...

  16. A Fully Automated Trial Selection Method for Optimization of Motor Imagery Based Brain-Computer Interface.

    Science.gov (United States)

    Zhou, Bangyan; Wu, Xiaopei; Lv, Zhao; Zhang, Lei; Guo, Xiaojin

    2016-01-01

    Independent component analysis (ICA) as a promising spatial filtering method can separate motor-related independent components (MRICs) from the multichannel electroencephalogram (EEG) signals. However, unpredictable burst interferences may significantly degrade the performance of an ICA-based brain-computer interface (BCI) system. In this study, we proposed a new algorithmic framework to address this issue by combining a single-trial-based ICA filter with a zero-training classifier. We developed a two-round data selection method to automatically identify badly corrupted EEG trials in the training set. The "high quality" training trials were utilized to optimize the ICA filter. In addition, we proposed an accuracy-matrix method to locate the artifact data segments within a single trial and investigated which types of artifacts can influence the performance of ICA-based motor imagery BCIs (MIBCIs). Twenty-six EEG datasets of three-class motor imagery were used to validate the proposed methods, and the classification accuracies were compared with those obtained by the frequently used common spatial pattern (CSP) spatial filtering algorithm. The experimental results demonstrated that the proposed optimizing strategy could effectively improve the stability, practicality and classification performance of ICA-based MIBCIs. The study revealed that rational use of the ICA method may be crucial in building a practical ICA-based MIBCI system. PMID:27631789

  17. Optimal part and module selection for synthetic gene circuit design automation.

    Science.gov (United States)

    Huynh, Linh; Tagkopoulos, Ilias

    2014-08-15

    An integral challenge in synthetic circuit design is the selection of optimal parts to populate a given circuit topology, so that the resulting circuit behavior best approximates the desired one. In some cases, it is also possible to reuse multipart constructs or modules that have been already built and experimentally characterized. Efficient part and module selection algorithms are essential to systematically search the solution space, and their significance will only increase in the following years due to the projected explosion in part libraries and circuit complexity. Here, we address this problem by introducing a structured abstraction methodology and a dynamic programming-based algorithm that guarantees optimal part selection. In addition, we provide three extensions that are based on symmetry check, information look-ahead and branch-and-bound techniques, to reduce the running time and space requirements. We have evaluated the proposed methodology with a benchmark of 11 circuits, a database of 73 parts and 304 experimentally constructed modules with encouraging results. This work represents a fundamental departure from traditional heuristic-based methods for part and module selection and is a step toward maximizing efficiency in synthetic circuit design and construction. PMID:24933033
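    The dynamic-programming idea can be sketched for a linear cascade of gates, where the key observation is that two part chains producing the same signal level are interchangeable downstream. The part library, signal levels and target below are invented, and this is a toy abstraction of the general technique, not the paper's implementation.

    ```python
    def select_parts(stages, start_level, desired_level):
        """stages: list of part libraries; each part is (name, transfer_dict)
        mapping an input signal level to an output level. Exact DP: any two
        part chains reaching the same level are interchangeable downstream,
        so one representative chain per reachable level suffices."""
        reachable = {start_level: []}          # level -> one chain of part names
        for library in stages:
            nxt = {}
            for level, chain in reachable.items():
                for name, transfer in library:
                    if level in transfer:      # part accepts this input level
                        nxt.setdefault(transfer[level], chain + [name])
            reachable = nxt
        level = min(reachable, key=lambda v: abs(v - desired_level))
        return abs(level - desired_level), reachable[level]

    # Invented two-stage library: promoters/repressors abstracted to level maps.
    stage1 = [("pA", {0: 3, 1: 5}), ("pB", {0: 1, 1: 2})]
    stage2 = [("rX", {1: 4, 3: 2, 5: 0}), ("rY", {1: 6, 3: 7})]
    mismatch, parts = select_parts([stage1, stage2], start_level=0, desired_level=7)
    ```

    The enumeration is polynomial in the number of distinct signal levels rather than exponential in the number of stages, which is the point of the DP formulation.
    
    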

  18. Comparison and Assessment of a Multiple Optimal Coordinated Design Based on PSS and the STATCOM Device for Damping Power System Oscillations

    Directory of Open Access Journals (Sweden)

    A.N. Hussain

    2014-03-01

    Full Text Available The aim of this study is to present a comprehensive comparison and assessment of the damping improvement achieved by multiple damping stabilizers using a simultaneously coordinated design based on the Power System Stabilizer (PSS) and the Static Synchronous Compensator (STATCOM). In an electrical power system, the STATCOM device is used to support bus voltage by compensating reactive power; it is also capable of enhancing the stability of the power system by adding a supplementary damping stabilizer to the internal AC or DC voltage control channel of the STATCOM inputs to serve as a Power Oscillation Damping (POD) controller. Simultaneous coordination can be performed in different ways. First, a dual-coordinated design between the PSS and a STATCOM AC-POD or DC-POD stabilizer is used; then, coordination between the AC and DC STATCOM-based POD stabilizers is arranged in a single STATCOM device without a PSS. Second, the coordinated design is extended to triple multiple stabilizers among the PSS, the AC-based POD and the DC-based POD in a Single Machine Infinite Bus (SMIB) system. The parameters of the multiple stabilizers have been tuned in the coordinated design by using a Chaotic Particle Swarm Optimization (CPSO) algorithm that optimizes a given eigenvalue-based objective function. The simulation results show that the dual-coordinated designs provide satisfactory damping performance over the individual control responses. Furthermore, the triple-coordinated design has been shown to be more effective in damping oscillations than the dual damping stabilizers.
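    A simultaneous tuning loop of the kind the study performs with CPSO can be sketched with a plain global-best PSO. The eigenvalue-based damping objective is replaced here by a stand-in quadratic, and the bounds, swarm size and inertia/acceleration weights are illustrative assumptions, not values from the paper.

    ```python
    import random

    def pso_minimize(objective, bounds, n_particles=20, iters=200, seed=1):
        """Minimal global-best PSO; a stand-in for the chaotic PSO (CPSO)
        used to tune multiple stabilizer parameters simultaneously."""
        rng = random.Random(seed)
        dim = len(bounds)
        pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [objective(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * rng.random() * (gbest[d] - pos[i][d]))
                    # clamp the updated position to the search bounds
                    pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                        bounds[d][0]), bounds[d][1])
                val = objective(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Stand-in objective: distance of three stabilizer gains from a known optimum.
    target = [1.2, 0.4, 2.5]
    best, best_val = pso_minimize(
        lambda x: sum((xi - t) ** 2 for xi, t in zip(x, target)),
        bounds=[(0, 5)] * 3)
    ```

    In the actual study the objective would evaluate the closed-loop state matrix's eigenvalues for each candidate gain set; the chaotic variant additionally perturbs the inertia weight to escape local optima.
    
    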

  19. Automated optimization of measurement setups for the inspection of specular surfaces

    Science.gov (United States)

    Kammel, Soeren

    2002-02-01

    Specular surfaces are used in a wide variety of industrial and consumer products such as varnished or chrome-plated parts of car bodies, dies or molds. Defects in these parts reduce their quality regarding visual appearance and/or technical performance. Even defects that are only about 1 micrometer deep can lead to rejection during quality control. Deflectometric techniques are an adequate approach to recognize and measure defects on specular surfaces, because their principle of measurement mimics the behavior of a human observer inspecting the surface. With these methods, the specular object is considered a part of the optical system: not the object itself but the surroundings reflected by the specular surface are observed in order to obtain information about the object. This technique has proven sensitive for slope and topography measurement. As a consequence of the measurement principle, surface regions with high curvature in particular need an illumination that surrounds the object under inspection, to guarantee that light from any direction is reflected onto the sensor. Thus the design of a specific measurement setup requires a substantial engineering effort. To avoid the time-consuming process of building, testing and redesigning the measurement setup, a system to simulate and automatically optimize the setup has been developed. Based on CAD data of the object under inspection and a model of the optical system, favorable realizations of the shape, the position and the pattern of the lighting device are determined. In addition, optimization of other system parameters, such as object position and distance relative to the camera, is performed. Finally, constraints are imposed to ascertain the feasibility of constructing the illumination system.

  20. A hybrid systems strategy for automated spacecraft tour design and optimization

    Science.gov (United States)

    Stuart, Jeffrey R.

    As the number of operational spacecraft increases, autonomous operation is rapidly evolving into a critical necessity. Additionally, the capability to rapidly generate baseline trajectories greatly expands the range of options available to analysts as they explore the design space to meet mission demands. Thus, a general strategy is developed, one that is suitable for the construction of flight plans for both Earth-based and interplanetary spacecraft that encounter multiple objects, where these multiple encounters comprise a ``tour''. The proposed scheme is flexible in implementation and can readily be adjusted to a variety of mission architectures. Heuristic algorithms that autonomously generate baseline tour trajectories and, when appropriate, adjust reference solutions in the presence of rapidly changing environments are investigated. Furthermore, relative priorities for ranking the targets are explicitly accommodated during the construction of potential tour sequences. As a consequence, a priori, as well as newly acquired, knowledge concerning the target objects enhances the potential value of the ultimate encounter sequences. A variety of transfer options are incorporated, from rendezvous arcs enabled by low-thrust engines to more conventional impulsive orbit adjustments via chemical propulsion technologies. When advantageous, trajectories are optimized in terms of propellant consumption via a combination of indirect and direct methods; such a combination of available technologies is an example of hybrid optimization. Additionally, elements of hybrid systems theory, i.e., the blending of dynamical states, some discrete and some continuous, are integrated into the high-level tour generation scheme. For a preliminary investigation, this strategy is applied to mission design scenarios for a Sun-Jupiter Trojan asteroid tour as well as orbital debris removal for near-Earth applications.
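    A priority-aware greedy sequencer of the sort used to build baseline tours can be sketched in a few lines. The scoring rule (priority minus weighted distance), the target positions and the priorities below are invented for illustration and are not the heuristic from this work.

    ```python
    import math

    def greedy_tour(start, targets, weight=1.0):
        """Greedy encounter-sequence builder: at each step visit the target
        with the best score = priority - weight * distance-to-go.
        Positions and priorities are invented planar stand-ins for orbits."""
        seq, pos, remaining = [], start, dict(targets)
        while remaining:
            name = max(remaining,
                       key=lambda n: remaining[n][1]
                       - weight * math.dist(pos, remaining[n][0]))
            seq.append(name)
            pos = remaining.pop(name)[0]    # move to the chosen target
        return seq

    # Invented targets: name -> ((x, y) position, priority score).
    targets = {"A": ((1.0, 0.0), 5.0),
               "B": ((4.0, 0.0), 9.0),
               "C": ((1.5, 0.5), 4.0)}
    order = greedy_tour((0.0, 0.0), targets)
    ```

    In a real tour design the distance term would be replaced by a transfer cost (Δv or propellant from the low-thrust or impulsive model), and the resulting baseline sequence would then be refined by the hybrid optimization stage.
    
    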

  1. Analytical study on coordinative optimization of convection in tubes with variable heat flux

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    [1] Guo, Z. Y., Li, D. Y., Wang, B. X., A novel concept for convective heat transfer enhancement, Int. J. Heat Mass Transfer, 1998, 41: 2221-2225. [2] Tao, W. Q., Guo, Z. Y., Wang, B. X., Field synergy principle for enhancing convective heat transfer--extension and numerical verification, Int. J. Heat Mass Transfer, 2002, 45: 3849-3856. [3] Guo, Z. Y., Mechanism and control of convective heat transfer--Coordination of velocity and heat flow fields, Chinese Science Bulletin, 2001, 46(7): 596-599. [4] Sellars, J. R., Tribus, M., Klein, J. S., Heat transfer to laminar flow in a round tube or flat conduit--The Graetz problem extended, Trans. ASME, 1956, 78: 441-448. [5] Kays, W. M., Crawford, M. E., Convective Heat Transfer, 3rd ed., Chapter 9, New York: McGraw-Hill Inc., 1993. [6] Shah, R. K., London, A. L., Laminar Flow Forced Convection in Ducts, Advances in Heat Transfer, New York: Academic Press, 1978.

  2. A Technique for Binocular Stereo Vision System Calibration by the Nonlinear Optimization and Calibration Points with Accurate Coordinates

    International Nuclear Information System (INIS)

    With the increasing need for higher-accuracy measurement in computer vision, the precision of camera calibration becomes an increasingly important factor. The objective of stereo camera calibration is to estimate the intrinsic and extrinsic parameters of each camera. We present a high-accuracy technique for calibrating a binocular stereo vision system after the cameras have been mounted in their final locations and attitudes, realized by combining a nonlinear optimization method with calibration points of accurately known coordinates. The calibration points were produced by an infrared LED moved by a three-dimensional coordinate measuring machine, which ensures a measurement uncertainty of 1/30000. By using a bilinear-interpolation square-gray weighted centroid location algorithm, the imaging centers of the calibration points can be accurately determined. The accuracy of the calibration is measured in terms of the accuracy of reconstructing the calibration points through triangulation; the mean distance between the reconstructed points and the given calibration points is 0.039 mm. The technique can satisfy the goals of measurement and accurate camera calibration
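    The accuracy check described (reconstructing calibration points through triangulation and measuring the distance to ground truth) can be sketched with the midpoint method for two camera rays; the camera centres and the point below are invented.

    ```python
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def triangulate_midpoint(p1, d1, p2, d2):
        """Midpoint of the shortest segment between rays p1+t*d1 and p2+s*d2,
        found by least squares on the segment length (rays must not be parallel)."""
        r = [b - a for a, b in zip(p1, p2)]
        a11, a12, a22 = dot(d1, d1), dot(d1, d2), dot(d2, d2)
        b1, b2 = dot(d1, r), dot(d2, r)
        det = a11 * a22 - a12 * a12        # zero only for parallel rays
        t = (b1 * a22 - b2 * a12) / det
        s = (a12 * b1 - a11 * b2) / det
        q1 = [p + t * d for p, d in zip(p1, d1)]   # closest point on ray 1
        q2 = [p + s * d for p, d in zip(p2, d2)]   # closest point on ray 2
        return [(u + v) / 2 for u, v in zip(q1, q2)]

    # Invented ground-truth point and two camera centres:
    X = [1.0, 2.0, 10.0]
    c1, c2 = [0.0, 0.0, 0.0], [5.0, 0.0, 0.0]
    ray1 = [x - c for x, c in zip(X, c1)]   # noise-free back-projected rays
    ray2 = [x - c for x, c in zip(X, c2)]
    rec = triangulate_midpoint(c1, ray1, c2, ray2)
    err = sum((a - b) ** 2 for a, b in zip(rec, X)) ** 0.5
    ```

    With noise-free rays the reconstruction error is zero; in the paper's evaluation the rays come from detected centroids, so the residual distance (0.039 mm on average) quantifies the calibration quality.
    
    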

  3. Optimizing nitrogen fertilizer application to irrigated wheat. Results of a co-ordinated research project. 1994-1998

    International Nuclear Information System (INIS)

    This TECDOC summarizes the results of a Co-ordinated Research Project (CRP) on the Use of Nuclear Techniques for Optimizing Fertilizer Application under Irrigated Wheat to Increase the Efficient Use of Nitrogen Fertilizer and Consequently Reduce Environmental Pollution. The project was carried out between 1994 and 1998 through the technical co-ordination of the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. Fourteen Member States of the IAEA and FAO carried out a series of field experiments aimed at improving irrigation water and fertilizer-N uptake efficiencies through integrated management of the complex interactions involving inputs, soils, climate, and wheat cultivars. Its goals were: to investigate various aspects of fertilizer N uptake efficiency of wheat crops under irrigation through an interregional research network involving countries growing large areas of irrigated wheat; to use 15N and the soil-moisture neutron probe to determine the fate of applied N, to follow water and nitrate movement in the soil, and to determine water balance and water-use efficiency in irrigated wheat cropping systems; to use the data generated to further develop and refine various relationships in the Ceres-Wheat computer simulation model; to use the knowledge generated to produce a N-rate-recommendation package to refine specific management strategies with respect to fertilizer applications and expected yields

  4. Automation for tsetse mass rearing for use in sterile insect technique programmes. Final report of a co-ordinated research project 1995-2001

    International Nuclear Information System (INIS)

    The rearing of tsetse flies for the sterile insect technique has been a laborious procedure in the past. The purpose of this co-ordinated research project (CRP), 'Automation for tsetse mass rearing for use in sterile insect technique programmes', was to develop appropriate semi-automated procedures to simplify the rearing, reduce the cost and standardize the product. Two main objectives were accomplished. The first was to simplify the handling of adults at emergence. This was achieved by allowing the adults to emerge directly into the production cages. Selection of the appropriate environmental conditions and timing allowed the manipulation of the emergence pattern to achieve the desired ratio of four females to one male, with minimal un-emerged females remaining mixed with the male pupae. Tests demonstrated that putting the sexes together at emergence, leaving the males in the production cages, and using a ratio of 4:1 (3:1 for a few species) did not adversely affect pupal production. This has resulted in a standardized system for the self-stocking of production cages. The second was to reduce the labour involved in feeding the flies. Three distinct systems were developed and tested in sequence. The first tsetse production unit (TPU 1) was a fully automated system, but the fly survival and fecundity were unacceptably low. From this a simpler TPU 2 was developed and tested, where 63 large cages were held on a frame that could be moved as a single unit to the feeding location. TPU 2 was tested in various locations and found to satisfy the basic requirements, and the adoption of Plexiglas pupal collection slopes resolved much of the problem caused by light distribution. However, the cage-holding frame was heavy and difficult to position on the feeding frame, and the movement disturbed the flies. TPU 2 was superseded by TPU 3, in which the cages remain stationary at all times, and the blood is brought to the flies. The blood feeding system is mounted on rails to make it

  5. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The problem is modeled as a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total cost of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest path formulation for finding a near-optimal solution. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. The new formulation has more variables and constraints, but a much tighter linear relaxation than the original, and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
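    The shortest-path reformulation mentioned above builds on the classical observation that uncapacitated lot-sizing is a shortest-path problem over ordering intervals. The sketch below illustrates that relaxation; the demands and cost figures are invented for illustration, not taken from the study.

    ```python
    # Uncapacitated lot-sizing as a shortest path (the relaxation underlying
    # shortest-path reformulations of capacitated models): node t means "start
    # of period t with zero stock", and arc (s, t) means "replenish in period s
    # to cover the demand of periods s..t-1".

    def lot_sizing_shortest_path(demand, setup, unit, hold):
        """Return (min_cost, replenishment_periods) over len(demand) periods."""
        T = len(demand)
        INF = float("inf")
        dist = [0.0] + [INF] * T     # dist[t]: cheapest way to cover periods < t
        pred = [-1] * (T + 1)
        for s in range(T):           # replenishment placed in period s
            if dist[s] == INF:
                continue
            for t in range(s + 1, T + 1):
                # holding cost grows with how long each unit is carried
                arc = setup + sum((unit + hold * (p - s)) * demand[p]
                                  for p in range(s, t))
                if dist[s] + arc < dist[t]:
                    dist[t], pred[t] = dist[s] + arc, s
        orders, t = [], T
        while t > 0:                 # walk the predecessor chain back to 0
            orders.append(pred[t])
            t = pred[t]
        return dist[T], sorted(orders)

    # Invented example: four periods of cash demand at one ATM.
    cost, orders = lot_sizing_shortest_path([50, 20, 80, 30],
                                            setup=100, unit=1, hold=2)
    # Optimal plan replenishes in periods 0 and 2 at total cost 480.
    ```

    The capacity limits, multi-echelon balance constraints and setup linking that the paper relaxes sit on top of this core network structure.
    
    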

  6. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization

    Science.gov (United States)

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role in determining the locations of multiple sperm cells. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), which was derived from several visual cortex models, to segment the sperm head region. However, the proposed method suffered from parameter selection; thus, the ICM network is optimised using particle swarm optimization, where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods. The proposed method resulted in rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, after testing with 1200 sperms. The proposed algorithm is expected to be implemented in analysing sperm motility because of the robustness and capability of this algorithm. PMID:27632581

  8. Optimization of automated radiosynthesis of [{sup 18}F]AV-45: a new PET imaging agent for Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Liu Yajing; Zhu Lin [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Ploessl, Karl [Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Choi, Seok Rye [Avid Radiopharmaceuticals Inc., Philadelphia, PA 19014 (United States); Qiao Hongwen; Sun Xiaotao; Li Song [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Zha Zhihao [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States); Kung, Hank F., E-mail: kunghf@sunmac.spect.upenn.ed [Key Laboratory of Radiopharmaceuticals, Beijing Normal University, Ministry of Education, Beijing, 100875 (China); Department of Radiology, University of Pennsylvania, Philadelphia, PA 19014 (United States)

    2010-11-15

    Introduction: Accumulation of {beta}-amyloid (A{beta}) aggregates in the brain is linked to the pathogenesis of Alzheimer's disease (AD). Imaging probes targeting these A{beta} aggregates in the brain may provide a useful tool to facilitate the diagnosis of AD. Recently, [{sup 18}F]AV-45 ([{sup 18}F]5) demonstrated high binding to the A{beta} aggregates in AD patients. To improve the availability of this agent for widespread clinical application, a rapid, fully automated, high-yield, cGMP-compliant radiosynthesis was necessary for production of this probe. We report herein optimal [{sup 18}F]fluorination and deprotection conditions and a fully automated radiosynthesis of [{sup 18}F]AV-45 ([{sup 18}F]5) on a radiosynthesis module (BNU F-A2). Methods: The preparation of [{sup 18}F]AV-45 ([{sup 18}F]5) was evaluated under different conditions, specifically by employing different precursors (-OTs and -Br as the leaving group), reagents (K222/K{sub 2}CO{sub 3} vs. tributylammonium bicarbonate) and deprotection in different acids. With the conditions optimized in these experiments, the automated synthesis of [{sup 18}F]AV-45 ([{sup 18}F]5) was accomplished by using a computer-programmed standard operating procedure, and the product was purified on an on-line solid-phase cartridge (Oasis HLB). Results: The optimized reaction conditions were successfully implemented on an automated nucleophilic fluorination module. The radiochemical purity of [{sup 18}F]AV-45 ([{sup 18}F]5) was >95%, and the automated synthesis yield was 33.6{+-}5.2% (no decay correction, n=4) and 50.1{+-}7.9% (decay corrected) in 50 min at a quantity level of 10-100 mCi (370-3700 MBq). Autoradiography studies of [{sup 18}F]AV-45 ([{sup 18}F]5) using postmortem AD brain and Tg mouse brain sections in the presence of different concentrations of 'cold' AV-136 showed a relatively low inhibition of in vitro binding of [{sup 18}F]AV-45 ([{sup 18}F]5) to the A{beta} plaques (IC50=1-4 {mu}M, a concentration several

  9. Distributed Co-ordinator Model for Optimal Utilization of Software and Piracy Prevention

    Directory of Open Access Journals (Sweden)

    S. Zeeshan Hussain

    2010-01-01

    Full Text Available Today, software technology has evolved to the extent that customers can find free and open-source software in the market. With this evolution, however, the menace of software piracy has also grown. Unlike other purchases, the software applications and fonts a customer buys do not belong to that user. Instead, the customer becomes a licensed user, meaning the customer purchases the right to use the software on a single computer and cannot put copies on other machines or pass the software along to colleagues. Software piracy is the illegal distribution and/or reproduction of software applications for business or personal use. Whether software piracy is deliberate or not, it is illegal and punishable by law. The major reasons for piracy include the high cost of software and a rigid licensing structure, which is becoming even less popular due to inefficient software utilization. Various software companies are pursuing research on techniques to handle the problem of piracy. Many defense mechanisms have been devised to date, but hobbyists and black-market leaders (so-called "software pirates") have always found a way around them. This paper identifies the types of piracy and licensing mechanisms along with the flaws in existing defense mechanisms, and examines the social and technical challenges associated with software piracy prevention. The goal of this paper is to design, implement and empirically evaluate a comprehensive framework for software piracy prevention and optimal utilization of software.

  10. Optimal power flow based TU/CHP/PV/WPP coordination in view of wind speed, solar irradiance and load correlations

    International Nuclear Information System (INIS)

    Highlights: • Formulate probabilistic OPF with VPE, multi-fuel options, POZs, FOR of CHP units. • Propose a new powerful optimization method based on an enhanced black hole algorithm. • Coordinate TUs, WPPs, PVs and CHP units together in the proposed problem. • Evaluate the impacts of input uncertainties and their correlations on the POPF. • Use the 2m + 1 point estimate method. - Abstract: This paper addresses a novel probabilistic optimization framework for handling power system uncertainties in the optimal power flow (OPF) problem that considers all the essential factors of great impact on the OPF problem. The objective is to study and model the correlation and fluctuation of load demands, photovoltaic (PV) plants and wind power plants (WPPs), which have an important influence on transmission lines and bus voltages. Moreover, as an important means of recovering waste heat energy in thermoelectric power plants, the power network share of combined heat and power (CHP) has increased dramatically in the past decade. The probabilistic OPF (POPF) problem considering valve point effects, multi-fuel options and prohibited operating zones of thermal units (TUs) is therefore first formulated. The PV, WPP and CHP units are also modeled. Then, a new method utilizing an enhanced binary black hole (EBBH) algorithm and the 2m + 1 point estimate method is proposed to solve this problem and to handle the random nature of solar irradiance, wind speed and consumer load. The correlation between input random variables is considered using a correlation matrix. Finally, numerical results are presented for the IEEE 118-bus system, including PV, WPP, CHP and TU units at several buses. The simulation and comparison results demonstrate the broad advantages and feasibility of the suggested framework in the presence of dependent non-Gaussian distributions of random variables.

  11. Test Construction: Automated

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2014-01-01

    Optimal test construction deals with automated assembly of tests for educational and psychological measurement. Items are selected from an item bank to meet a predefined set of test specifications. Several models for optimal test construction are presented, along with two algorithms for optimal test assembly.

  12. Test Construction: Automated

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2016-01-01

    Optimal test construction deals with automated assembly of tests for educational and psychological measurement. Items are selected from an item bank to meet a predefined set of test specifications. Several models for optimal test construction are presented, along with two algorithms for optimal test assembly.

  13. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods with different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
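The window-length analysis described above can be sketched numerically: for each candidate observation window, correlate the visit rate estimated from that window with the true daily total across nests. The data below are invented for illustration (not the study's RFID records), and the r² is a plain squared Pearson correlation.

```python
# Hypothetical illustration of the sampling-duration analysis: correlate the
# visit rate from the first k hours with the daily total across nests.

def pearson_r2(xs, ys):
    """Squared Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

# Synthetic hourly visit counts for five nests over a 12 h day (invented).
nests = [
    [4, 5, 3, 6, 4, 5, 4, 3, 5, 4, 6, 5],
    [2, 1, 2, 3, 2, 2, 1, 2, 2, 3, 2, 2],
    [7, 8, 6, 7, 9, 8, 7, 6, 8, 7, 9, 8],
    [3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 4, 3],
    [5, 6, 5, 7, 5, 6, 6, 5, 7, 6, 5, 6],
]

for hours in (1, 2, 3, 4):
    rates = [sum(day[:hours]) / hours for day in nests]   # visits/h in window
    totals = [sum(day) for day in nests]                  # true daily visits
    print(hours, round(pearson_r2(rates, totals), 3))
```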

  14. A Hybrid Shuffled Frog Leaping Algorithm to Solve Optimal Directional Over Current Relay Coordination Problem for Power Delivery System with DGs

    Directory of Open Access Journals (Sweden)

    Mohammad Sadegh Payam

    2012-01-01

    Full Text Available This study presents a new approach for the simultaneous coordinated tuning of overcurrent relays for a Power Delivery System (PDS) including Distributed Generations (DGs). In the proposed scheme, instead of changing the protection system structure or using new elements, the relay coordination problem is solved by revising the relay settings in the presence of DGs. For this, the relay coordination problem is formulated as an optimization problem by considering two strategies: minimizing the relay operation times and minimizing the number of changes in relay settings. Also, an efficient hybrid algorithm based on the Shuffled Frog Leaping (SFL) algorithm and Linear Programming (LP) is introduced for solving this complex and non-convex optimization problem. To investigate the ability of the proposed method, a 30-bus IEEE test system is considered. Three scenarios are examined to evaluate the effectiveness of the proposed approach to solve the directional overcurrent relay coordination problem for a PDS with DGs. Simulation results show the efficiency of the proposed method.
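The LP part of the coordination problem can be illustrated with a minimal sketch, assuming IEC standard-inverse relays on a radial feeder and illustrative constants (this is not the paper's hybrid SFL/LP solver): since each relay's operating time t = TDS · K(I) is linear in the time-dial setting (TDS) and the objective grows with TDS, the LP for a radial chain solves greedily, tightening each backup relay to the coordination time interval (CTI).

```python
# Minimal sketch of radial directional-overcurrent coordination; all
# constants (CTI, TDS bound, pickup and fault currents) are assumed.

CTI = 0.3         # coordination time interval (s), assumed
TDS_MIN = 0.05    # lower TDS bound, assumed

def iec_si_multiplier(i_fault, i_pickup):
    """K such that t = TDS * K for the IEC standard-inverse curve."""
    m = i_fault / i_pickup
    return 0.14 / (m ** 0.02 - 1.0)

# Relays ordered from the farthest (primary) toward the source (backups).
# Each entry: (pickup current A, fault current A seen for the same fault).
relays = [(200.0, 2000.0), (300.0, 2000.0), (400.0, 2000.0)]

tds, times = [], []
for i_pick, i_fault in relays:
    k = iec_si_multiplier(i_fault, i_pick)
    # A backup must trip at least CTI after the downstream relay.
    t_required = (times[-1] + CTI) if times else 0.0
    setting = max(TDS_MIN, t_required / k)
    tds.append(round(setting, 4))
    times.append(setting * k)

print(tds)
print([round(t, 3) for t in times])
```

The greedy pass is optimal here because raising any TDS above the binding constraint only increases total operating time.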

  15. Optimization of the coupling of nuclear reactors and desalination systems. Final report of a coordinated research project 1999-2003

    International Nuclear Information System (INIS)

    Nuclear power has been used for five decades and has been one of the fastest growing energy options. Although the rate at which nuclear power has penetrated the world energy market has declined, it has retained a substantial share, and is expected to continue as a viable option well into the future. Seawater desalination by distillation is much older than nuclear technology. However, the current desalination technology for large-scale application has a history comparable to nuclear power, i.e. it spans about five decades. Both nuclear and desalination technologies are mature and proven, and are commercially available from a variety of suppliers. Therefore, there are benefits in combining the two technologies. Where nuclear energy could be an option for electricity supply, it can also be used as an energy source for seawater desalination. This has been recognized from the early days of the two technologies. However, the main interest during the 1960s and 1970s was directed towards the use of nuclear energy for electricity generation, district heating, and industrial process heat. Renewed interest in nuclear desalination has been growing worldwide since 1989, as indicated by the adoption of a number of resolutions on the subject at the IAEA General Conferences. Responding to this trend, the IAEA reviewed information on desalination technologies and the coupling of nuclear reactors with desalination plants, compared the economic viability of seawater desalination using nuclear energy in various coupling configurations with fossil fuels in a generic assessment, conducted a regional feasibility study on nuclear desalination in the North African countries and initiated a two-year Options Identification Programme (OIP) to identify candidate reactor and desalination technologies that could serve as practical demonstrations of nuclear desalination, supplementing the existing expertise and experience. In 1998, the IAEA initiated a Coordinated Research

  16. Development of methodologies for optimization of surveillance testing and maintenance of safety related equipment at NPPs. Report of a research coordination meeting. Working material

    International Nuclear Information System (INIS)

    This report summarizes the results of the first meeting of the Coordinated Research Programme (CRP) on Development of Methodologies for Optimization of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs, held at the Agency Headquarters in Vienna, from 16 to 20 December 1996. The purpose of this Research Coordination Meeting (RCM) was for all Chief Scientific Investigators of the groups participating in the CRP to present an outline of their proposed research projects. Additionally, the participants discussed the objective, scope, work plan and information channels of the CRP in detail. Based on these presentations and discussions, the entire project plan was updated, completed and included in this report. This report represents the commonly agreed project work plan for the CRP. Refs, figs, tabs

  17. Biogas-pH automation control strategy for optimizing organic loading rate of anaerobic membrane bioreactor treating high COD wastewater.

    Science.gov (United States)

    Yu, Dawei; Liu, Jibao; Sui, Qianwen; Wei, Yuansong

    2016-03-01

    Control of the organic loading rate (OLR) is essential for anaerobic digestion treating high COD wastewater: overload can cause operational failure, while underload reduces efficiency. A novel biogas-pH automation control strategy using combined gas- and liquor-phase monitoring was developed for an anaerobic membrane bioreactor (AnMBR) treating high COD (27.53 g·L(-1)) starch wastewater. The biogas-pH strategy operated with thresholds of biogas production rate >98 Nml·h(-1) to prevent overload and pH >7.4 to prevent underload, determined by methane production kinetics and pH titration of methanogenesis slurry, respectively. Compared with a constant OLR control strategy, the OLR was doubled to 11.81 kgCOD·kgVSS(-1)·d(-1) and the effluent COD halved to 253.4 mg·L(-1). Meanwhile, the COD removal rate, biogas yield and methane concentration were synchronously improved to 99.1%, 312 Nml·gCODin(-1) and 74%, respectively. Using the biogas-pH strategy, the AnMBR formed a "pH self-regulation ternary buffer system" which seizes carbon dioxide and hence provides sufficient buffering capacity. PMID:26722804
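The two-threshold logic can be sketched as a simple controller. The thresholds come from the abstract; the feed-adjustment step size and the multiplicative update are assumptions for illustration.

```python
# Sketch of the biogas-pH two-threshold control idea (step size assumed).

BIOGAS_MAX = 98.0   # Nml/h, overload threshold from the abstract
PH_MIN = 7.4        # underload threshold from the abstract

def adjust_feed(feed_rate, biogas_rate, ph, step=0.1):
    if biogas_rate > BIOGAS_MAX:        # gas-phase signal: overload risk
        return feed_rate * (1 - step)   # back off the organic loading rate
    if ph > PH_MIN:                     # liquor-phase signal: spare capacity
        return feed_rate * (1 + step)   # push the OLR up
    return feed_rate                    # inside the safe operating band

print(adjust_feed(10.0, 120.0, 7.2))    # overload case: reduce feed
print(adjust_feed(10.0, 60.0, 7.6))     # underload case: increase feed
print(adjust_feed(10.0, 60.0, 7.2))     # hold
```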

  18. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time designed and implemented software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system that detects unexpected behavior.

  19. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior of two IGBT modules rated at 1.7 kV / 1 kA and 1.7 kV / 1.4 kA.

  20. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    Directory of Open Access Journals (Sweden)

    Villoutreix Bruno O

    2009-11-01

    Full Text Available Abstract Background Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches, in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated, since compounds are usually delivered in 1D SMILES, CANSMILES or in 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS, which allows the generation of 3D conformations of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, both of which generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that the generated structures are generally of equal quality or sometimes better than structures obtained by other tested methods.
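The distance-geometry step that DG-AMMOS automates can be illustrated in miniature: classical (Torgerson) multidimensional scaling recovers 3D coordinates from an interatomic distance matrix via the double-centered Gram matrix. The distances below describe an idealized regular tetrahedron, not a real molecule.

```python
# Toy distance-geometry embedding: recover 3D coordinates whose pairwise
# distances reproduce a given (Euclidean-realizable) distance matrix.

import numpy as np

def embed_from_distances(D, dim=3):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # Gram matrix of centered coords
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]            # keep the largest eigenvalues
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))

# Four points with all pairwise distances 1 (a regular tetrahedron).
D = np.ones((4, 4)) - np.eye(4)
X = embed_from_distances(D)

# The embedding reproduces the input distances.
recovered = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
print(np.allclose(recovered, D, atol=1e-8))
```

DG-AMMOS additionally refines such raw embeddings with molecular-mechanics minimization, which this sketch omits.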

  1. Conflict and Coordination Problem of Carbon Tax's Diversity Targets in China-Based on the Tax Optimization Theory

    Institute of Scientific and Technical Information of China (English)

    Xue Gang

    2011-01-01

    Among all the emission reduction measures, carbon tax is recognized as the most effective way to protect our climate. That is why the Chinese government has recently taken it as a tax reform direction. In current economic analyses, the design of carbon tax is mostly based on the target of maximizing efficiency. However, based on the theory of tax system optimization, we should also consider other policy objectives, such as equity, revenue and cost, and then balance the different objectives to achieve a suboptimal but practical reform of the carbon tax system in China.

  2. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    Science.gov (United States)

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. 
In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  3. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
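A hedged sketch of the selection problem: real ATA uses mixed-integer programming, but a greedy stand-in with invented 2PL item parameters shows the same ingredients, maximizing Fisher information at a target ability while honoring per-content quotas.

```python
# Greedy stand-in for ATA item selection (item bank and blueprint invented).

import math

def fisher_info(a, b, theta):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

# (item id, discrimination a, difficulty b, content area)
bank = [
    (1, 1.2, -0.5, "algebra"), (2, 0.8, 0.0, "algebra"),
    (3, 1.5, 0.2, "geometry"), (4, 1.0, 1.0, "geometry"),
    (5, 1.4, -0.2, "algebra"), (6, 0.9, 0.5, "geometry"),
]
quota = {"algebra": 2, "geometry": 1}    # test blueprint
theta = 0.0                              # target ability level

form, used = [], {k: 0 for k in quota}
for item in sorted(bank, key=lambda it: -fisher_info(it[1], it[2], theta)):
    if used[item[3]] < quota[item[3]]:   # respect the content quota
        form.append(item[0])
        used[item[3]] += 1

print(sorted(form))
```

A true MIP formulation would instead solve for all items jointly, which matters once constraints interact (overlap, enemy items, test length).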

  4. Always-optimally-coordinated candidate selection algorithm for peer-to-peer files sharing system in mobile self-organized networks

    Institute of Scientific and Technical Information of China (English)

    Li Xi; Ji Hong; Zheng Ruiming; Li Ting

    2009-01-01

    In order to improve the performance of peer-to-peer file sharing systems in mobile distributed environments, a novel always-optimally-coordinated (AOC) criterion and a corresponding candidate selection algorithm are proposed in this paper. Compared with the traditional min-hops criterion, the new approach introduces a fuzzy knowledge combination theory to investigate several important factors that influence file transfer success rate and efficiency. Whereas min-hops based protocols only ask the nearest candidate peer for desired files, the selection algorithm based on AOC comprehensively considers users' preferences and network requirements with flexible balancing rules. Furthermore, it is independent of any specific resource discovery protocol, which allows for scalability. The simulation results show that when using the AOC based peer selection algorithm, system performance is much better than with the min-hops scheme: the file transfer success rate is improved by more than 50% and transfer time is reduced by at least 20%.
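The idea behind AOC can be sketched as multi-criteria scoring; the weights and candidate attributes below are invented for illustration and stand in for the paper's fuzzy combination rules.

```python
# Multi-criteria peer scoring vs. the min-hops rule (all numbers invented).

def aoc_score(peer, w_hops=0.3, w_bw=0.4, w_succ=0.3, max_hops=10, max_bw=100):
    """Weighted combination of normalized selection factors."""
    return (w_hops * (1 - peer["hops"] / max_hops)  # closer is better
            + w_bw * (peer["bw"] / max_bw)          # more bandwidth is better
            + w_succ * peer["succ"])                # reliable peers preferred

candidates = [
    {"id": "A", "hops": 1, "bw": 10, "succ": 0.50},   # nearest, but weak
    {"id": "B", "hops": 4, "bw": 80, "succ": 0.95},
    {"id": "C", "hops": 7, "bw": 60, "succ": 0.90},
]

best = max(candidates, key=aoc_score)            # AOC-style choice
nearest = min(candidates, key=lambda p: p["hops"])  # min-hops choice
print(best["id"], nearest["id"])
```

The two rules disagree here: min-hops picks the nearest peer A, while the weighted score prefers B, the farther but faster and more reliable peer.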

  5. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    Directory of Open Access Journals (Sweden)

    Chung-Yuan Huang

    2014-01-01

    Full Text Available Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.
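The paper optimizes placements with a genetic algorithm; a greedy maximal-covering baseline on toy coordinates (using the 100 m walking service radius from the abstract) illustrates the underlying facility-location problem.

```python
# Greedy maximal-coverage AED siting (store and incident coordinates invented).

import math

stores = {"S1": (0, 0), "S2": (150, 0), "S3": (300, 50)}
# Historical OHCA incident locations (invented for illustration).
incidents = [(10, 5), (20, -10), (140, 10), (160, 5), (290, 60), (500, 500)]

def covered(store_xy, incident_xy, radius=100.0):
    return math.dist(store_xy, incident_xy) <= radius

def greedy_place(k):
    """Pick k stores, each time the one covering the most uncovered incidents."""
    chosen, uncovered = [], set(range(len(incidents)))
    for _ in range(k):
        remaining = [s for s in stores if s not in chosen]
        gains = {s: sum(covered(stores[s], incidents[i]) for i in uncovered)
                 for s in remaining}
        best = max(gains, key=gains.get)
        chosen.append(best)
        uncovered = {i for i in uncovered
                     if not covered(stores[best], incidents[i])}
    return chosen, len(incidents) - len(uncovered)

sites, n_covered = greedy_place(2)
print(sites, n_covered)
```

A GA such as GANSO explores placements beyond this greedy baseline and can fold in temporal factors (e.g. nighttime EMS gaps) in the fitness function.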

  6. Sequential injection analysis for automation of the Winkler methodology, with real-time SIMPLEX optimization and shipboard application.

    Science.gov (United States)

    Horstkotte, Burkhard; Tovar Sánchez, Antonio; Duarte, Carlos M; Cerdà, Víctor

    2010-01-25

    A multipurpose analyzer system based on sequential injection analysis (SIA) for the determination of dissolved oxygen (DO) in seawater is presented. Three operation modes were established and successfully applied onboard during a research cruise in the Southern Ocean: first, in-line execution of the entire Winkler method including precipitation of manganese (II) hydroxide, fixation of DO, precipitate dissolution by confluent acidification, and spectrophotometric quantification of the generated iodine/tri-iodide (I(2)/I(3)(-)); second, spectrophotometric quantification of I(2)/I(3)(-) in samples prepared according to the classical Winkler protocol; and third, accurate batch-wise titration of I(2)/I(3)(-) with thiosulfate using one syringe pump of the analyzer as an automatic burette. In the first mode, the zone stacking principle was applied to achieve high dispersion of the reagent solutions in the sample zone. Spectrophotometric detection was done at the isosbestic wavelength of I(2)/I(3)(-), 466 nm. Greatly reduced consumption of reagents and sample compared to the classical Winkler protocol, a linear response up to 16 mg L(-1) DO, and an injection frequency of 30 per hour were achieved. It is noteworthy that for the offline protocol, sample metering and quantification with a potentiometric titrator generally take over 5 min, without counting sample fixation, incubation, and glassware cleaning. The modified SIMPLEX methodology was used for the simultaneous optimization of four volumetric and two chemical variables. Vertex calculation and consequent application, including in-line preparation of one reagent, were carried out in real time using the software AutoAnalysis. The analytical system featured high signal stability, robustness, and a repeatability of 3% RSD (first mode) and 0.8% (second mode) during shipboard application. PMID:20103088
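The modified SIMPLEX optimization can be illustrated with SciPy's Nelder-Mead method on a mock two-variable response surface. The response function and the variable names are invented; the cruise software optimized four volumetric and two chemical variables in real time.

```python
# Nelder-Mead (SIMPLEX-type) maximization of a mock analytical signal.

import numpy as np
from scipy.optimize import minimize

def neg_signal(v):
    reagent_ul, flow_ul_s = v
    # Mock response surface: best signal near 120 uL reagent, 8 uL/s flow.
    return -np.exp(-((reagent_ul - 120.0) / 40.0) ** 2
                   - ((flow_ul_s - 8.0) / 3.0) ** 2)

# Minimizing the negated signal maximizes the signal itself.
res = minimize(neg_signal, x0=[60.0, 3.0], method="Nelder-Mead")
print(res.success, np.round(res.x, 1))
```

In the real system each "function evaluation" is a physical injection, which is why a derivative-free simplex search is a natural fit.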

  7. Autonomous Optimal Coordination Scheme in Protection System of Power Distribution Network by Using Multi-Agent Concept

    Institute of Scientific and Technical Information of China (English)

    LEE Seung-Jae; KIM Tae-Wan; LEE Gi-Young

    2008-01-01

    A protection system using a multi-agent concept for power distribution networks is proposed. Every digital overcurrent relay (OCR) is developed as an agent by adding its own intelligence, self-tuning and communication ability. The main advantage of the multi-agent concept is that a group of agents work together to achieve a global goal which is beyond the ability of each individual agent. In order to cope with frequent changes in the network operation condition and faults, an OCR agent, proposed in this paper, is able to detect a fault or a change in the network and find its optimal parameters for protection in an autonomous manner, considering information on the whole network obtained by communication with other agents. Through this kind of coordination and information exchange, not only a local but also a global protective scheme is completed. Simulations in a simple distribution network show the effectiveness of the proposed protection system.

  8. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials

    Science.gov (United States)

    Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-01-01

    Background Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. Objective This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Methods Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. Results The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although it was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, were being addressed. 
Conclusions Automated modeling tools can streamline the modeling process for dietary intervention trials
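The constraint-optimization core can be sketched as a linear program; the nutrient values and targets below are invented, and the real DMT uses nonlinear constraints over food-composition data. The goal here: meet macronutrient targets with the fewest total serves.

```python
# Toy diet LP: minimize total servings subject to nutrient minimums.

from scipy.optimize import linprog

# Columns: grains, vegetables, dairy. Rows: energy (kJ), protein (g) per serve.
nutrients = [[500, 100, 600],
             [3, 2, 10]]
min_targets = [8000, 60]     # daily energy and protein targets (invented)

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so the >= nutrient
# constraints are encoded with flipped signs.
res = linprog(c=[1, 1, 1],
              A_ub=[[-v for v in row] for row in nutrients],
              b_ub=[-t for t in min_targets],
              bounds=[(0, 20)] * 3)
print(res.success, [round(x, 2) for x in res.x])
```

With these numbers the energy constraint is binding and the densest food group carries the whole solution; realistic blueprints add per-group minimums and maximums so the answer stays a varied diet.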

  9. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines, such as collection, storage, administration, processing, preservation and communication.

  10. Optimization of automation: II. Estimation method of ostracism rate based on the loss of situation awareness of human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Highlights: • We analyze the relationship between the Out-of-the-Loop problem and the loss of human operators’ situation awareness. • We propose an ostracism rate estimation method that considers only the negative effects of automation. • The ostracism rate reflects how much automation interrupts human operators’ receipt of information. • The higher the ostracism rate is, the lower the accuracy of human operators’ SA will be. - Abstract: With the introduction of automation in various industries including the nuclear field, its side effect, referred to as the Out-of-the-Loop (OOTL) problem, has emerged as a critical issue that needs to be addressed. Many studies have attempted to analyze and solve the OOTL problem, but the issue still lacks a clear solution that provides criteria for introducing automation. Therefore, a quantitative estimation method for identifying the negative effects of automation is proposed in this paper. The representative aspect of the OOTL problem in nuclear power plants (NPPs) is that human operators in automated operations are given less information than human operators in manual operations. In other words, human operators have less opportunity to obtain needed information as automation is introduced. From this point of view, the degree of difficulty in obtaining information from automated systems is defined as the Level of Ostracism (LOO). Using the LOO and information theory, we propose the ostracism rate, a new estimation method that expresses how much automation interrupts human operators’ situation awareness. We applied production rules to describe the human operators’ thinking processes, Bayesian inference to describe the production rules mathematically, and information theory to calculate the amount of information that human operators receive through observations. The validity of the suggested method was proven by conducting an experiment. The results show that the ostracism rate was significantly related to the accuracy

  11. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  12. The model of the stochastic optimization of the automated forecast of dangerous squalls and tornadoes over the territory of the republic of byelorussia

    OpenAIRE

    Perekhodtseva, E. V.

    2013-01-01

    Development of a successful method for automated statistical forecasting well in advance (from 12 hours to two days) of dangerous phenomena, namely severe squalls and tornadoes, could help mitigate economic losses.

  13. Towards black-box calculations of tunneling splittings obtained from vibrational structure methods based on normal coordinates.

    Science.gov (United States)

    Neff, Michael; Rauhut, Guntram

    2014-02-01

    Multidimensional potential energy surfaces obtained from explicitly correlated coupled-cluster calculations, with further corrections for high-order correlation contributions, scalar relativistic effects and core-correlation energy contributions, were generated in a fully automated fashion for the double-minimum benchmark systems OH3(+) and NH3. The black-box generation of the potentials is based on normal coordinates, which were used in the underlying multimode expansions of the potentials and the μ-tensor within the Watson operator. Normal coordinates are not the optimal choice for describing double-minimum potentials, and the question remains whether they can be used for accurate calculations at all. However, their unique definition is an appealing feature, which removes the errors that remain in truncated potential expansions arising from different choices of curvilinear coordinate systems. Fully automated calculations are presented which demonstrate that the proposed scheme allows for the determination of energy levels and tunneling splittings as a routine application.

  14. Quantum particle swarm optimization based on Bloch coordinates of qubits

    Institute of Scientific and Technical Information of China (English)

    陈义雄; 梁昔明; 黄亚飞

    2013-01-01

    To improve the efficiency of Particle Swarm Optimization (PSO), a quantum particle swarm optimization algorithm based on the Bloch sphere representation of qubits was proposed. In Bloch spherical coordinates, each particle automatically updates its rotation angle and position, without setting the rotation angle through a look-up table (or fixing it to a value on an interval). This makes up for a deficiency of the Bloch-sphere-based quantum evolutionary and quantum genetic algorithms and makes the algorithm more general. A quantum Hadamard gate is used to implement particle mutation, which enhances population diversity and helps particles escape local extrema. Simulation results on typical function optimization problems show that the algorithm is stable, accurate, fast to converge and of practical value.
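
    The update scheme described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: each dimension is encoded as a qubit with Bloch angles (θ, φ), the three Cartesian Bloch coordinates (x = cos φ sin θ, y = sin φ sin θ, z = cos θ) give three candidate solutions per particle, and a standard PSO velocity update is applied to the angles directly. The Hadamard-gate mutation is omitted for brevity; the sphere objective and all parameter values are assumptions for demonstration.

```python
import math
import random

def sphere(xs):  # toy objective to minimize
    return sum(x * x for x in xs)

def bloch_to_solutions(thetas, phis, lo, hi):
    """Map the three Bloch coordinates (each in [-1, 1]) to [lo, hi]."""
    def scale(c):
        return lo + (c + 1.0) * (hi - lo) / 2.0
    xs = [scale(math.cos(p) * math.sin(t)) for t, p in zip(thetas, phis)]
    ys = [scale(math.sin(p) * math.sin(t)) for t, p in zip(thetas, phis)]
    zs = [scale(math.cos(t)) for t in thetas]
    return xs, ys, zs

def qpso(dim=2, swarm=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)

    def fitness(th, ph):
        # Each particle encodes three candidate solutions; score the best one.
        return min(sphere(s) for s in bloch_to_solutions(th, ph, lo, hi))

    parts = []
    for _ in range(swarm):
        th = [rng.uniform(0, math.pi) for _ in range(dim)]
        ph = [rng.uniform(0, 2 * math.pi) for _ in range(dim)]
        parts.append({"th": th, "ph": ph, "vt": [0.0] * dim, "vp": [0.0] * dim,
                      "best": (th[:], ph[:]), "best_f": float("inf")})
    gbest, gbest_f = None, float("inf")
    for p in parts:
        f = fitness(p["th"], p["ph"])
        p["best_f"] = f
        if f < gbest_f:
            gbest_f, gbest = f, (p["th"][:], p["ph"][:])

    w, c1, c2 = 0.7, 1.5, 1.5  # assumed inertia / acceleration coefficients
    for _ in range(iters):
        for p in parts:
            for d in range(dim):
                # PSO velocity update applied to the Bloch angles themselves.
                p["vt"][d] = (w * p["vt"][d]
                              + c1 * rng.random() * (p["best"][0][d] - p["th"][d])
                              + c2 * rng.random() * (gbest[0][d] - p["th"][d]))
                p["vp"][d] = (w * p["vp"][d]
                              + c1 * rng.random() * (p["best"][1][d] - p["ph"][d])
                              + c2 * rng.random() * (gbest[1][d] - p["ph"][d]))
                p["th"][d] += p["vt"][d]
                p["ph"][d] += p["vp"][d]
            f = fitness(p["th"], p["ph"])
            if f < p["best_f"]:
                p["best_f"], p["best"] = f, (p["th"][:], p["ph"][:])
                if f < gbest_f:
                    gbest_f, gbest = f, (p["th"][:], p["ph"][:])
    return gbest_f
```

    Because the angles are updated directly, no rotation-angle look-up table is needed; the periodicity of sin/cos keeps the mapping well defined even as the angles drift outside [0, π] × [0, 2π].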

  15. Optimization of the radiological protection of patients undergoing radiography, fluoroscopy and computed tomography. Final report of a coordinated research project in Africa, Asia and eastern Europe

    International Nuclear Information System (INIS)

    Although radiography has been an established imaging modality for over a century, continuous developments have led to improvements in technique resulting in improved image quality at reduced patient dose. If one compares the technique used by Roentgen with the methods used today, one finds that a radiograph can now be obtained at a dose which is smaller by a factor of 100 or more. Nonetheless, some national surveys, particularly in the United Kingdom and in the United States of America in the 1980s and 1990s, have indicated large variations in patient doses for the same diagnostic examination, in some cases by a factor of 20 or more. This arises not only owing to the various types of equipment and accessories used by the different health care providers, but also because of operational factors. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionising radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS), issued by the IAEA in cooperation with the FAO, ILO, WHO, PAHO and NEA, is the optimization of radiological protection of patients undergoing medical exposure. Towards its responsibility of implementation of standards and under the subprogramme of radiation safety, in 1995, the IAEA launched a coordinated research project (CRP) on radiological protection in diagnostic radiology in some countries in the Eastern European, African and Asian region. Initially, the CRP addressed radiography only and it covered wide aspects of optimisation of radiological protection. Subsequently, the scope of the CRP was extended to fluoroscopy and computed tomography (CT), but it covered primarily situation analysis of patient doses and equipment quality control. It did not cover patient dose reduction aspects in fluoroscopy and CT. The project

  16. The Estimation and Optimization of Returns and Risks of Commercial Bank Coordination Interval

    Institute of Scientific and Technical Information of China (English)

    高艳平; 李立新

    2015-01-01

    Based on data for 16 listed banks from 2002 to 2013, a dynamic panel instrumental variable model and a multi-objective optimization method, this paper studies the coordination between the risks and returns of China's commercial banks. First, by comprehensively measuring commercial bank returns and risks using a comprehensive evaluation method and global principal component analysis of time series, the paper derives returns and risks indices and establishes index models for both. Second, the paper controls the intrinsic factors of the returns and risks models to obtain the upper limit of risks and the lower limit of returns. Then, external macroeconomic and financial environment factors are added to re-estimate the returns and risks models; on the basis of these two models, the paper obtains the returns under the constraint of risks and the risks under the constraint of returns. Finally, the paper measures the corresponding optimal returns and the coordination interval of returns and risks for China's commercial banks, and concludes that the relative efficiency range, risks included, is [0.86, 1.15]; once outside this range, banks may need to pay the price of increased risk.

  17. Optimization of Operation Reserve Coordination Considering Wind Power Integration

    Institute of Scientific and Technical Information of China (English)

    张国强; 吴文传; 张伯明

    2011-01-01

    The impact of wind power integration on the system's active operation reserve is analyzed. The characteristics of the operating reserve after wind power integration are discussed, and a risk-based method for deciding the reserve demand of wind power is presented. A coordinated optimal reserve allocation algorithm is proposed that takes full advantage of the control performance of generators and addresses the impact of wind power on the operating reserve. Simulation results on the IEEE 39-bus system show that the proposed method effectively determines and allocates the wind power reserve demand and is of practical value for the active dispatch of systems with large-scale wind power integration.

  18. Design automation, languages, and simulations

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    As the complexity of electronic systems continues to increase, the micro-electronic industry depends upon automation and simulations to adapt quickly to market changes and new technologies. Compiled from chapters contributed to CRC's best-selling VLSI Handbook, this volume covers a broad range of topics relevant to design automation, languages, and simulations. These include a collaborative framework that coordinates distributed design activities through the Internet, an overview of the Verilog hardware description language and its use in a design environment, hardware/software co-design, syst

  19. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  20. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  1. Sensitivity analysis approach to multibody systems described by natural coordinates

    Science.gov (United States)

    Li, Xiufeng; Wang, Yabin

    2014-03-01

    The classical natural coordinate modeling method, which removes the Euler angles and Euler parameters from the governing equations, is particularly suitable for the sensitivity analysis and optimization of multibody systems. However, the formulation imposes so many rules on the choice of generalized coordinates that it hinders automated modeling. A first-order direct sensitivity analysis approach to multibody systems formulated with novel natural coordinates is presented. First, a new selection method for natural coordinates is developed, which introduces 12 coordinates to describe the position and orientation of a spatial object. On the basis of the proposed natural coordinates, the rigid constraint conditions, the basic constraint elements and the initial conditions for the governing equations are derived. Considering the characteristics of the governing equations, the newly proposed generalized-α integration method is used, and the corresponding algorithm flowchart is discussed. The objective function, the detailed procedure of first-order direct sensitivity analysis and the related solving strategy are provided on the basis of this modeling system. Finally, to verify the validity and accuracy of the method, sensitivity analyses of a planar spinner-slider mechanism and a spatial crank-slider mechanism are conducted. The test results agree well with those of the finite difference method, with a maximum absolute deviation of less than 3%. The proposed approach is not only convenient for automatic modeling but also reduces the complexity of sensitivity analysis, providing a practical and effective way to obtain sensitivities for the optimization of multibody systems.
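
    The finite difference method used as the reference above can be sketched in a few lines. The damped oscillator below stands in for a multibody system and is purely an illustrative assumption, not one of the mechanisms studied in the paper; the point is only the central-difference estimate of a response sensitivity with respect to a design parameter.

```python
def simulate_final_position(k, m=1.0, c=0.5, x0=1.0, v0=0.0, dt=1e-3, t_end=2.0):
    """Semi-implicit Euler integration of a damped oscillator
    m*x'' + c*x' + k*x = 0; returns the final position as the
    response quantity of interest (a toy stand-in for a multibody response)."""
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        a = (-c * v - k * x) / m
        v += a * dt
        x += v * dt
    return x

def fd_sensitivity(f, p, h=1e-5):
    """Central finite-difference sensitivity df/dp: the reference scheme
    that direct sensitivity analysis results are validated against."""
    return (f(p + h) - f(p - h)) / (2.0 * h)
```

    In a direct sensitivity approach the derivative equations are integrated alongside the governing equations instead, which avoids the two extra full simulations per parameter that the finite-difference check requires.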

  2. Geographic Routing on Improved Coordinates

    OpenAIRE

    Brandes, Ulrik; Fleischer, Daniel

    2007-01-01

    We consider routing methods for networks when the geographic positions of nodes are available. Instead of using the original geographic coordinates, however, we precompute virtual coordinates using a barycentric layout. Combined with simple geometric routing rules, this greatly reduces the lengths of routes and outperforms algorithms working on the original coordinates. Along with experimental results, we prove properties such as guaranteed message delivery and worst-case optimality. Our methods app...
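
    A minimal sketch of the idea, under the assumption that the barycentric layout is the classical one (boundary nodes pinned, every interior node iterated to the average of its neighbours) combined with plain greedy geographic forwarding; the example graph in the usage note is hypothetical.

```python
def barycentric_layout(adj, fixed, iters=500):
    """Virtual coordinates: adj maps node -> set of neighbours, fixed maps
    pinned (boundary) nodes to (x, y). Interior nodes are repeatedly moved
    to the barycenter of their neighbours (Gauss-Seidel style)."""
    pos = {v: fixed.get(v, (0.0, 0.0)) for v in adj}
    for _ in range(iters):
        for v in adj:
            if v in fixed:
                continue
            nbrs = adj[v]
            pos[v] = (sum(pos[u][0] for u in nbrs) / len(nbrs),
                      sum(pos[u][1] for u in nbrs) / len(nbrs))
    return pos

def greedy_route(adj, pos, src, dst):
    """Forward to the neighbour geometrically closest to the destination."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    path, cur = [src], src
    while cur != dst:
        nxt = min(adj[cur], key=lambda u: d2(pos[u], pos[dst]))
        if d2(pos[nxt], pos[dst]) >= d2(pos[cur], pos[dst]):
            return None  # greedy forwarding is stuck in a local minimum
        path.append(nxt)
        cur = nxt
    return path
```

    On a hypothetical square with pinned corners A, B, C, D and a free center node E connected to all four, the layout places E at the barycenter and greedy routing from A to C goes through E.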

  3. Order Picking Optimization of Automated Warehouses Based on the Ant Colony Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    庞龙; 陆金桂

    2012-01-01

    Optimizing order picking is an effective way to improve the efficiency of automated warehouses. Based on an analysis of the process and characteristics of picking in automated warehouses, a new mathematical model of the picking operation is proposed. The ant colony algorithm is first used to generate a high-quality initial population, and the genetic algorithm then optimizes and solves the model. Simulation results show that the model is feasible and that the hybrid ant colony genetic algorithm not only yields more accurate results but also accelerates convergence, thereby improving order-picking efficiency.
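
    A compact sketch of the hybrid, with deliberate simplifications: the "ant" construction here biases tours by inverse distance only (pheromone reinforcement is omitted), and the GA stage uses order crossover with swap mutation. The picking locations, parameters and operators are illustrative assumptions, not the authors' exact model.

```python
import random

def tour_len(tour, pts):
    d = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return sum(d(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def ant_tours(pts, n_ants, rng):
    """ACO-style construction: each ant extends its tour by sampling the next
    location with probability proportional to inverse distance."""
    d = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    tours = []
    for _ in range(n_ants):
        cur, unvisited, tour = 0, set(range(1, len(pts))), [0]
        while unvisited:
            cands = list(unvisited)
            weights = [1.0 / (d(pts[cur], pts[c]) + 1e-9) for c in cands]
            cur = rng.choices(cands, weights)[0]
            unvisited.remove(cur)
            tour.append(cur)
        tours.append(tour)
    return tours

def order_crossover(p1, p2, rng):
    """OX: copy a slice from p1, fill the rest in p2's order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for idx in range(n):
        if child[idx] is None:
            child[idx] = fill.pop(0)
    return child

def aco_ga_picking(pts, pop=30, gens=100, seed=7):
    rng = random.Random(seed)
    pop_tours = ant_tours(pts, pop, rng)   # ant-built initial population
    for _ in range(gens):
        pop_tours.sort(key=lambda t: tour_len(t, pts))
        survivors = pop_tours[:pop // 2]
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            c = order_crossover(a, b, rng)
            if rng.random() < 0.2:          # swap mutation
                i, j = rng.sample(range(len(c)), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop_tours = survivors + children
    best = min(pop_tours, key=lambda t: tour_len(t, pts))
    return best, tour_len(best, pts)
```

    Seeding the GA with ant-built tours rather than uniform random permutations is the hybrid's key step: the search starts from short tours instead of arbitrary ones.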

  4. Optimization of production and quality control of therapeutic radionuclides and radiopharmaceuticals. Final report of a co-ordinated research project 1994-1998

    International Nuclear Information System (INIS)

    The 'renaissance' of the therapeutic applications of radiopharmaceuticals during the last few years was in part due to a greater availability of radionuclides with appropriate nuclear decay properties, as well as to the development of carrier molecules with improved characteristics. Although radionuclides such as 32P, 89Sr and 131I, were used from the early days of nuclear medicine in the late 1930s and early 1940s, the inclusion of other particle emitting radionuclides into the nuclear medicine armamentarium was rather late. Only in the early 1980s did the specialized scientific literature start to show the potential for using other beta emitting nuclear reactor produced radionuclides such as 153Sm, 166 Ho, 165Dy and 186-188Re. Bone seeking agents radiolabelled with the above mentioned beta emitting radionuclides demonstrated clear clinical potential in relieving intense bone pain resulting from metastases of the breast, prostate and lung of cancer patients. Therefore, upon the recommendation of a consultants meeting held in Vienna in 1993, the Co-ordinated Research Project (CRP) on Optimization of the Production and quality control of Radiotherapeutic Radionuclides and Radiopharmaceuticals was established in 1994. The CRP aimed at developing and improving existing laboratory protocols for the production of therapeutic radionuclides using existing nuclear research reactors including the corresponding radiolabelling, quality control procedures; and validation in experimental animals. With the participation of ten scientists from IAEA Member States, several laboratory procedures for preparation and quality control were developed, tested and assessed as potential therapeutic radiopharmaceuticals for bone pain palliation. In particular, the CRP optimised the reactor production of 153Sm and the preparation of the radiopharmaceutical 153Sm-EDTMP (ethylene diamine tetramethylene phosphonate), as well as radiolabelling techniques and quality control methods for the

  5. Poisson Coordinates.

    Science.gov (United States)

    Li, Xian-Ying; Hu, Shi-Min

    2013-02-01

    Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
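
    For context, the Mean Value coordinates that Poisson coordinates extend can be computed with the classical tangent formula w_i = (tan(α_{i-1}/2) + tan(α_i/2)) / ||v_i − x||, where α_i is the angle at x between v_i and v_{i+1}, followed by normalization. The sketch below implements that MVC formula for a point strictly inside a 2D polygon (the Poisson integral construction itself is not reproduced here).

```python
import math

def mean_value_coords(x, poly):
    """Mean value coordinates of point x strictly inside polygon poly
    (list of 2D vertices). Returns weights that sum to 1 and reproduce
    linear functions: sum(w_i * v_i) == x."""
    def angle_at_x(a, b):
        ax, ay = a[0] - x[0], a[1] - x[1]
        bx, by = b[0] - x[0], b[1] - x[1]
        cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
        return math.acos(max(-1.0, min(1.0, cos_t)))  # clamp for safety

    n = len(poly)
    w = []
    for i in range(n):
        vp, v, vn = poly[(i - 1) % n], poly[i], poly[(i + 1) % n]
        a_prev = angle_at_x(vp, v)          # alpha_{i-1}
        a_next = angle_at_x(v, vn)          # alpha_i
        r = math.hypot(v[0] - x[0], v[1] - x[1])
        w.append((math.tan(a_prev / 2) + math.tan(a_next / 2)) / r)
    s = sum(w)
    return [wi / s for wi in w]
```

    The linear-precision property mentioned in the abstract is easy to check numerically: reconstructing x from the weighted vertices returns x itself.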

  6. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
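
    As a toy stand-in for the mixed integer programming selection (production assemblers use a MIP solver over many content and statistical constraints), the following exhaustively searches 0/1 item selections against a single target-difficulty objective. The item data, the helper name and the objective are illustrative assumptions, not the methodology of the article.

```python
from itertools import combinations

def assemble_test(items, form_length, target_difficulty):
    """Pick exactly `form_length` items whose mean difficulty is closest to
    `target_difficulty`. `items` is a list of (item_id, difficulty) pairs.
    Brute force over 0/1 selections; a MIP solver replaces this at scale."""
    best, best_gap = None, float("inf")
    for combo in combinations(items, form_length):
        mean_diff = sum(d for _, d in combo) / form_length
        gap = abs(mean_diff - target_difficulty)
        if gap < best_gap:
            best, best_gap = combo, gap
    return [item_id for item_id, _ in best]
```

    Item ordering and form formatting, the subject of the article, would then be a second optimization stage over the selected set rather than part of the selection itself.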

  7. Application of an automation system and a supervisory control and data acquisition (SCADA) system for the optimal operation of a membrane adsorption hybrid system.

    Science.gov (United States)

    Smith, P J; Vigneswaran, S; Ngo, H H; Nguyen, H T; Ben-Aim, R

    2006-01-01

    The application of automation and supervisory control and data acquisition (SCADA) systems to municipal water and wastewater treatment plants is rapidly increasing. However, the application of these systems is less frequent in the research and development phases of emerging treatment technologies used in these industries. This study involved the implementation of automation and a SCADA system to the submerged membrane adsorption hybrid system for use in a semi-pilot scale research project. An incremental approach was used in the development of the automation and SCADA systems, leading to the development of two new control systems. The first system developed involved closed loop control of the backwash initiation, based upon a pressure increase, leading to productivity improvements as the backwash is only activated when required, not at a fixed time. This system resulted in a 40% reduction in the number of backwashes required and also enabled optimised operations under unsteady concentrations of wastewater. The second system developed involved closed loop control of the backwash duration, whereby the backwash was terminated when the pressure reached a steady state. This system resulted in a reduction of the duration of the backwash of up to 25% and enabled optimised operations as the foulant build-up within the reactor increased. PMID:16722068
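
    The two closed loops described, pressure-triggered backwash initiation and steady-state-based termination, can be caricatured as a single threshold controller over a stream of transmembrane pressure readings. The thresholds, units and event log below are illustrative assumptions, not the plant's actual set points.

```python
def run_backwash_controller(pressures, trigger_rise=8.0, steady_eps=0.1):
    """Closed-loop backwash control over pressure readings: start a backwash
    once pressure has risen by `trigger_rise` above the post-backwash
    baseline, and stop it when successive readings differ by less than
    `steady_eps` (pressure has reached steady state).
    Returns a log of (reading_index, 'start'|'stop') events."""
    events = []
    baseline = pressures[0]
    prev = pressures[0]
    in_backwash = False
    for i, p in enumerate(pressures[1:], start=1):
        if not in_backwash and p - baseline >= trigger_rise:
            in_backwash = True
            events.append((i, "start"))
        elif in_backwash and abs(p - prev) < steady_eps:
            in_backwash = False
            baseline = p  # new post-backwash baseline
            events.append((i, "stop"))
        prev = p
    return events
```

    Triggering on a pressure rise rather than a fixed timer is what produced the reported 40% reduction in backwash count; terminating at steady state rather than after a fixed duration gave the up to 25% shorter backwashes.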

  8. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based automated process control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  9. Java Implementation based Heterogeneous Video Sequence Automated Surveillance Monitoring

    Directory of Open Access Journals (Sweden)

    Sankari Muthukarupan

    2013-04-01

    Automated video-based surveillance monitoring is an essential and computationally challenging task for securing access to restricted localities. This paper deals with some of the issues encountered when integrating surveillance monitoring into real-life circumstances. We employ video frames extracted from heterogeneous video formats. Each frame is examined to identify anomalous events occurring in a time-driven sequence. Background subtraction is performed against an optimal threshold and a reference frame: the remaining frames are subtracted from the reference image to obtain all foreground image regions. The coordinates in the subtracted images are found by scanning the images horizontally until the first black pixel occurs, and each such coordinate is matched with the corresponding coordinate in the primary image, which is then treated as an active region of interest. Finally, the marked images are converted back into a temporal video that tracks the moving silhouettes of human behavior against a static background. The proposed model is implemented in Java; results and performance analysis are carried out in real-life environments.
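
    The background-subtraction and coordinate-scanning steps can be sketched as follows (in Python for consistency with the other sketches here, although the paper's system is implemented in Java). Grayscale frames are assumed to be nested lists of pixel intensities, and the threshold value is an illustrative assumption.

```python
def foreground_mask(frame, reference, threshold=30):
    """Per-pixel absolute difference against the reference frame,
    thresholded to a binary mask (1 = foreground)."""
    return [[1 if abs(p - r) > threshold else 0
             for p, r in zip(frow, rrow)]
            for frow, rrow in zip(frame, reference)]

def first_foreground_coordinate(mask):
    """Scan each row left to right, top to bottom, and return the (x, y)
    coordinate of the first foreground pixel, or None if the mask is empty."""
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                return (x, y)
    return None
```

    The returned coordinate is what gets matched back against the primary frame to mark the active region of interest.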

  10. Practically Coordinating

    OpenAIRE

    Durfee, Edmund H.

    1999-01-01

    To coordinate, intelligent agents might need to know something about themselves, about each other, about how others view themselves and others, about how others think others view themselves and others, and so on. Taken to an extreme, the amount of knowledge an agent might possess to coordinate its interactions with others might outstrip the agent's limited reasoning capacity (its available time, memory, and so on). Much of the work in studying and building multiagent systems has thus been dev...

  11. Variable-Objective Optimization Strategy of Regional AVC Based on Provincial and Regional Coordination

    Institute of Scientific and Technical Information of China (English)

    董岳昕; 杨洪耕

    2011-01-01

    Considering the requirements of coordinated control between provincial and regional automatic voltage control (AVC) systems, a variable-objective optimization method is presented. In coordinated control mode, when the regional AVC provides reserve gateway reactive power capacity to the provincial AVC, the optimization objective is the maximum reactive power that can be switched in or out. After receiving the adjustable range of gateway power factors issued by the provincial grid, the regional AVC balances reactive power flow against voltage quality and selects feasible-region optimization, convergent optimization or relaxation optimization, changing the objective function to achieve the coordination goals of different situations. In case of communication interruption or other faults, the regional AVC automatically switches to an autonomous decentralized control mode with a stage-by-stage optimization strategy. The proposed method has been applied to a real power grid; operating results indicate that it further optimizes reactive power flow, improves voltage quality and is practical for engineering use.

  12. Advertising and Coordination

    OpenAIRE

    1990-01-01

    We show that when relevant market information such as price is difficult to communicate, advertising plays a key role in bringing about optimal coordination of purchase behavior: an efficient firm uses advertising expenditures in place of price to inform sophisticated consumers that it offers a better deal. This provides a theoretical explanation for Benham's (1972) empirical association of the ability to advertise with lower prices and larger scale. We find that advertising improves welfare ...

  13. Coordinated unbundling

    DEFF Research Database (Denmark)

    Timmermans, Bram; Zabala-Iturriagagoitia, Jon Mikel

    2013-01-01

    not focused on the role this policy instrument can play in the promotion of (knowledge-intensive) entrepreneurship. This paper investigates this link in more detail and introduces the concept of coordinated unbundling as a strategy that can facilitate this purpose. We also present a framework on how...

  14. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  15. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level of industrial rare-earth extraction processes results in high production cost, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare-earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for the output component, and the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, are discussed. The system has been successfully applied to a HAB yttrium extraction production process, where it provides optimal control, optimal operation, optimal management and remarkable benefits.

  16. Designing Institutions for International Monetary Policy Coordination

    OpenAIRE

    Padilla, Atilano Jorge

    1995-01-01

    In this paper we study the adjustment of a N-country world economy to an unfavourable common supply shock. We show that world-wide monetary policy coordination is essential to achieve an optimal adjustment to the common shock, but that its actual implementation requires careful design to ensure that each country finds it optimal to join and to remain faithful to the coordination agreement. We then construct alternative coordination mechanisms which implement the first-best response to the com...

  17. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

Overview From a technical perspective, CMS has been in “beam operation” state since 6th November. The detector is fully closed with all components operational and the magnetic field is normally at the nominal 3.8T. The UXC cavern is normally closed with the radiation veto set. Access to UXC is now only possible during downtimes of LHC. Such accesses must be carefully planned, documented and carried out in agreement with CMS Technical Coordination, Experimental Area Management, LHC programme coordination and the CCC. Material flow in and out of UXC is now strictly controlled. Access to USC remains possible at any time, although, for safety reasons, it is necessary to register with the shift crew in the control room before going down. It is obligatory for all material leaving UXC to pass through the underground buffer zone for RP scanning, database entry and appropriate labeling for traceability. Technical coordination (notably Stephane Bally and Christoph Schaefer), the shift crew and run ...

  18. Automated drafting system uses computer techniques

    Science.gov (United States)

    Millenson, D. H.

    1966-01-01

    Automated drafting system produces schematic and block diagrams from the design engineer's freehand sketches. The system codes conventional drafting symbols and their coordinate locations on standard-size drawings for entry on tapes that are used to drive a high-speed photocomposition machine.

  19. Towards reduction of Paradigm coordination models

    NARCIS (Netherlands)

    Andova, S.; Groenewegen, L.P.J.; Vink, E.P. de; Aceto, L.; Mousavi, M.R.

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at a detailed and a global level of abstraction. To enable automated verification of Paradigm models, a tran

  20. Coordinated optimization of automotive suspension based on steering stability

    Institute of Scientific and Technical Information of China (English)

    刘伟; 史文库; 郭福祥; 沈志宏; 方德广

    2011-01-01

    A mathematical model of an automotive magneto-rheological damper was built based on performance tests. A multi-body dynamics model and a virtual proving ground for a passenger car were established using the multi-body dynamics simulation software SIMPACK. Considering the impact of the semi-active suspension system (SASS) and the steering system on vehicle steering stability, a coordinated controller based on a PID neural network was designed; the data exchange interface between the SASS coordinated controller and the vehicle multi-body dynamics model was defined in the MATLAB environment, and coordinated control of the vehicle's SASS and steering system was achieved. Simulation results show that the PID neural network coordinated control strategy can reduce the vibration transmitted through the suspension system and adjust the yaw and roll motions of the car body under steering conditions, mitigating the trade-off between the steering stability and ride comfort of the vehicle.

  1. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

    2010-01-01

    Operational Experience At the end of the first full-year running period of LHC, CMS is established as a reliable, robust and mature experiment. In particular common systems and infrastructure faults accounted for <0.6 % CMS downtime during LHC pp physics. Technical operation throughout the entire year was rather smooth, the main faults requiring UXC access being sub-detector power systems and rack-cooling turbines. All such problems were corrected during scheduled technical stops, in the shadow of tunnel access needed by the LHC, or in negotiated accesses or access extensions. Nevertheless, the number of necessary accesses to the UXC averaged more than one per week and the technical stops were inevitably packed with work packages, typically 30 being executed within a few days, placing a high load on the coordination and area management teams. It is an appropriate moment for CMS Technical Coordination to thank all those in many CERN departments and in the Collaboration, who were involved in CMS techni...

  2. JAERI contribution for the IAEA coordinated research program phase III (CRP-3) on 'optimizing of reactor pressure vessel surveillance programmes and their analysis'

    International Nuclear Information System (INIS)

    As part of an IAEA coordinated research program on the irradiation embrittlement of RPV steels, JAERI performed an irradiation study using seven materials with varying copper and nickel contents. Irradiation of the test specimens was conducted in the JMTR at JAERI/Oarai, and post-irradiation tests, including fracture toughness, were done at the hot laboratory at JAERI/Tokai. The following conclusions were drawn: (1) With increasing Ni content from 0.1 to 1.18 wt%, radiation hardening and therefore embrittlement became larger. (2) The increase in yield stress can be correlated with the hardness increase and the Charpy transition temperature shift. (3) In the upper shelf region, the irradiation response of fracture toughness was well correlated with radiation hardening, while no good correlation was found between the Charpy energy decrease and radiation hardening. (author)

  3. Reliability Evaluation of Microgrids Considering Coordinative Optimization of Loads and Storage Devices

    Institute of Scientific and Technical Information of China (English)

    别朝红; 李更丰; 谢海鹏

    2014-01-01

    For microgrids, the coordinative optimization of loads and storage devices is an important approach to promoting the utilization of renewable energy and economic operation. However, the optimization directly changes customers' consumption habits and the operating status of storage devices, and thereby affects the reliability of microgrids. This paper establishes optimization models for loads and storages and integrates them into the reliability evaluation of microgrids using a sequential Monte Carlo method, so that the coordinative optimization is taken into account in the evaluation. In addition, new reliability indices are defined to quantify the impacts of the optimization on the reliability of customers, storage devices and microgrids. Extensive results indicate the effectiveness and accuracy of the presented models and methods. With the load and storage optimization considered, the evaluation better reflects actual microgrid operation, and the results are of significance for making microgrid operation plans and improving the reliability and economy of microgrids.
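
    The sequential Monte Carlo element of such an evaluation can be illustrated for a single component with exponentially distributed up and down durations; the failure/repair rates and horizon below are illustrative assumptions, and the paper's full method additionally models loads and storage dispatch on top of the sampled component states.

```python
import random

def sequential_mc_reliability(failure_rate, repair_rate, horizon=8760.0,
                              samples=2000, seed=3):
    """Estimate component unavailability by sampling alternating exponential
    up/down durations chronologically over a time horizon (hours), the core
    of a sequential Monte Carlo reliability evaluation."""
    rng = random.Random(seed)
    down_total = 0.0
    for _ in range(samples):
        t, up = 0.0, True  # each sample year starts in the up state
        while t < horizon:
            dur = rng.expovariate(failure_rate if up else repair_rate)
            dur = min(dur, horizon - t)  # truncate at the horizon
            if not up:
                down_total += dur
            t += dur
            up = not up
    return down_total / (samples * horizon)
```

    For rates λ and μ the analytic steady-state unavailability is λ/(λ+μ), which the sampled estimate approaches as the horizon and sample count grow; reliability indices for loads, storage and the microgrid as a whole are then accumulated over the same sampled chronologies.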

  4. Optimal scan techniques for dynamic gadolinium-enhanced MR imaging of the liver. Usefulness of test injection method and automated bolus tracking technique

    International Nuclear Information System (INIS)

    This study was performed to determine optimal scan techniques and important factors in dynamic gadolinium-enhanced MR imaging of the liver. The influence of a test bolus injection was compared with that of automatic triggering in the same cases. One hundred patients were enrolled in the whole-liver dynamic study using a 1.5 T MR system. MR imaging using a power injector was performed in 85 cases, while manual injection was used in the other 15 cases. A fast spoiled GRASS (FSPGR) or enhanced 3D fast gradient echo (efgre3d) pulse sequence was used for the study. MR SmartPrep, a computer-assisted bolus tracking technique, was performed in 37 of the 57 cases with efgre3d. Both the test bolus and MR SmartPrep were used in 10 of the 57 cases. Hepatic arterial phase images were graded into three levels of scan timing. Timing was judged optimal when a faint depiction of the portal vein was present in addition to depiction of the second-order branches of the hepatic artery. The best timing was found with efgre3d at 3 ml/s and SmartPrep with an acquisition delay time of 10 s. The mean delay from the initiation of contrast agent administration to the beginning of arterial phase scanning was 17.0±4.71 s (mean±SD, range 10-29 s). Although similar results might be achieved with the test injection method, it requires complicated steps. These results indicate that optimal timing for the hepatic arterial phase of dynamic MR imaging can be achieved, and that the delay time can be set for individual cases with the power injector. (author)

  5. 24-Hour Comprehensive Reactive Power Optimization and Coordination Based on the Particle Swarm Algorithm

    Institute of Scientific and Technical Information of China (English)

    陈兰芝; 王克文

    2016-01-01

    For 24-hour coordinated reactive power optimization in a power system, a particle swarm optimization algorithm combined with a penalty function method is used, bringing all inequality constraints into the original objective function as penalty terms. The optimization goal is to minimize the total economic cost over the day, and the procedure consists of two stages: static optimization and comprehensive optimization. Based on on-line load forecasts, the switching states of the shunt capacitor banks and the transformer tap positions at the 24 hourly intervals are determined. Applying the particle swarm algorithm to this multi-objective reactive power optimization problem effectively reduces active power losses and the cost of reactive power compensation, while offering good convergence behavior, fast convergence and good stability.
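    The penalty-function idea described above, folding all inequality constraints into the objective before running particle swarm optimization, can be sketched on a toy problem. Everything here (the 2-D objective, the single constraint and the PSO parameters) is an illustrative assumption rather than the paper's 24-hour dispatch model:

```python
import random

def pso_penalty(f, g, n_particles=30, iters=200, bounds=(-5.0, 5.0),
                penalty=1e3, seed=1):
    """Particle swarm optimization with a static penalty term.

    Minimizes f(x) + penalty * sum(max(0, g_i(x))^2), mirroring the
    abstract's approach of bringing inequality constraints into the
    objective as penalty terms.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    dim = 2

    def cost(x):
        return f(x) + penalty * sum(max(0.0, gi) ** 2 for gi in g(x))

    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pcost = [cost(x) for x in xs]
    gbest = list(pbest[min(range(n_particles), key=lambda i: pcost[i])])
    gcost = min(pcost)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            c = cost(xs[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = list(xs[i]), c
                if c < gcost:
                    gbest, gcost = list(xs[i]), c
    return gbest, gcost

# Toy problem: min x1^2 + x2^2  s.t.  x1 + x2 >= 1  (optimum near (0.5, 0.5))
best, best_cost = pso_penalty(lambda x: x[0] ** 2 + x[1] ** 2,
                              lambda x: [1.0 - x[0] - x[1]])
```

    The same cost-wrapping pattern extends to capacitor switching states and tap positions, though those are discrete variables and would need a discrete PSO variant.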

  6. RUN COORDINATION

    CERN Multimedia

    C. Delaere

    2012-01-01

    With the analysis of the first 5 fb⁻¹ culminating in the announcement of the observation of a new particle with mass of around 126 GeV/c², the CERN directorate decided to extend the LHC run until February 2013. This adds three months to the original schedule. Since then the LHC has continued to perform extremely well, and the total luminosity delivered so far this year is 22 fb⁻¹. CMS also continues to perform excellently, recording data with efficiency higher than 95% for fills with the magnetic field at nominal value. The highest instantaneous luminosity achieved by LHC to date is 7.6×10³³ cm⁻²s⁻¹, which translates into 35 interactions per crossing. On the CMS side there has been a lot of work to handle these extreme conditions, such as a new DAQ computer farm and trigger menus to handle the pile-up, automation of recovery procedures to minimise the lost luminosity, better training for the shift crews, etc. We did suffer from a couple of infrastructure ...

  7. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    Science.gov (United States)

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel and subsequently screen in small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In establishing our pipeline, emphasis was put on streamlining the processes so that they can be easily, but not necessarily, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner. PMID:24647677

  8. Coordination Capacity

    CERN Document Server

    Cuff, Paul; Cover, Thomas

    2009-01-01

    We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates between the nodes, we ask what is the set of all achievable joint distributions p(x1, ..., xm) of actions at the nodes on the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.

  9. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on an understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation, are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  10. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system ..., (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints of a robot, a production cell, a production line and the final product. The case study results illustrate that, depending on the actor and the level he/she acts at, sustainability and the actions that can be taken to contribute to a more sustainable product are perceived differently: even though the robot...

  11. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    The article presents concepts and definitions from different sources concerning automation. The work approaches automation in light of the author's experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet's thoughts about and approaches to the problem of automation and its current state are examined, especially those referring to the problem of reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  12. Optimization and studies of the welding processes, automation of the sealing welding system and fracture mechanics in the vessels surveillance in nuclear power plants

    International Nuclear Information System (INIS)

    This work describes the optimization of two welding systems, as well as the completion of a container-sealing qualification system at the National Institute of Nuclear Research, with application in the surveillance programs of nuclear reactor vessels and the corresponding extension of their operating licenses. Charpy specimens are tested to evaluate the degree of embrittlement, obtained from the increase in the reference temperature and the decrease of the maximum absorbed energy in the ductile-brittle transition curve of the material. After the test, two specimen halves remain that can be reused to continue the surveillance of the vessel and its possible license extension; this is achieved by reconstitution (obtaining two reconstituted specimens from one tested specimen). The welding system for the reconstitution of Charpy specimens was optimized by reducing the joining force during welding, eliminating rejection due to lack of penetration caused by spill. For this work, temperature measurements were carried out at distances from 1 to 12 mm from the welding interface, yielding temperature profiles. From the maximum temperatures, a graph and an equation representing the maximum temperature as a function of distance from the interface were obtained, in practice eliminating the need for further temperature measurements. The reconstituted specimens were placed inside containers pressurized with ultra-high-purity helium at 1 atmosphere. This was done in the container-sealing welding system, where an automatic process was implemented through an application developed in LabVIEW, reducing operation times and allowing remote control of the process, acquisition of parameters and generation of welding reports, thereby avoiding human error. (Author)
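    The step from measured temperature profiles to a single "maximum temperature vs. distance" equation can be sketched as an ordinary least-squares fit. The exponential-decay form and the data points below are hypothetical placeholders, since the abstract gives neither the measured values nor the fitted equation:

```python
import math

def fit_exp_decay(ds, ts):
    """Least-squares fit of T(d) = a * exp(-b * d), linearized to
    log T = log a - b * d. A sketch of how a maximum-temperature curve
    like the one described could be condensed into one equation."""
    n = len(ds)
    ys = [math.log(t) for t in ts]
    dbar = sum(ds) / n
    ybar = sum(ys) / n
    num = sum((d - dbar) * (y - ybar) for d, y in zip(ds, ys))
    den = sum((d - dbar) ** 2 for d in ds)
    slope = num / den                      # = -b in the linearized model
    a = math.exp(ybar - slope * dbar)
    return a, -slope                       # T(d) ≈ a * exp(-b * d)

# Hypothetical profile (made-up points, not the authors' measurements):
ds = [1, 2, 4, 8, 12]
ts = [900.0 * math.exp(-0.2 * d) for d in ds]
a, b = fit_exp_decay(ds, ts)
```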

  13. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. PMID:26806135

  14. New Coordinate Optimization Method for Non-Smooth Losses Based on the Alternating Direction Method of Multipliers

    Institute of Scientific and Technical Information of China (English)

    高乾坤; 王玉军; 王惊晓

    2013-01-01

    The alternating direction method of multipliers (ADMM) already has practical applications in machine learning. To adapt it to large-scale data processing and convex optimization problems with non-smooth losses, the original batch ADMM algorithm is improved using the mirror descent method, and a new coordinate optimization algorithm for solving non-smooth-loss convex optimization is proposed. The new algorithm is simple to implement and computationally efficient. A detailed theoretical analysis verifies its convergence and shows that it achieves the currently optimal convergence rate under general convexity. Finally, experimental comparisons with state-of-the-art algorithms demonstrate that it attains a better convergence rate while preserving the sparsity of the solution.
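    For reference, the classical batch ADMM that the abstract takes as its starting point can be written out for the lasso, a standard non-smooth-loss problem. This is the textbook algorithm, not the paper's mirror-descent coordinate variant:

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Solve min 0.5*||Ax - b||^2 + lam*||x||_1 by standard batch ADMM
    on the split x = z."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA = A.T @ A
    Atb = A.T @ b
    L = np.linalg.inv(AtA + rho * np.eye(n))   # cache the factor for reuse
    for _ in range(iters):
        x = L @ (Atb + rho * (z - u))          # quadratic x-update
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft threshold
        u = u + x - z                          # dual ascent on x - z
    return z

# With A = I the lasso solution is soft-thresholding of b,
# so the result can be checked by hand: approximately [2, 0, 1].
A = np.eye(3)
b = np.array([3.0, -0.5, 2.0])
z = admm_lasso(A, b, lam=1.0)
```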

  15. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  16. RUN COORDINATION

    CERN Document Server

    C. Delaere

    2013-01-01

    Since the LHC ceased operations in February, a lot has been going on at Point 5, and Run Coordination continues to monitor closely the advance of maintenance and upgrade activities. In the last months, the Pixel detector was extracted and is now stored in the pixel lab in SX5; the beam pipe has been removed and ME1/1 removal has started. We regained access to the vactank and some work on the RBX of HB has started. Since mid-June, electricity and cooling are back in S1 and S2, allowing us to turn equipment back on, at least during the day. 24/7 shifts are not foreseen in the next weeks, and safety tours are mandatory to keep equipment on overnight, but re-commissioning activities are slowly being resumed. Given the (slight) delays accumulated in LS1, it was decided to merge the two global runs initially foreseen into a single exercise during the week of 4 November 2013. The aim of the global run is to check that we can run (parts of) CMS after several months switched off, with the new VME PCs installed, th...

  17. RUN COORDINATION

    CERN Multimedia

    Christophe Delaere

    2013-01-01

    The focus of Run Coordination during LS1 is to monitor closely the advance of maintenance and upgrade activities, to smooth interactions between subsystems and to ensure that all are ready in time to resume operations in 2015 with a fully calibrated and understood detector. After electricity and cooling were restored to all equipment, at about the time of the last CMS week, recommissioning activities were resumed for all subsystems. On 7 October, DCS shifts began 24/7 to allow subsystems to remain on to facilitate operations. That culminated with the Global Run in November (GriN), which took place as scheduled during the week of 4 November. The GriN has been the first centrally managed operation since the beginning of LS1, and involved all subdetectors but the Pixel Tracker, presently in a lab upstairs. All nights were therefore dedicated to long stable runs with as many subdetectors as possible. Among the many achievements in that week, three items may be highlighted. First, the Strip...

  18. Configuration Management Automation (CMA)

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  19. Optimizing Installation and Operation Properties of an AUV-Mounted Swath Sonar Sensor for Automated Marine Gas Seep Detection - a Modelling Approach

    Science.gov (United States)

    Wenau, S.; Fei, T.; Tóth, Z.; Keil, H.; Spiess, V.; Kraus, D.

    2014-12-01

    The detection of gas bubble streams in the water column by single- and multibeam sonars has been a common procedure in the research of marine seep sites. In the framework of the development of an AUV capable of automatic detection and sampling of gas bubble streams, such acoustic flares were modelled in MATLAB routines to assess the optimal sonar configuration for flare detection. The AUV development (IMGAM project) is carried out as a cooperation between the company ATLAS Hydrographic and MARUM at the University of Bremen. The combination of sensor inclination, sonar carrier frequency and pulse characteristics affects the ability of the system to detect bubble streams of different sizes and intensities. These variations in the acoustic signal returned from gas bubble streams depending on acquisition parameters can affect the detectability and acoustic properties of recorded acoustic flares in various seepage areas of the world's oceans. We show several examples of acoustic signatures of previously defined bubble streams under varying acquisition parameters and document the effects of changing sensor parameters on detection efficiency.

  20. Purchasing Optimization and Decision of Supply Chain Based on Flexible Contract Coordination

    Institute of Scientific and Technical Information of China (English)

    朱珠; 朱云龙

    2012-01-01

    To maximize the profits of the supply-chain system and of each entity on it, a coordination model was established based on a flexible coordination mechanism using option contracts, and its influence on the performance of a supply chain composed of multiple suppliers and a core manufacturer was analyzed. Because the manufacturer has strong bargaining power in this many-to-one supply chain structure, a Stackelberg game model was constructed with the manufacturer as leader and the suppliers as followers. Optimal decisions under centralized and decentralized control were then analyzed. It is shown that the manufacturer can design appropriate contract parameters to encourage the suppliers' efforts while the performance of the supply chain achieves a Pareto improvement. Finally, the validity of the flexible coordination mechanism is illustrated with a numerical example. The results also show that the manufacturer cannot set the initial order arbitrarily under this coordination mechanism, or coordination may fail.
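    The leader-follower structure of such a Stackelberg model can be sketched with a deliberately simple wholesale-price game solved by backward induction. The linear demand curve and all parameter values are illustrative assumptions, not the paper's option-contract model:

```python
def follower_quantity(w, a=100.0, b=2.0):
    """Follower best response: linear demand q(w) = a - b*w (toy model)."""
    return max(0.0, a - b * w)

def leader_profit(w, c=10.0, a=100.0, b=2.0):
    """Leader earns margin (w - c) on the follower's chosen quantity."""
    return (w - c) * follower_quantity(w, a, b)

def best_leader_price(c=10.0, a=100.0, b=2.0, grid=10000):
    """Backward induction: embed the follower's response in the leader's
    objective, then search the leader's decision over a grid."""
    lo, hi = c, a / b
    ws = [lo + (hi - lo) * i / grid for i in range(grid + 1)]
    return max(ws, key=lambda w: leader_profit(w, c, a, b))

# Analytic optimum for these parameters: w* = (a/b + c)/2 = 30.
w_star = best_leader_price()
```

    Replacing the grid search with the first-order condition of the leader's anticipated profit gives the same answer analytically; richer models (option prices, exercise prices, multiple followers) keep the same embed-then-optimize structure.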

  1. Chinese geodetic coordinate system 2000

    Institute of Scientific and Technical Information of China (English)

    YANG YuanXi

    2009-01-01

    The basic strategies in establishing the Chinese geodetic coordinate system 2000 are summarized, including the definition of the coordinate system, the structure of the terrestrial reference frame, the functional and stochastic models involved in the realization of the reference frame, and the improvements of the adjustment procedures. First, the fundamental frame of the coordinate system is composed of the permanent GPS tracking network in China, which is integrated with the international GPS service stations by combined adjustment in order to guarantee consistency between the international terrestrial reference system and the Chinese geodetic coordinate system. Second, the extended frame of the coordinate system is composed of the unified 2000 national GPS network, which integrates six nationwide GPS networks with more than 2500 stations under the control of the fundamental frame. Third, the densified frame is composed of the national astronomical geodetic network with nearly 50 thousand stations, which was updated by combined adjustment with the 2000 national GPS network; the datum of the national astronomical geodetic network has thus been unified and its precision greatly improved. By the optimal data fusion method, the influences of datum errors, systematic errors and outliers in the separate geodetic networks are weakened in the unified Chinese geodetic coordinate frame. The significance of applying the new geodetic coordinate system and the remaining problems in the reference frame are described and analyzed.

  2. Multi-Agent Systems for Transportation Planning and Coordination

    NARCIS (Netherlands)

    J.M. Moonen (Hans)

    2009-01-01

    textabstractMany transportation problems are in fact coordination problems: problems that require communication, coordination and negotiation to be optimally solved. However, most software systems targeted at transportation have never approached it this way, and have instead concentrated on centrali

  3. Optimal Damping Control Systems Based on Coordination Between Units and Grid

    Institute of Scientific and Technical Information of China (English)

    李淼; 胡伟; 陆秋瑜; 姚浩威; 李小平; 李大虎

    2012-01-01

    With the constant expansion of power systems, small-signal stability has become one of the important factors affecting power system security and stability. In order to effectively damp small-disturbance oscillations, this paper presents a damping control method based on coordinated optimization between generator units and the power grid. Drawing on the ideas and concepts of hybrid control systems, and considering the coordination between the automatic generation control (AGC) and automatic voltage control (AVC) systems, a hierarchical coordinated optimal damping control (CODC) system is established. The CODC system uses the damping strength of the power system as the indicator for event formation, and uses discrete event-driven triggering as its adjustment mechanism. The CODC system not only eliminates weakly damped modes but also effectively overcomes the negative interaction between traditional AGC and AVC systems caused by the coupling of active and reactive power, achieving optimized coordination of the control instructions of the two automatic control systems. Simulations under multiple operating modes on the IEEE 5-machine 14-bus system verify the correctness and effectiveness of the proposed method.

  4. Application of TRIZ Theory in Optimal Design for Automated Guided Vehicle

    Institute of Scientific and Technical Information of China (English)

    彭开元; 叶际隆; 方春平; 于源

    2014-01-01

    TRIZ theory (the theory of inventive problem solving) is used to provide problem-solving methods and directions for structural innovation in product design. Based on an analysis of the characteristics of the automated guided vehicle, and using the contradiction matrix and invention principles, optimal design strategies are proposed to improve its moving stability, turning reliability and gear installation precision, and to reduce its energy consumption. In addition, substance-field modeling is introduced to improve the guiding precision of its guide wheel.

  5. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  6. Automation and Integration of Dimensional Inspection Using Coordinate Measurement Machines

    Directory of Open Access Journals (Sweden)

    E. Cuesta

    2004-01-01

    This paper describes in detail the phases of the development of an inspection planning system oriented toward inspection with coordinate measurement machines (CMM) and its integration with the flow of information through the whole production process. The system is based on information exchange using the new STEP format. The information transmitted in STEP covers most of the variables involved in the process, from the geometry of the part and its physical and technological characteristics, to the type of measuring machine and probe, including all the quality control information and the production management requirements. The development of the system required customization and reprogramming of both the CMM software and a CAD/CAM package, using both applications to integrate CMM inspection with computer numerical control (CNC) manufacturing technologies.

  7. Unit Coordinated Control System Tuning Based on a Heuristic Optimization Algorithm

    Institute of Scientific and Technical Information of China (English)

    刘吉臻; 谢谢; 曾德良; 闫姝; 刘一民

    2012-01-01

    To improve the dynamic performance and robustness of a generating unit as load varies over a wide range, this paper proposes a tuning scheme for the coordinated control system of a boiler-turbine unit based on the heuristic Kalman algorithm. Tuning is cast as an optimization problem that takes the regulating performance of load and main steam pressure as the objective and treats robustness requirements as constraint penalties. The heuristic Kalman algorithm is used to optimize the control system parameters, and two management regulations of the North China Grid are applied to evaluate the performance of the coordinated control system. Simulation results show that with the designed control system, the main steam pressure fluctuates less during wide-range load changes, robustness is good, and the AGC performance meets current grid requirements.
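    Casting controller tuning as "simulate, score, search" can be sketched as follows. A proportional gain on a first-order plant stands in for the coordinated boiler-turbine controller, and plain random search stands in for the heuristic Kalman algorithm; all models and parameters are illustrative assumptions:

```python
import random

def step_response_cost(k, beta=0.01, tau=1.0, r=1.0, dt=0.01, t_end=10.0):
    """Score a proportional gain k on a first-order plant (toy model):
    integrated squared tracking error plus a control-effort penalty,
    the kind of scalar objective a heuristic optimizer can search."""
    x, j = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = r - x
        u = k * e                       # proportional control law
        j += (e * e + beta * u * u) * dt
        x += dt * (-x + u) / tau        # Euler step of the plant
    return j

def random_search(cost, lo, hi, samples=2000, seed=3):
    """Stand-in for the heuristic Kalman algorithm: plain random search
    over the gain interval (an assumption, not the HKA update rule)."""
    rng = random.Random(seed)
    best_k, best_j = lo, cost(lo)
    for _ in range(samples):
        k = rng.uniform(lo, hi)
        j = cost(k)
        if j < best_j:
            best_k, best_j = k, j
    return best_k, best_j

k_best, j_best = random_search(step_response_cost, 0.1, 50.0)
```

    The robustness constraint from the abstract would enter the same way the control-effort term does here: as an additional penalty added to the simulated cost whenever the candidate parameters violate the robustness index.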

  8. Shoe-String Automation

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  9. Fast Solution for Optimal Coordinated Voltage Control Using Multifrontal Method

    Institute of Scientific and Technical Information of China (English)

    郑文杰; 刘明波

    2011-01-01

    A differential-algebraic equation optimization model is used to describe the optimal coordinated voltage control problem in the long-term voltage stability scenario. This dynamic optimization problem is converted into a large-scale nonlinear programming model using the Radau collocation method, and the nonlinear primal-dual interior-point method is then applied to solve it. The paper focuses on applying the multifrontal method, combined with approximate minimum degree ordering, to enhance the efficiency of solving the sparse linear correction equations. The IEEE 17-generator 162-bus test system and the New England 10-generator 39-bus system are used to verify the computational speed advantage of the proposed method over triangular decomposition with other orderings such as approximate minimum degree alone and reverse Cuthill-McKee.
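The effect of pairing a sparse direct solver with a fill-reducing ordering can be tried with SciPy's SuperLU interface (a supernodal rather than multifrontal solver, used here only to illustrate ordering choices such as COLAMD, an approximate-minimum-degree variant). The test matrix is a random stand-in for the sparse correction equations:

```python
import numpy as np
from scipy.sparse import random as sprand, eye
from scipy.sparse.linalg import splu

# Build a sparse, diagonally dominant test matrix (hence nonsingular),
# standing in for the interior-point correction system.
rng = np.random.default_rng(1)
n = 300
A = sprand(n, n, density=0.02, random_state=1)
A = (A + A.T + n * eye(n)).tocsc()
b = rng.standard_normal(n)

# Factor with a fill-reducing column ordering versus the natural ordering,
# then compare the fill-in of the LU factors.
lu_amd = splu(A, permc_spec="COLAMD")
lu_nat = splu(A, permc_spec="NATURAL")
x = lu_amd.solve(b)

print("residual:", np.linalg.norm(A @ x - b))
print("nnz(L+U) with COLAMD :", lu_amd.L.nnz + lu_amd.U.nnz)
print("nnz(L+U) natural     :", lu_nat.L.nnz + lu_nat.U.nnz)
```

On structured power-system matrices the fill-reducing ordering typically produces far sparser factors, which is the speedup the paper exploits.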

  10. Image segmentation for automated dental identification

    Science.gov (United States)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are one of few biometric identifiers that qualify for postmortem identification; therefore, creation of an Automated Dental Identification System (ADIS) with goals and objectives similar to the Automated Fingerprint Identification System (AFIS) has received increased attention. As part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on theoretical and empirical bases and compare its performance with that of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  11. Coordinated optimization between regional and provincial grids for regional generation right trade

    Institute of Scientific and Technical Information of China (English)

    于琪; 张晶; 王宣元; 张粒子

    2013-01-01

    Two modes of coordinated optimization between regional and provincial grids are proposed for regional generation right trade: layered optimization and integrated region-wide optimization. Their organizational forms are introduced, their features and applicability are qualitatively analyzed, their economic mechanisms are examined from an economic perspective both with and without transmission congestion, and their market efficiencies are compared. The conclusions are as follows: the total social welfare of the integrated optimization mode is greater than or equal to that of the layered optimization mode when there is transmission congestion, and strictly greater when there is none; the layered optimization mode should be adopted in the initial stage of market construction to transition generation right trade from provincial markets to the regional market; and the integrated optimization mode, which avoids the efficiency loss incurred in coordination between regional and provincial grids, should be used to continuously improve the regional market for generation right trade. A numerical example verifies these conclusions.

  12. EPOS for Coordination of Asynchronous Sensor Webs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop, integrate, and deploy software-based tools to coordinate asynchronous, distributed missions and optimize observation planning spanning simultaneous...

  13. Predictive optimal control for coordinated system of a supercritical power unit

    Institute of Scientific and Technical Information of China (English)

    马良玉; 高志元

    2014-01-01

    To meet power grid load demands, supercritical units often operate under wide-range load-changing conditions, for which the conventional coordinated control strategy is poorly suited: load response is slow and parameters such as main steam pressure fluctuate widely. A model predictive optimal control (MPOC) method for coordinated control of supercritical units is therefore proposed, based on a neural network model and the particle swarm optimization (PSO) algorithm. The MPOC scheme was implemented in MATLAB and tested through extensive control simulation experiments on the full-scope simulator of a 600 MW supercritical power generating unit. The simulation results show that the proposed method greatly improves load response speed while keeping main steam pressure within safe limits.
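A minimal sketch of the PSO half of such a scheme: a plain global-best particle swarm optimizes a control sequence over a short horizon for a toy first-order plant, which stands in for the paper's neural-network unit model:

```python
import numpy as np

def pso(cost, dim, n=30, iters=150, lo=-1.0, hi=1.0, seed=0):
    """Plain global-best particle swarm optimization."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)           # respect actuator limits
        c = np.array([cost(p) for p in x])
        improved = c < pcost
        pbest[improved], pcost[improved] = x[improved], c[improved]
        g = pbest[pcost.argmin()]
    return g

# Toy predictive-control problem: a first-order plant tracked to a load
# setpoint over a 10-step horizon, with a small control-effort penalty.
def horizon_cost(u, x0=0.0, setpoint=1.0, a=0.8, b=0.5):
    x, cost = x0, 0.0
    for uk in u:
        x = a * x + b * uk                   # one-step plant prediction
        cost += (x - setpoint)**2 + 0.01 * uk**2
    return cost

u_opt = pso(horizon_cost, dim=10)
```

In a receding-horizon loop only the first element of `u_opt` would be applied before re-optimizing, which is the usual MPC pattern.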

  14. Quantum artificial bee colony optimization algorithm based on Bloch coordinates of quantum bit

    Institute of Scientific and Technical Information of China (English)

    易正俊; 何荣花; 侯坤

    2012-01-01

    To address the slow convergence and susceptibility to local optima of the artificial bee colony (ABC) algorithm, a new quantum optimization algorithm is proposed that combines quantum theory with the ABC algorithm. The algorithm encodes food sources using the Bloch coordinates of quantum bits, which expands the number of global optimal solutions and improves the probability of reaching one; food sources are then updated by quantum rotation gates. A new method is put forward for determining the relationship between the two rotation phases of the rotation gate: it is proved that when the ABC algorithm searches equal areas on the Bloch sphere, the two rotation phases are approximately inversely proportional. This avoids the non-uniformity of blind fixed-phase rotation and makes the search regular when approaching the optimal solutions. Experiments on two typical optimization problems show that the algorithm is superior to the common quantum artificial bee colony (QABC) and the basic ABC in both search capability and optimization efficiency.
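The Bloch-coordinate encoding can be illustrated in a few lines: each qubit's three Bloch coordinates, scaled to the search interval, yield three candidate solutions per swarm member, and a rotation-gate-style update nudges the angles toward a best-so-far source. All concrete values here are invented for illustration:

```python
import numpy as np

def bloch_solutions(theta, phi, lo, hi):
    """Each qubit's Bloch coordinates (x, y, z) lie in [-1, 1]; scaling
    each axis to [lo, hi] yields three candidate solutions per member."""
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    scale = lambda c: lo + (c + 1.0) * (hi - lo) / 2.0
    return scale(x), scale(y), scale(z)

rng = np.random.default_rng(0)
dim = 4                                     # one qubit per problem dimension
theta = rng.uniform(0, np.pi, dim)
phi = rng.uniform(0, 2 * np.pi, dim)
sx, sy, sz = bloch_solutions(theta, phi, lo=-5.0, hi=5.0)

sphere = lambda s: float(np.sum(s**2))      # toy fitness (sphere function)
best = min(sphere(s) for s in (sx, sy, sz))

# Rotation-gate-style update: nudge the angles toward those of a
# (hypothetical) best food source; delta plays the role of the rotation phase.
delta = 0.05 * np.pi
theta_best = rng.uniform(0, np.pi, dim)
phi_best = rng.uniform(0, 2 * np.pi, dim)
theta = theta + delta * np.sign(theta_best - theta)
phi = phi + delta * np.sign(phi_best - phi)
```

The paper's contribution — making the two rotation phases inversely proportional for equal-area search — would replace the fixed `delta` used here.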

  15. Towards reduction of Paradigm coordination models

    OpenAIRE

    Andova, S.; Groenewegen, L. P. J.; Vink, de, E.P.; Aceto, Luca; Mousavi, M.R.

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at a detailed and a global level of abstraction. To enable automated verification of Paradigm models, a translation of Paradigm into process algebra has been defined in previous work. In this paper we investigate, guided by a client-server example, reduction of Paradigm models based on a notion of global...

  16. Programmed device for coordinate measuring mechanisms

    International Nuclear Information System (INIS)

    The programmed device provides automatic measurement of the coordinates of a continuously travelling carriage, with stepwise control of the pitch and a measurement range adjustable within wide limits. Measured points on the forward and return travels of the carriage match to within 50 μm. The device was developed to automate the measurement of charged-particle tracks by the floating-wire method; it may also be used in various recorders.

  17. Advanced type placement and geonames database: comprehensive coordination plan

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, A.E.; Gough, E.C.; Brown, R.M.; Zied, A.

    1983-01-01

    This paper presents a preliminary comprehensive coordination plan for the technical development and integration of automated names-database capture and management into an overall automated cartographic system for the Defense Mapping Agency. It broadly covers the technical issues associated with system and subsystem functional requirements, inter-subsystem interaction, and common technologies in hardware, software, databases, and artificial intelligence. The three-phase R&D cycles for each subsystem, and for all subsystems together, are also outlined.

  18. Ship block construction plan coordination and optimization based on PERT and CPM

    Institute of Scientific and Technical Information of China (English)

    张光发; 刘玉君; 纪卓尚; 杨海天

    2011-01-01

    The ship block construction process under the modern lean shipbuilding mode and the planning methods used in Chinese shipyards are analyzed and summarized. Combining the program evaluation and review technique (PERT) with the critical path method (CPM), a ship block construction plan is generated automatically and then coordinated and optimized with respect to duration and cost. An applied program implementing the method was developed in VBA on top of MS-Project, and the block construction plan of a tanker built at a Dalian shipyard was scheduled and optimized as a test case. Comparison with actual production shows that the approach is feasible and offers a practical, simple way to draw up block construction plans in the early stage of shipbuilding.
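The PERT/CPM machinery the paper builds on reduces to a forward and a backward pass over the precedence graph: the forward pass gives earliest start times, the backward pass gives latest ones, and zero-slack activities form the critical path. A minimal sketch with invented block-construction activities and durations:

```python
# Activity durations (days) and precedence relations -- invented example.
durations = {"cut": 3, "weld": 5, "outfit": 2, "paint": 4, "erect": 1}
preds = {"cut": [], "weld": ["cut"], "outfit": ["cut"],
         "paint": ["weld", "outfit"], "erect": ["paint"]}

order = ["cut", "weld", "outfit", "paint", "erect"]  # topological order

early = {}
for a in order:                            # forward pass: earliest starts
    early[a] = max((early[p] + durations[p] for p in preds[a]), default=0)

makespan = max(early[a] + durations[a] for a in order)

succs = {a: [b for b in order if a in preds[b]] for a in order}
late = {}
for a in reversed(order):                  # backward pass: latest starts
    late[a] = min((late[s] for s in succs[a]), default=makespan) - durations[a]

critical = [a for a in order if early[a] == late[a]]   # zero-slack activities
print(makespan, critical)
```

Here the makespan is 13 days and "outfit" is the only activity with slack, so the critical path runs cut → weld → paint → erect; shortening or delaying any of those activities moves the delivery date directly.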

  19. A sensor-based automation system for handling nuclear materials

    International Nuclear Information System (INIS)

    An automated system is being developed for handling large payloads of radioactive nuclear materials in an analytical laboratory. The automation system performs unpacking and repacking of payloads from shipping and storage containers and delivers the payloads to the stations in the laboratory. The system uses machine vision and force/torque sensing for sensor-based control, enhancing system safety, flexibility, and robustness and enabling easy remote operation. The automation system also controls the operation of the laboratory measurement systems and their coordination with the robotic system. Particular attention has been given to system design features and analytical methods that provide an enhanced level of operational safety: independent mechanical gripper interlock and tool-release mechanisms were designed to prevent payload mishandling, and an extensive failure modes and effects analysis of the automation system was developed as a safety design analysis tool.

  20. Tools for Coordinating Planning Between Observatories

    Science.gov (United States)

    Jones, J.; Maks, L.; Fishman, M.; Grella, V.; Kerbel, U.; Misra, D.; Pell, V.

    With the realization of NASA's era of great observatories, there are now more than three space-based telescopes operating in different wave bands. This situation provides astronomers with a unique opportunity to simultaneously observe with multiple observatories. Yet scheduling multiple observatories simultaneously is highly inefficient when compared to observations using only a single observatory. Thus, programs using multiple observatories are limited not by scientific restrictions, but by operational inefficiencies. At present, multi-observatory programs are initiated by submitting observing proposals separately to each concerned observatory. To assure that the proposed observations can be scheduled, each observatory's staff has to check that the observations are valid and meet all constraints for their own observatory; in addition, they have to verify that the observations satisfy the constraints of the other observatories. Thus, coordinated observations require painstaking manual collaboration among staffs at each observatory. Due to the lack of automated tools for coordinated observations, this process is time consuming and error-prone, and the outcome of requests is not certain until the very end. To increase multi-observatory operations efficiency, such resource intensive processes need to be re-engineered. To overcome this critical deficiency, Goddard Space Flight Center's Advanced Architectures and Automation Branch is developing a prototype called the Visual Observation Layout Tool (VOLT). The main objective of VOLT is to provide visual tools to help automate the planning of coordinated observations by multiple astronomical observatories, as well as to increase the probability of scheduling all observations.

  1. Optimization Application of the Direct Energy Balance Control Strategy in a Coordinated Control System

    Institute of Scientific and Technical Information of China (English)

    王永涛; 杨保; 张海涛; 乔岩涛

    2015-01-01

    This paper describes the basic principles and characteristics of the direct energy balance (DEB) coordinated control strategy, analyzes the problems encountered under this strategy at China Resources Power Henan Gucheng Co., Ltd., and presents corresponding solutions. These include optimized application of intelligent sliding pressure during load changes, optimized feedforward for the DEB coordinated control system, and optimization of the boiler air-flow control system. After implementation, the unit operated more safely and stably, and the economic efficiency of its operation improved.

  2. Cell-Detection Technique for Automated Patch Clamping

    Science.gov (United States)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

    A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: a standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed.
    A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image

  3. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    The use of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and improved passenger comfort since their introduction in the late 1980s. However, the original benefits of automation, including reduced flight crew workload, fewer human errors, and lower training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew; the price of this flexibility is a need for greater mode awareness and the need to supervise, understand, and predict automated system behavior. Over-reliance on automation is also linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human error are often caused by interactions between humans and automated systems (e.g., breakdowns in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of a baseline model of increased complexity and reliance on automation, named FLAP (FLightdeck Automation Problems). The model development process starts with a comprehensive literature review, followed by construction of a framework of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian belief network (BBN) using the Hugin software v7.8. The effects of automation on the flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand, and training requirements, along with their interactions. Besides flight crew deficiencies, automation system

  4. Reaction Coordinates and Mechanistic Hypothesis Tests.

    Science.gov (United States)

    Peters, Baron

    2016-05-27

    Reaction coordinates are integral to several classic rate theories that can (a) predict kinetic trends across conditions and homologous reactions, (b) extract activation parameters with a clear physical interpretation from experimental rates, and (c) enable efficient calculations of free energy barriers and rates. New trajectory-based rare events methods can provide rates directly from dynamical trajectories without a reaction coordinate. Trajectory-based frameworks can also generate ideal (but abstract) reaction coordinates such as committors and eigenfunctions of the master equation. However, rates and mechanistic insights obtained from trajectory-based methods and abstract coordinates are not readily generalized across simulation conditions or reaction families. We discuss methods for identifying physically meaningful reaction coordinates, including committor analysis, variational transition state theory, Kramers-Langer-Berezhkovskii-Szabo theory, and statistical inference methods that can use path sampling data to screen, mix, and optimize thousands of trial coordinates. Special focus is given to likelihood maximization and inertial likelihood maximization approaches. PMID:27090846
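The likelihood-maximization approach highlighted above can be sketched on synthetic shooting data: fit the parameters of a model committor p_B(r) = (1 + tanh(r))/2 with a trial coordinate r = a0 + a1·q by maximizing the Bernoulli likelihood of the observed outcomes. The data-generating model below is an assumption made purely for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic "shooting" data: trial coordinate value q for each shooting
# point, with outcome True if the trajectory reached product basin B.
# The true committor used to generate the data is an assumed toy model:
# p_B(q) = (1 + tanh(2q - 1)) / 2.
q = rng.normal(0.5, 1.0, 2000)
pB_true = 0.5 * (1.0 + np.tanh(2.0 * q - 1.0))
hit_B = rng.random(q.size) < pB_true

def neg_log_likelihood(params):
    """Negative log-likelihood of the tanh committor model p_B(a0 + a1*q)."""
    a0, a1 = params
    p = 0.5 * (1.0 + np.tanh(a0 + a1 * q))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)      # guard the logarithm
    return -np.sum(np.where(hit_B, np.log(p), np.log(1.0 - p)))

res = minimize(neg_log_likelihood, x0=[0.0, 1.0])
a0, a1 = res.x   # should recover roughly (-1, 2) for this synthetic data
```

In practice one repeats this fit for many trial coordinates and ranks them by maximized likelihood, which is how likelihood maximization screens candidate reaction coordinates.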

  6. Monitoring of coordination abilities in striking combat sports

    Directory of Open Access Journals (Sweden)

    Natalya Boychenko

    2014-12-01

    Full Text Available Purpose: to optimize the monitoring of the level of coordination abilities in martial arts. Material and methods: analysis and compilation of the scientific and methodological literature, interviews with coaches of striking combat sports, video analysis of techniques, and pedagogical observation. Results: specific types of coordination abilities in striking combat sports were identified, and specific and nonspecific tests were selected and proposed to monitor the level of each type of coordination ability in athletes. Conclusion: to achieve victory in a bout, martial artists must orient themselves in space, be able to assess and manage the dynamic and spatio-temporal parameters of their movements, maintain balance, and possess highly coordinated movement. The proposed tests for monitoring the types of coordination abilities allow an objective assessment not only of athletes' overall level of coordination but also of the level of its specific manifestations.

  7. Optimization Analysis of Lattice Store Coordination Based on Revenue Sharing Contract%基于供应链收益共享契约的格子铺经营机制优化分析

    Institute of Scientific and Technical Information of China (English)

    胡本勇; 陈旭

    2012-01-01

    The lattice store business mode (LSBM) has attracted increased attention in theory and practice. Many problems, such as poor management and low occupancy rates of cells, have emerged with its rapid development. Solutions such as increasing the number of sales staff, strengthening brand advertising, and improving the rent collection mode (in effect, a revenue sharing mode) can break the development bottleneck of lattice stores and help the business develop healthily and rapidly. Academic research on LSBM is rare. LSBM mainly involves two important aspects: promotion (advertising) and revenue sharing. Promotion can improve the management and brand awareness of lattice stores, thereby raising customer demand; revenue sharing increases the flexibility of cooperation, which benefits the development of the lattice store. Furthermore, the cooperation between the owner and the renter has supply chain characteristics, so this paper analyzes the influence of a revenue sharing contract on the lattice store from a supply chain perspective. The paper first uses a production function to depict the stimulating effect of brand building and promotion effort on demand, then constructs a supply chain decision-making model. The analysis shows the importance of cooperation and derives the optimal brand building cost, promotional cost, and revenue sharing ratio under centralized and decentralized decision-making, and also discusses the impact of the revenue sharing mechanism on the lattice store. The results show that a simple revenue sharing mechanism cannot by itself realize the lattice store's optimal performance and improved cooperation in LSBM, so the paper discusses why coordination fails and proposes a revised policy. The policy specifies that the shop's rent should correspond to the owner's cost of brand building and promotion. The policy

  8. A Multi-pattern Coordinated Optimization Strategy of Wind Power and Energy Storage System Considering Temporal Dependence

    Institute of Scientific and Technical Information of China (English)

    陆秋瑜; 胡伟; 闵勇; 王芝茗; 罗卫华; 成涛

    2015-01-01

    Based on uncertainty scenarios that consider the temporal dependence of wind power prediction errors, the energy storage system is applied in both an energy-shifting (peak-shaving) pattern and a plan-following pattern, and a multi-pattern coordinated optimization model for the combined wind power and energy storage system is proposed. To handle the large-scale mixed-integer programming problem arising in combined wind-storage operation, the concept of a storage peak-shaving capacity ratio is introduced to construct a bi-level iterative algorithm that solves the model quickly; the ratio is further used to quantitatively evaluate the capacity allocation among patterns. Simulations based on actual data show that the optimal operating pattern of the wind-storage system is influenced by factors including the feed-in tariff, the penalty price, and the stochastic characteristics of wind power, and that the proposed model achieves rational allocation and efficient utilization of the limited storage capacity across scenarios, providing useful information for research on combined wind-storage operation and coordinated dispatch with power grids.

  9. Wind Farm Coordinated Control for Power Optimization

    Institute of Scientific and Technical Information of China (English)

    SHU Jin; HAO Zhiguo; ZHANG Baohui; BO Zhiqian

    2011-01-01

    Total wind energy capture decreases because of the aerodynamic interaction among turbines known as the wake effect, so conventional maximum power point tracking (MPPT) schemes applied to each individual wind turbine generator (WTG) cannot maximize the total farm power.

  10. Embedding Temporal Constraints for Coordinated Execution in Habitat Automation

    Data.gov (United States)

    National Aeronautics and Space Administration — Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be...

  11. Automated support for rapid coordination of joint UUV operation

    OpenAIRE

    Johns, Seneca R.

    2015-01-01

    Approved for public release; distribution is unlimited Recently, marine services company Phoenix International headed the search efforts for Malaysian Airlines flight 370 using its Bluefin-21 autonomous unmanned underwater vehicle (UUV). In total, it conducted 270 hours of in-water time and covered approximately 250 square miles of ocean floor. Deploying multiple UUVs simultaneously would have increased the coverage area substantially within the same time period. Ideally, a coalition of co...

  12. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    Science.gov (United States)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
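Converting temporal constraints to a dispatchable form amounts to computing the minimal (all-pairs shortest path) network of the simple temporal network, after which an executive can respect the tightened local bounds without run-time propagation. A toy sketch with invented event bounds, not the paper's actual procedure constraints:

```python
import numpy as np

INF = float("inf")

# Simple temporal network over events 0..3 (toy procedure steps).
# Edge w[i, j] is an upper bound on (t_j - t_i); a lower bound l on
# (t_j - t_i) is encoded as w[j, i] = -l.
n = 4
w = np.full((n, n), INF)
np.fill_diagonal(w, 0.0)
w[0, 1], w[1, 0] = 10.0, -5.0    # 5 <= t1 - t0 <= 10
w[1, 2], w[2, 1] = 4.0, -2.0     # 2 <= t2 - t1 <= 4
w[0, 3], w[3, 0] = 20.0, 0.0     # 0 <= t3 - t0 <= 20
w[2, 3], w[3, 2] = 8.0, -1.0     # 1 <= t3 - t2 <= 8

# Floyd-Warshall closure gives the minimal network; a negative diagonal
# entry would signal an inconsistent (unschedulable) constraint set.
d = w.copy()
for k in range(n):
    d = np.minimum(d, d[:, [k]] + d[[k], :])

consistent = all(d[i, i] >= 0 for i in range(n))
# The closure tightens the implicit bound on t3 - t0: chaining the lower
# bounds through events 1 and 2 forces t3 - t0 >= 8 (i.e., d[3, 0] = -8).
```

Dispatchers built on such minimal networks can then fix each event time locally, propagating only to immediate neighbors, which is what makes embedded execution practical.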

  13. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. This article describes some principles that can guide teams in automating these tests.

  14. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes.

  15. Work and Programmable Automation.

    Science.gov (United States)

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  16. Library Automation Style Guide.

    Science.gov (United States)

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  17. Automation in immunohematology.

    Science.gov (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process.

  18. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process.

  19. Automation in immunohematology.

    Science.gov (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have introduced semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and the archiving of results, are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  20. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  1. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone, or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibration and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following in the NRU reactor inspection at Chalk River.

  2. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides a historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  3. Coordination control of distributed systems

    CERN Document Server

    Villa, Tiziano

    2015-01-01

    This book describes how control of distributed systems can be advanced by an integration of control, communication, and computation. The global control objectives are met by judicious combinations of local and nonlocal observations taking advantage of various forms of communication exchanges between distributed controllers. Control architectures are considered according to increasing degrees of cooperation of local controllers: fully distributed or decentralized control, control with communication between controllers, coordination control, and multilevel control. The book also covers topics bridging computer science, communication, and control, such as communication for control of networks, average consensus for distributed systems, and modeling and verification of discrete and hybrid systems. Examples and case studies are introduced in the first part of the text and developed throughout the book. They include: control of underwater vehicles, automated-guided vehicles on a container terminal, contro...

  4. Hybrid Architecture for Coordination of AGVs in FMS

    OpenAIRE

    Eduardo G. Hernandez-Martinez; Foyo-Valdes, Sergio A.; Puga-Velazquez, Erika S.; Jesús A. Meda-Campaña

    2014-01-01

    This paper presents a hybrid control architecture that coordinates the motion of groups of automated guided vehicles in flexible manufacturing systems. The high-level control is based on a Petri net model, using the industrial standard ISA-95, obtaining a task-based coordination of equipment and storage considering process restrictions, logical precedences, shared resources and the assignment of robots to move workpieces individually or in subgroups. On the other hand, in the low-level contro...

  5. Energy-Saving Coordinated Optimal Dispatch of Distributed Combined Cool, Heat and Power Supply

    Institute of Scientific and Technical Information of China (English)

    周任军; 冉晓洪; 毛发龙; 付靖茜; 李星朗; 林绿浩

    2012-01-01

    Distributed combined cool, heat and power (DCCHP) is becoming an important direction in energy utilization. To reveal the underlying characteristics of both the production cost and the environment cost of DCCHP, an equivalent performance coefficient of heat supply is introduced for the equivalent conversion among cool, heat and power, and a new objective function for the production cost and environment cost of DCCHP is proposed. According to the mismatch and randomness of the demand for cool, heat and power loads, and considering generation units operated in different production states of cool, heat and electric power, a cost function for the coordination of cool, heat and power is put forward in combination with time-of-use (TOU) prices. On this basis, a multi-objective energy-saving dispatching model is built that includes the production cost, environment cost and coordination cost of cool, heat and electric power. Using the principle of maximum membership, the proposed model is turned into a single-objective optimization problem and solved by quadratic programming. Simulation results show that the proposed model and method for energy-saving optimal dispatch of DCCHP are effective for the efficient utilization of energy sources, economic power dispatch and the reduction of pollutant emissions.
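    The maximum-membership step mentioned in this abstract can be sketched as follows. This is an illustrative toy, not the paper's actual dispatch model: the linear membership functions, the three cost ranges, and the candidate dispatch points are all made up for demonstration.

```python
# Max-membership scalarization of a multi-objective dispatch problem:
# each objective's cost is mapped to a satisfaction degree in [0, 1],
# and a candidate's overall satisfaction is its worst (minimum) membership.

def membership(cost, worst, best):
    """Linear membership: 1 at the best (lowest) cost, 0 at the worst."""
    if worst == best:
        return 1.0
    return max(0.0, min(1.0, (worst - cost) / (worst - best)))

def overall_satisfaction(costs, ranges):
    """Principle of maximum membership: overall satisfaction of a dispatch
    point is the smallest membership across all objectives."""
    return min(membership(c, w, b) for c, (w, b) in zip(costs, ranges))

# Hypothetical (production, environment, coordination) costs for three
# candidate dispatch points.
candidates = {
    "A": (120.0, 40.0, 15.0),
    "B": (100.0, 55.0, 10.0),
    "C": (110.0, 45.0, 12.0),
}
# Hypothetical (worst, best) cost range per objective, e.g. obtained from
# the single-objective optima.
ranges = [(130.0, 95.0), (60.0, 35.0), (20.0, 8.0)]

best = max(candidates, key=lambda k: overall_satisfaction(candidates[k], ranges))
```

    A full implementation would maximize the minimum membership over a continuous feasible set (which is where the quadratic programming in the abstract comes in); the discrete candidates above just show how the scalarization ranks trade-offs.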

  6. Construction of Dynamic Coordinated Optimization Control System for Energy Internet

    Institute of Scientific and Technical Information of China (English)

    孙秋野; 滕菲; 张化光; 马大中

    2015-01-01

    The energy internet is a kind of novel interconnected multi-energy grid, highly coupled with the information internet. To utilize the distributed renewable energies in the energy internet reasonably, a coordinated optimization control strategy is proposed, based on the shared information internet. First, the large-scale use and sharing of distributed renewable energies can be achieved by scheduling and optimizing primary energies. Meanwhile, primary energy consumption can be minimized, energy supply costs reduced, and the reliability of energy supply improved, while the energy internet provides strong support for the power system, the heating system and other main energy grids. A dynamic load demand response team mechanism, based on a dynamic multi-agent system, is then established to eliminate load fluctuations quickly and efficiently, maintaining the stability of the energy internet. Finally, a multi-agent containment control algorithm is applied to voltage control to guarantee high power quality.

  7. Design Optimization of Internal Flow Devices

    DEFF Research Database (Denmark)

    Madsen, Jens Ingemann

    The power of computational fluid dynamics is boosted through the use of automated design optimization methodologies. The thesis considers both derivative-based search optimization and the use of response surface methodologies.

  8. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management.

  9. Decentralized Coordination of Autonomous Vehicles at intersections

    OpenAIRE

    Makarem, Laleh; Gillet, Denis

    2011-01-01

    In this paper, the decentralized coordination of point-mass autonomous vehicles at intersections using navigation functions is considered. As the main contribution, the inertia of the vehicles is taken into account to enable on-board energy optimization for crossing. In this way, heavier vehicles that need more energy and time for acceleration or braking are given an indirect priority at intersections. The proposed decentralized coordination scheme of autonomous vehicles at intersection is com...

  10. Optimization Analysis of Lattice Store Coordination based on Revenue Sharing Contract

    Institute of Scientific and Technical Information of China (English)

    胡本勇; 陈旭

    2012-01-01

    , promotional costs and revenue sharing ratio under centralized and decentralized decision-making systems. Furthermore, this paper discusses the impact of the revenue sharing mechanism on the lattice store. Our analysis results also show that a simple revenue sharing mechanism cannot realize the lattice store's optimal performance, nor can it improve cooperation in LSBM. This paper further discusses the reason that coordination cannot be achieved and puts forward a revised policy, in which the shop's rent should correspond to the owner's cost of brand building and promotion. Further research shows that revenue sharing coordination can be achieved when the rent is adjusted to a certain proportion of the advertising cost. Although some conclusions are obtained under certain assumptions, the modeling approach and quantitative analysis used in the LSBM study, and the conclusions achieved, will help managers of lattice stores make better decisions in practice.

  11. Processing Coordination Ambiguity

    Science.gov (United States)

    Engelhardt, Paul E.; Ferreira, Fernanda

    2010-01-01

    We examined temporarily ambiguous coordination structures such as "put the butter in the bowl and the pan on the towel." Minimal Attachment predicts that the ambiguous noun phrase "the pan" will be interpreted as a noun-phrase coordination structure because it is syntactically simpler than clausal coordination. Constraint-based theories assume…

  12. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  13. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last...

  14. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  15. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicles Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  16. The problem of organization of a coastal coordinating computer center

    Science.gov (United States)

    Dyubkin, I. A.; Lodkin, I. I.

    1974-01-01

    The fundamental principles of the operation of a coastal coordinating and computing center under conditions of automation are presented. Special attention is devoted to the work of Coastal Computer Center of the Arctic and Antarctic Scientific Research Institute. This center generalizes from data collected in expeditions and also from observations made at polar stations.

  17. Autonomous Vehicle Coordination with Wireless Sensor and Actuator Networks

    NARCIS (Netherlands)

    Marin-Perianu, Mihai; Bosch, Stephan; Marin-Perianu, Raluca; Scholten, Hans; Havinga, Paul

    2010-01-01

    A coordinated team of mobile wireless sensor and actuator nodes can bring numerous benefits for various applications in the field of cooperative surveillance, mapping unknown areas, disaster management, automated highway and space exploration. This article explores the idea of mobile nodes using veh

  18. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  19. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  20. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  1. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  2. Cassini Tour Atlas Automated Generation

    Science.gov (United States)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly, and select the optimal series of orbits for science return. The task of manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  3. Movement coordination during conversation.

    Science.gov (United States)

    Latif, Nida; Barbosa, Adriano V; Vatikiotis-Bateson, Eric; Castelhano, Monica S; Munhall, K G

    2014-01-01

    Behavioral coordination and synchrony contribute to a common biological mechanism that maintains communication, cooperation and bonding within many social species, such as primates and birds. Similarly, human language and social systems may also be attuned to coordination to facilitate communication and the formation of relationships. Gross similarities in movement patterns and convergence in the acoustic properties of speech have already been demonstrated between interacting individuals. In the present studies, we investigated how coordinated movements contribute to observers' perception of affiliation (friends vs. strangers) between two conversing individuals. We used novel computational methods to quantify motor coordination and demonstrated that individuals familiar with each other coordinated their movements more frequently. Observers used coordination to judge affiliation between conversing pairs but only when the perceptual stimuli were restricted to head and face regions. These results suggest that observed movement coordination in humans might contribute to perceptual decisions based on availability of information to perceivers. PMID:25119189

  4. Movement coordination during conversation.

    Directory of Open Access Journals (Sweden)

    Nida Latif

    Full Text Available Behavioral coordination and synchrony contribute to a common biological mechanism that maintains communication, cooperation and bonding within many social species, such as primates and birds. Similarly, human language and social systems may also be attuned to coordination to facilitate communication and the formation of relationships. Gross similarities in movement patterns and convergence in the acoustic properties of speech have already been demonstrated between interacting individuals. In the present studies, we investigated how coordinated movements contribute to observers' perception of affiliation (friends vs. strangers) between two conversing individuals. We used novel computational methods to quantify motor coordination and demonstrated that individuals familiar with each other coordinated their movements more frequently. Observers used coordination to judge affiliation between conversing pairs but only when the perceptual stimuli were restricted to head and face regions. These results suggest that observed movement coordination in humans might contribute to perceptual decisions based on availability of information to perceivers.

  5. Adventures in Coordinate Space

    Science.gov (United States)

    Chambers, J. E.

    2003-08-01

    A variety of coordinate systems have been used to study the N-body problem for cases involving a dominant central mass. These include the traditional Keplerian orbital elements and the canonical Delaunay variables, which both incorporate conserved quantities of the two-body problem. Recently, Cartesian coordinate systems have returned to favour with the rise of mixed-variable symplectic integrators, since these coordinates prove to be more efficient than using orbital elements. Three sets of canonical Cartesian coordinates are well known, each with its own advantages and disadvantages. Inertial coordinates (which include barycentric coordinates as a special case) are the simplest and easiest to implement. However, they suffer from the disadvantage that the motion of the central body must be calculated explicitly, leading to relatively large errors in general. Jacobi coordinates overcome this problem by replacing the coordinates and momenta of the central body with those of the system as a whole, so that momentum is conserved exactly. This leads to substantial improvements in accuracy, but has the disadvantage that every object is treated differently, and interactions between each pair of bodies are now expressed in a complicated manner involving the coordinates of many bodies. Canonical heliocentric coordinates (also known as democratic heliocentric coordinates) treat all bodies equally, and conserve the centre of mass motion, but at the cost of introducing momentum cross terms into the kinetic energy. This complicates the development of higher order symplectic integrators and symplectic correctors, as well as the development of methods used to resolve close encounters with the central body. Here I will re-examine the set of possible canonical Cartesian coordinate systems to determine if it is possible to (a) conserve the centre of mass motion, (b) treat all bodies equally, and (c) eliminate the momentum cross terms. I will demonstrate that this is indeed possible.
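    The democratic (canonical) heliocentric change of variables mentioned in this abstract pairs heliocentric positions with barycentric momenta. The sketch below is illustrative only: the masses, positions, and velocities in the usage example are arbitrary, and the function names are my own.

```python
# Democratic heliocentric coordinates: body 0 is the central mass.
# Its new coordinate is the centre of mass and its new momentum the total
# momentum (conserved); every other body gets a heliocentric position and
# a barycentric momentum, so all non-central bodies are treated equally.

def to_democratic_heliocentric(masses, positions, velocities):
    """Return (Q, P) for 3D Cartesian inputs given as lists of [x, y, z]."""
    mtot = sum(masses)
    com = [sum(m * x[i] for m, x in zip(masses, positions)) / mtot
           for i in range(3)]
    ptot = [sum(m * v[i] for m, v in zip(masses, velocities))
            for i in range(3)]
    vcom = [p / mtot for p in ptot]  # centre-of-mass velocity
    Q = [com] + [[x[i] - positions[0][i] for i in range(3)]
                 for x in positions[1:]]
    P = [ptot] + [[m * (v[i] - vcom[i]) for i in range(3)]
                  for m, v in zip(masses[1:], velocities[1:])]
    return Q, P

# Toy two-body system: a unit central mass and a small companion.
masses = [1.0, 1e-3]
positions = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
velocities = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
Q, P = to_democratic_heliocentric(masses, positions, velocities)
```

    Note how the companion's position is heliocentric (relative to body 0) while its momentum is measured in the barycentric frame, which is exactly the mixed pairing that introduces the momentum cross terms discussed above.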

  6. Study on Coordination Optimization Model and System Analysis of Agricultural Machinery Supporting Service

    Institute of Scientific and Technical Information of China (English)

    于晓秋; 冷志杰

    2013-01-01

    Starting from a definition of agricultural machinery supporting services, this paper first systematically analyzes the factors influencing such services. Taking roughly ten years of farm machinery and grain yield data from 51 farms in the Heilongjiang reclamation area as an example, correlation analysis and best-fit regression models show that the absolute value of the correlation coefficient between the machinery-matching ratio and grain yield has been decreasing year by year. Taking roughly ten years of data from the Seven Star farm as a further example, small agricultural implements show the closest correlation with grain yield, while the optimal regression model reveals that implements matched to large and medium tractors have a significant effect on grain yield. Finally, a coordination optimization model of agricultural machinery supporting services is established and analyzed, and the conditions to be satisfied by the decision variables are determined, which is of significance for advancing agricultural machinery supporting services.

  7. Torque coordinating optimal control for dry dual clutch transmission in shifting process

    Institute of Scientific and Technical Information of China (English)

    赵治国; 王琪; 陈海军; 刁威振

    2013-01-01

    A torque coordinating optimal control strategy was developed for the six-speed dry dual-clutch transmission (DDCT) of a self-developed car, applying different measures for different driving intentions in the different phases of the shifting process. A simulation model was set up on the Matlab/Simulink software platform and used to simulate the control strategy for the DDCT shifting process. In the torque phase, quadratic optimal control determines the clutch torque rate; in the inertia phase, engine ignition parameters and fuel supply are adjusted to shorten the phase; in the micro-slipping phase and the demanded-torque switching phase, a control-factor mapping reflecting driving intention is applied. The results show that shift jerk stays within -2 m/s³ and total frictional energy loss within 2 kJ during shifting. The control strategy therefore reflects the driver's intentions and meets shift quality requirements.
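    The "quadratic optimal control" in the torque phase can be sketched as a finite-horizon LQR problem. The scalar model below (torque-transfer error driven by a torque-rate input, with made-up time step and weights) is a hedged stand-in for illustration, not the paper's vehicle model.

```python
# Scalar finite-horizon LQR via backward Riccati recursion for the system
# x_{k+1} = a*x_k + b*u_k with stage cost q*x^2 + r*u^2.
# Here x is a normalised torque-transfer error and u the clutch torque rate.

def lqr_gains(a, b, q, r, n):
    """Return time-varying feedback gains K_k so that u_k = -K_k * x_k."""
    p = q  # terminal cost weight (chosen equal to q for simplicity)
    gains = []
    for _ in range(n):  # iterate backwards from the horizon
        k = (b * p * a) / (r + b * p * b)
        p = q + a * p * (a - b * k)
        gains.append(k)
    return list(reversed(gains))  # chronological order

a, b = 1.0, 0.05   # integrator model with dt = 0.05 s (illustrative)
q, r = 10.0, 1.0   # error vs control-effort weights (illustrative)
K = lqr_gains(a, b, q, r, n=40)

# Closed-loop rollout from an initial error of 1.
x = 1.0
for k in K:
    u = -k * x
    x = a * x + b * u
```

    Early gains sit near the steady-state (infinite-horizon) value, then taper toward the horizon because the terminal weight is small; the rollout drives the error close to zero within the 2-second horizon.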

  8. Automated Deployment of Customer-Premises Equipment

    OpenAIRE

    Åberg, Christian

    2016-01-01

    Zero touch deployment tools perform installation and configuration of computer networking devices without human interaction. Modern zero touch deployment tools generally lack support for routers and are platform dependent. This forces Internet Service Providers to rely on manual node deployment methods when deploying Customer-Premise Equipment which is time consuming and error prone. This thesis investigates how the process of deploying Customer-Premises Equipment can be automated and optimize...

  9. Automated optical sensing system for biochemical assays

    Science.gov (United States)

    Oroszlan, Peter; Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1994-03-01

    In this paper, we present a new system called FOBIA that was developed and optimized with respect to automated operation of repetitive assay cycles with regenerable bioaffinity sensors. The reliability and precision of the new system is demonstrated by an application in a competitive assay for the detection of the triazine herbicide Atrazine. Using one sensor in more than 300 repetitive cycles, a signal precision better than 5% was achieved.

  10. Fast Automated Decoupling at RHIC

    CERN Document Server

    Beebe-Wang, Joanne

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated decoupling application has been developed at RHIC for coupling correction during routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program provides options for automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (phase-locked loop), the high-frequency Schottky system, and the tune meter. It also provides tune and skew quadrupole scans, finds the minimum tune separation, displays real-time results, and interfaces with the RHIC control system. We summarize the capabilities of the decoupling application...
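    The core idea — scan two orthogonal skew-quadrupole family settings and pick the pair that minimizes the measured tune separation — can be sketched as follows. The quadratic tune-separation model below is entirely hypothetical (a real machine would measure it); only the scan-and-minimize structure reflects the abstract.

```python
# Toy global decoupling scan: find the (k1, k2) skew-quad family settings
# that minimise the minimum tune separation |dQ_min|.

import math

def tune_separation(k1, k2):
    """Hypothetical coupling model: |dQ_min| vanishes (up to a residual
    floor) when the two skew-quad families cancel the machine coupling,
    here arbitrarily placed at (k1, k2) = (0.3, -0.2)."""
    return math.hypot(k1 - 0.3, k2 + 0.2) + 0.001

def decouple(scan=(-1.0, 1.0), steps=41):
    """Grid scan over both family strengths; return (dQ_min, k1, k2)."""
    lo, hi = scan
    grid = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    return min((tune_separation(k1, k2), k1, k2)
               for k1 in grid for k2 in grid)

dq, k1, k2 = decouple()
```

    In practice the two scans would be interleaved with live tune measurements (PLL or Schottky) rather than a closed-form model, but the minimization structure is the same.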

  11. Process development for automated solar cell and module production. Task 4: Automated array assembly

    Science.gov (United States)

    Hagerty, J. J.

    1981-01-01

    Progress in the development of automated solar cell and module production is reported. The Unimate robot is programmed for the final 35-cell pattern to be used in the fabrication of the deliverable modules. The mechanical construction of the automated lamination station and final assembly station phases is complete, and the first operational testing is underway. The final controlling program is written and optimized. The glass reinforced concrete (GRC) panels to be used for testing and deliverables are in production. Test routines are grouped together and defined to produce the final control program.

  12. Automated Verification of IGRT-based Patient Positioning

    Science.gov (United States)

    Jiang, Xiaojun; Fox, Tim; Cordova, Scott S; Schreibmann, Eduard

    2016-01-01

    A system for automated quality assurance of a therapist's registration in radiotherapy was designed and tested in clinical practice. The approach complements the clinical software's automated registration in terms of algorithm configuration and performance, and constitutes a practical approach for ensuring safe patient setups. Per our convergence analysis, evolutionary algorithms perform better at finding the global optima of the cost function, with discrepancies from a deterministic optimizer seen sporadically. PMID:26699548

  13. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe the present control system with emphasis on the existing automated facilities and the features of the control system which make them possible. They then discuss the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  14. Coordinated control of HVAC systems

    OpenAIRE

    Federspiel, Clifford C; Lanning, Sasha D; Li, Hulin; David M. Auslander

    2001-01-01

    This paper describes the development of new control logic for starting and stopping energy-intensive equipment in buildings such as staged air-conditioning units. The concept is to use pulse-width modulation (PWM) instead of level-crossing logic. A finite state machine is used to handle the case where a single unit has multiple stages of operation. An optimized coordinator determines the phase of the PWM signals of each unit so that peak demand for power is minimized over each PWM period. Con...
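    The pulse-width-modulation idea above can be illustrated with a toy brute-force coordinator: each unit's ON interval is shifted within the PWM period so that the aggregate demand profile has the lowest possible peak. The unit counts, duty cycles, and powers below are illustrative, not taken from the paper.

```python
from itertools import product

def demand_profile(phases, duties, powers, slots):
    """Aggregate power per time slot when unit i is ON for duties[i] of the
    PWM period starting at slot phases[i] (with wrap-around)."""
    profile = [0.0] * slots
    for ph, d, p in zip(phases, duties, powers):
        for t in range(int(round(d * slots))):
            profile[(ph + t) % slots] += p
    return profile

def coordinate(duties, powers, slots=10):
    """Exhaustive search over phase offsets minimizing peak demand
    (fine for a handful of units; a stand-in for the paper's optimizer)."""
    best = None
    for phases in product(range(slots), repeat=len(duties)):
        peak = max(demand_profile(phases, duties, powers, slots))
        if best is None or peak < best[0]:
            best = (peak, phases)
    return best

# Two 10 kW units at 50% duty: staggered phases halve the peak demand.
peak, phases = coordinate([0.5, 0.5], [10.0, 10.0])
```

    Exhaustive search is exponential in the number of units; the paper's coordinator would need something smarter, but the objective being minimized is the same.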

  15. Coordinating Interactions: The Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    ...on a much more technical level. The Event Coordination Notation (ECNO) allows modelling the behaviour of an application on a high level of abstraction that is closer to the application's domain than to the software realizing it. Still, these models contain all necessary details for actually executing... The global behaviour of the application results from different elements jointly engaging in such events, which is called an interaction. Which events are supposed to be jointly executed and which elements need to join in is defined by so-called coordination diagrams of the ECNO. Together, the models...

  16. TOURISME-TRANSPORT : CAPACITE DE COORDINATION

    OpenAIRE

    Marques, Bruno

    2016-01-01

    Two major findings emerge from the theoretical analysis of the Transport-Tourism link via a game-theory model of capacity coordination. First, the model explains the optimal capacity ratio of Transport and Tourism by the quotient of the tourism-type ratio (the ratio of the length of stay at the destination to the transport duration) divided by the installation-cost ratio (of transport and tourism). The corollary of this first finding gives the second outcome: the optimal transport and Tour...

  18. Enhanced time overcurrent coordination

    Energy Technology Data Exchange (ETDEWEB)

    Enriquez, Arturo Conde; Martinez, Ernesto Vazquez [Universidad Autonoma de Nuevo Leon, Facultad de Ingenieria Mecanica y Electrica, Apdo. Postal 114-F, Ciudad Universitaria, CP 66450 San Nicolas de los Garza, Nuevo Leon (Mexico)

    2006-04-15

    In this paper, we recommend a new coordination system for time overcurrent relays. The purpose of the coordination process is to find a time-element function that allows the back-up relay to operate with a constant back-up time delay for any fault current. In this article, we describe the implementation and coordination results for time overcurrent relays, fuses and reclosers. Experiments were carried out in a laboratory setting using signals from a physical simulator of an electrical power system. (author)
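    The motivation for a constant back-up delay can be seen by evaluating a conventional IEC standard-inverse characteristic for a primary/back-up relay pair: with fixed time-multiplier settings, the coordination margin shrinks as the fault current grows. Pickup currents and TMS values below are illustrative, not from the paper.

```python
def iec_inverse_time(current, pickup, tms, k=0.14, alpha=0.02):
    """IEC 60255 standard-inverse characteristic: t = tms*k / (M**alpha - 1)."""
    m = current / pickup
    if m <= 1.0:
        return float("inf")  # relay does not pick up below its threshold
    return tms * k / (m ** alpha - 1.0)

def backup_margin(current, primary=(100.0, 0.1), backup=(100.0, 0.3)):
    """Time margin between back-up and primary relays at a fault current;
    each relay is a (pickup_current, tms) pair (hypothetical settings)."""
    tp = iec_inverse_time(current, *primary)
    tb = iec_inverse_time(current, *backup)
    return tb - tp

m1 = backup_margin(1000.0)   # moderate fault
m2 = backup_margin(5000.0)   # heavy fault: margin is smaller
```

    With these settings the margin at 5 kA is well below the margin at 1 kA, which is exactly the current-dependence the proposed constant-delay coordination removes.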

  19. Materials Testing and Automation

    Science.gov (United States)

    Cooper, Wayne D.; Zweigoron, Ronald B.

    1980-07-01

    The advent of automation in materials testing has been in large part responsible for recent radical changes in the materials testing field: Tests virtually impossible to perform without a computer have become more straightforward to conduct. In addition, standardized tests may be performed with enhanced efficiency and repeatability. A typical automated system is described in terms of its primary subsystems — an analog station, a digital computer, and a processor interface. The processor interface links the analog functions with the digital computer; it includes data acquisition, command function generation, and test control functions. Features of automated testing are described with emphasis on calculated variable control, control of a variable that is computed by the processor and cannot be read directly from a transducer. Three calculated variable tests are described: a yield surface probe test, a thermomechanical fatigue test, and a constant-stress-intensity range crack-growth test. Future developments are discussed.

  20. Automating the CMS DAQ

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  1. Automated phantom assay system

    International Nuclear Information System (INIS)

    This paper describes an automated phantom assay system developed for assaying phantoms spiked with minute quantities of radionuclides. The system includes a computer-controlled linear-translation table that positions the phantom at exact distances from a spectrometer. A multichannel analyzer (MCA) interfaces with a computer to collect gamma spectral data. Signals transmitted between the controller and MCA synchronize data collection and phantom positioning. Measured data are then stored on disk for subsequent analysis. The automated system allows continuous unattended operation and ensures reproducible results

  2. Automated gas chromatography

    Science.gov (United States)

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  3. The Robo-AO automated intelligent queue system

    CERN Document Server

    Riddle, Reed L; Papadopoulos, Athanasios; Baranec, Christoph; Law, Nicholas M

    2014-01-01

    Robo-AO is the first automated laser adaptive optics instrument. In just its second year of scientific operations, it has completed the largest adaptive optics surveys to date, each comprising thousands of targets. Robo-AO uses a fully automated queue scheduling system that selects targets based on criteria entered on a per observing program or per target basis, and includes the ability to coordinate with US Strategic Command automatically to avoid lasing space assets. This enables Robo-AO to select among thousands of targets at a time, and achieve an average observation rate of approximately 20 targets per hour.

  4. Incremental learning for automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O.; Basilico, Justin Derrick; Davis, Warren Leon,; Dixon, Kevin R.; Jones, Brian S.; Martin, Nathaniel; Wendt, Jeremy Daniel

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, requires models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effects. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and are only secondarily, if at all, concerned with issues such as speed, memory use, or the ability to be incrementally updated. Thus, when new data arrive, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision-making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.

  5. Optimization of Tie-line Hierarchical Schedule Based on Network-wide Coordination

    Institute of Scientific and Technical Information of China (English)

    许丹; 李晓磊; 丁强; 崔晖; 韩彬

    2014-01-01

    The traditional tie-line schedule is based entirely on electricity transactions; it has therefore long been loosely coupled with power grid operation and relatively independent of generation scheduling, which makes tie-line schedules difficult to adjust and limits the global allocation of resources. To address these problems, a hierarchical tie-line scheduling model based on network-wide coordination is proposed, building on the existing tie-line planning process. The upper model obtains the ideal tie-line schedule through economic dispatch constrained by whole-network security. The lower model takes the ideal tie-line schedule as its optimization objective and the electricity transaction contracts as constraints, and solves for the final tie-line schedule. The model achieves automatic preparation and flexible adjustment of tie-line schedules, and provides a practical way to combine the schedule with the grid operating state. The proposed method is applied to the tie-line schedule of the Central China power grid; comparison of the optimized results with the traditional schedule demonstrates the correctness and effectiveness of the proposed approach.
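    A minimal sketch of the lower-level idea, tracking the ideal tie-line schedule while honoring the contracted transaction energy, under heavy simplifying assumptions: a single energy-balance constraint plus box limits stand in for the paper's full contract constraints, and all numbers are invented.

```python
def track_ideal_schedule(ideal, contracted_energy, p_min, p_max):
    """Toy lower-level model: find hourly tie-line flows closest to the
    ideal schedule while delivering the contracted total energy. The
    unconstrained least-squares solution shifts every hour by the same
    amount; bounds are then enforced by iterative clipping."""
    n = len(ideal)
    flows = list(ideal)
    for _ in range(50):  # redistribute until the energy balance closes
        gap = (contracted_energy - sum(flows)) / n
        flows = [min(p_max, max(p_min, f + gap)) for f in flows]
        if abs(sum(flows) - contracted_energy) < 1e-9:
            break
    return flows

# Ideal schedule sums to 420 MWh, but only 400 MWh is contracted:
sched = track_ideal_schedule([100.0, 120.0, 90.0, 110.0], 400.0, 0.0, 150.0)
```

    The deficit is spread uniformly across hours, which is the least-squares answer when no bound binds; the real model would also carry security and contract-execution constraints.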

  6. Coordination failure caused by sunspots

    DEFF Research Database (Denmark)

    Beugnot, Julie; Gürgüç, Zeynep; Øvlisen, Frederik Roose;

    2012-01-01

    In a coordination game with Pareto-ranked equilibria, we study whether a sunspot can lead to either coordination on an inferior equilibrium (mis-coordination) or to out-of-equilibrium behavior (dis-coordination). While much of the literature searches for mechanisms to attain coordination on the e...

  7. Social Postural Coordination

    Science.gov (United States)

    Varlet, Manuel; Marin, Ludovic; Lagarde, Julien; Bardy, Benoit G.

    2011-01-01

    The goal of the current study was to investigate whether a visual coupling between two people can produce spontaneous interpersonal postural coordination and change their intrapersonal postural coordination involved in the control of stance. We examined the front-to-back head displacements of participants and the angular motion of their hip and…

  8. Coordinate measuring machines

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with three exercises of 2 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measuring: 1) Measuring and verification of tolerances on coordinate measuring machines, 2) Traceability...

  9. Automated solvent concentrator

    Science.gov (United States)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for the automated drug identification system (AUDRI), the device increases concentration by a factor of 100. The sample is first filtered, removing particulate contaminants and reducing its water content. The sample is then extracted from the filtered residue by a specific solvent. The concentrator provides input material to the analysis subsystem.

  10. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    ...computer that can switch between predefined settings. Sometimes the computer can be remotely controlled over the internet, so that the home's status can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classics of home automation, additional functionality has appeared...

  11. ELECTROPNEUMATIC AUTOMATION EDUCATIONAL LABORATORY

    OpenAIRE

    Dolgorukov, S. O.; National Aviation University; Roman, B. V.; National Aviation University

    2013-01-01

    The article reflects the current situation in education regarding the difficulties of learning mechatronics. A complex of laboratory test benches on electropneumatic automation is considered as a tool for advancing through technical science. A course of laboratory works, developed to meet the requirement of efficient and reliable acquisition of practical skills, is regarded as the simplest way for students to learn the basics of mechatronics.

  12. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  13. Automated Web Applications Testing

    Directory of Open Access Journals (Sweden)

    Alexandru Dan CĂPRIŢĂ

    2009-01-01

    Unit tests are a vital part of several software development practices and processes such as Test-First Programming, Extreme Programming and Test-Driven Development. This article briefly presents software quality and testing concepts as well as an introduction to an automated unit testing framework for PHP web-based applications.

  14. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  15. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought that occurs to a knowledgeable reader is probably: why this old topic again? What is new to discuss? Yet everyone agrees that automation testing today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the testing of applications developed with various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it, and the change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's views and experience on how the original myths have been transformed into new versions and how those versions are derived; it also offers thoughts on the new generation of myths.

  16. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  17. The research and design of an automated warehouse optimal management system based on RFID

    Institute of Scientific and Technical Information of China (English)

    杨玮; 鄢陈; 张志远; 张成泽; 张亚楠

    2013-01-01

    This paper introduces RFID technology, constructs the basic structure of an automated warehouse management system based on RFID, and analyzes the system's structure and functions. Through a study of a typical RFID reader, a data-acquisition subsystem based on middleware technology is designed, realizing communication between the RFID hardware and the scheduling system software and guiding the design of RFID-based warehouse management systems. The application of RFID technology improves the operational efficiency and accuracy of the automated warehouse and reduces human error.

  18. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  19. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  20. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Background Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of 1 wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up- and down-selected treatments. Conclusion WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.

  1. System automation for a bacterial colony detection and identification instrument via forward scattering

    International Nuclear Information System (INIS)

    A system design and automation of a microbiological instrument that locates bacterial colonies and captures the forward-scattering signatures are presented. The proposed instrument integrates three major components: a colony locator, a forward scatterometer and a motion controller. The colony locator utilizes an off-axis light source to illuminate a Petri dish and an IEEE1394 camera to capture the diffusively scattered light, providing the number of bacterial colonies and two-dimensional coordinate information for each colony with the help of a region-growing segmentation algorithm. The Petri dish is then automatically aligned with each centroid coordinate using a trajectory optimization method, such as the Traveling Salesman Algorithm. The forward scatterometer automatically computes the scattered laser beam from a monochromatic image sensor via quadrant intensity balancing and quantitatively determines the centeredness of the forward-scattering pattern. The final scattering signatures are stored for later analysis to provide rapid identification and classification of the bacterial samples
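    The dish-alignment step orders the colony centroids so the stage travels as little as possible; a nearest-neighbour heuristic is a common cheap stand-in for the Travelling Salesman step. The coordinates below are made up for illustration.

```python
import math

def nearest_neighbour_route(coords, start=0):
    """Greedy nearest-neighbour tour over colony centroids: from the current
    colony, always move the stage to the closest unvisited one."""
    unvisited = set(range(len(coords))) - {start}
    route = [start]
    while unvisited:
        last = coords[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, coords[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Four colony centroids (mm); the outlier at (5, 5) is visited last.
route = nearest_neighbour_route([(0, 0), (5, 5), (1, 0), (1, 1)])
```

    The greedy tour is not guaranteed optimal, but for the dozens of colonies on a typical plate it already removes most of the wasted stage motion.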

  2. Advances in Automation and Robotics, Vol 2

    CERN Document Server

    International conference on Automation and Robotics-ICAR2011

    2012-01-01

    The international conference on Automation and Robotics (ICAR2011) was held during December 12-13, 2011 in Dubai, UAE. The conference was intended to bring together researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, and production and management. This part of the proceedings includes 82 papers contributed by researchers from various countries, such as France, Japan, the USA, Korea and China, in the relevant topic areas covered at ICAR2011. The session topics of these proceedings are signal processing and industrial engineering, production and management, including papers on signal reconstruction, mechanical sensors, real-time systems, control system identification, change detection problems, business process modeling, production planning, scheduling and control, computer-based manufacturing technologies, systems modeling and simulation, fa...

  3. Advanced beamline automation for biological crystallography experiments.

    Science.gov (United States)

    Cork, Carl; O'Neill, James; Taylor, John; Earnest, Thomas

    2006-08-01

    An automated crystal-mounting/alignment system has been developed at Lawrence Berkeley National Laboratory and has been installed on three of the protein-crystallography experimental stations at the Advanced Light Source (ALS); it is currently being implemented at synchrotron crystallography beamlines at CHESS, NSLS and the APS. The benefits to using an automounter system include (i) optimization of the use of synchrotron beam time, (ii) facilitation of advanced data-collection techniques, (iii) collection of higher quality data, (iv) reduction of the risk to crystals and (v) exploration of systematic studies of experimental protocols. Developments on the next-generation automounter with improvements in robustness, automated alignment and sample tracking are under way, with an end-to-end data-flow process being developed to allow remote data collection and monitoring. PMID:16855300

  4. Advances in Automation and Robotics, Vol 1

    CERN Document Server

    International conference on Automation and Robotics ICAR2011

    2012-01-01

    The international conference on Automation and Robotics (ICAR2011) was held during December 12-13, 2011 in Dubai, UAE. The proceedings of ICAR2011 have been published in Springer's Lecture Notes in Electrical Engineering and include 163 excellent papers selected from more than 400 submissions. The conference was intended to bring together researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, and production and management. This part of the proceedings includes 81 papers contributed by researchers from various countries, such as France, Japan, the USA, Korea and China, in the relevant topic areas covered at ICAR2011. Many papers present recent advanced research; some offer new solutions to problems in the field, with strong evidence and detailed demonstrations. Others describe the application of their designed and...

  5. Uranyl ion coordination

    Science.gov (United States)

    Evans, H.T.

    1963-01-01

    A review of the known crystal structures containing the uranyl ion shows that plane-pentagon coordination is equally as prevalent as plane-square or plane-hexagon. It is suggested that puckered-hexagon configurations of OH - or H2O about the uranyl group will tend to revert to plane-pentagon coordination. The concept of pentagonal coordination is invoked for possible explanations of the complex crystallography of the natural uranyl hydroxides and the unusual behavior of polynuclear ions in hydrolyzed uranyl solutions.

  6. Automation in photogrammetry: Recent developments and applications (1972-1976)

    Science.gov (United States)

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

    An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system. ?? 1976.

  7. Case Studies on an Approach to Multiple Autonomous Vehicle Motion Coordination

    Institute of Scientific and Technical Information of China (English)

    D.K. Liu; X. Wu; G. Paul; G. Dissanayake

    2006-01-01

    This paper conducts a series of case studies on a novel Simultaneous Path and Motion Planning (SiPaMoP) approach [1] to multiple autonomous or Automated Guided Vehicle (AGV) motion coordination in bidirectional networks. The SiPaMoP approach plans collision-free paths for vehicles based on the principle of shortest path by dynamically changing the vehicles' paths, traveling speeds or waiting times, whichever gives the shortest traveling time. It integrates path planning, collision avoidance and motion planning into a comprehensive model and optimizes the vehicles' paths and motion to minimize the completion time of a set of tasks. Five case studies, i.e., head-on collision avoidance, catching-up collision avoidance, buffer node generation and collision avoidance, priority-based motion coordination, and safety-distance-based planning, are presented. The results demonstrate that the method can effectively plan the paths and motion for a team of autonomous vehicles or AGVs, and solve the problems of traffic congestion and collision under various conditions.
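
    The core idea of planning shortest paths and then resolving node conflicts by adjusting waiting times can be sketched as follows (a minimal illustration, not the SiPaMoP implementation; the toy graph, unit travel times, and priority-ordered wait insertion are simplifying assumptions):

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra shortest path; graph maps node -> {neighbor: travel_time}."""
    dist, prev = {start: 0}, {}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

def schedule(path, start_time=0):
    """Arrival time at each node, assuming unit travel time per edge."""
    return {node: start_time + i for i, node in enumerate(path)}

def coordinate(paths):
    """Resolve node conflicts by priority: each later (lower-priority)
    vehicle waits at its start node until it shares no node at the
    same time with any earlier vehicle."""
    schedules, waits = [], []
    for path in paths:
        wait = 0
        while any(other.get(node) == t
                  for other in schedules
                  for node, t in schedule(path, wait).items()):
            wait += 1
        schedules.append(schedule(path, wait))
        waits.append(wait)
    return waits

# demo: two AGVs whose shortest paths both cross node "b" at t = 1
graph = {"a": {"b": 1}, "b": {"a": 1, "c": 1, "d": 1, "e": 1},
         "c": {"b": 1}, "d": {"b": 1}, "e": {"b": 1}}
paths = [shortest_path(graph, "a", "c"), shortest_path(graph, "d", "e")]
waits = coordinate(paths)   # the second vehicle waits one time unit
```

    Here the second vehicle is delayed because both shortest paths pass through node "b" at the same time; SiPaMoP additionally considers re-routing and speed changes, choosing whichever yields the shortest traveling time.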

  8. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-01

    Retrieval-based face annotation is a promising paradigm of mining massive web facial images for automated face annotation. This paper addresses a critical problem of such a paradigm, i.e., how to effectively perform annotation by exploiting similar facial images and their weak labels, which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintain comparable results but significantly reduce the time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.
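
    The local coordinate coding step can be illustrated with a small sketch in the style of locality-constrained linear coding (an assumption; this is not the WLRLCC optimization, which additionally includes the weak-label regularization): a feature is encoded by its k nearest anchor points with weights that sum to one.

```python
import numpy as np

def local_coordinate_code(x, anchors, k=3, reg=1e-6):
    """LLC-style local coordinate coding: represent x by its k nearest
    anchors with weights minimizing ||x - w.B||^2 subject to sum(w) = 1."""
    d2 = np.sum((anchors - x) ** 2, axis=1)
    idx = np.argsort(d2)[:k]             # k nearest anchor points
    B = anchors[idx] - x                 # anchors shifted to the sample
    C = B @ B.T + reg * np.eye(k)        # local covariance, regularized
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                         # enforce the sum-to-one constraint
    code = np.zeros(len(anchors))
    code[idx] = w
    return code

# demo: a sample inside the triangle of its three nearest anchors is
# reconstructed (almost) exactly; the distant anchor gets zero weight
anchors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
x = np.array([0.2, 0.2])
code = local_coordinate_code(x, anchors)
```

    The resulting sparse code is local (only nearby anchors participate), which is the property WLRLCC exploits when learning features from similar retrieved faces.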

  9. Rapid automated nuclear chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, R.A.

    1979-05-31

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC.

  10. Automated theorem proving.

    Science.gov (United States)

    Plaisted, David A

    2014-03-01

    Automated theorem proving is the use of computers to prove or disprove mathematical or logical statements. Such statements can express properties of hardware or software systems, or facts about the world that are relevant for applications such as natural language processing and planning. A brief introduction to propositional and first-order logic is given, along with some of the main methods of automated theorem proving in these logics. These methods of theorem proving include resolution, Davis and Putnam-style approaches, and others. Methods for handling the equality axioms are also presented. Methods of theorem proving in propositional logic are presented first, and then methods for first-order logic. WIREs Cogn Sci 2014, 5:115-128. doi: 10.1002/wcs.1269 CONFLICT OF INTEREST: The author has declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. PMID:26304304
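
    Resolution, the first method mentioned, can be demonstrated for propositional logic in a few lines (an illustrative sketch; literals are encoded as signed integers, and a statement is proved by refuting its negation):

```python
from itertools import combinations

def resolve(c1, c2):
    """All binary resolvents of two clauses (literals are ints; -x is not-x)."""
    out = []
    for lit in c1:
        if -lit in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return out

def resolution_unsat(clauses):
    """Saturate the clause set under binary resolution; returns True iff
    the set is unsatisfiable (the empty clause is derived). Terminates
    because only finitely many clauses exist over a finite literal set."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True        # derived the empty clause
                new.add(r)
        if new <= clauses:
            return False               # saturated without contradiction
        clauses |= new
```

    For example, modus ponens: from p and p → q, infer q. The negated goal gives the clause set {p}, {¬p ∨ q}, {¬q}, which resolves to the empty clause, so q follows.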

  11. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  12. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  13. Supercritical Airfoil Coordinates

    Data.gov (United States)

    National Aeronautics and Space Administration — Rectangular Supercritical Wing (Ricketts) - design and measured locations are provided in an Excel file RSW_airfoil_coordinates_ricketts.xls. One sheet is with Non...

  14. Dimensions of Organizational Coordination

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Aldewereld, Huib; Dignum, Virginia

    2013-01-01

    be supported to include organizational objectives and constraints into their reasoning processes by considering two alternatives: agent reasoning and middleware regulation. We show how agents can use an organizational specification to achieve organizational objectives by delegating and coordinating...

  15. Understanding social motor coordination.

    Science.gov (United States)

    Schmidt, R C; Fitzpatrick, Paula; Caron, Robert; Mergeche, Joanna

    2011-10-01

    Recently there has been much interest in social coordination of motor movements, or as it is referred to by some researchers, joint action. This paper reviews the cognitive perspective's common coding/mirror neuron theory of joint action, describes some of its limitations and then presents the behavioral dynamics perspective as an alternative way of understanding social motor coordination. In particular, behavioral dynamics' ability to explain the temporal coordination of interacting individuals is detailed. Two experiments are then described that demonstrate how dynamical processes of synchronization are apparent in the coordination underlying everyday joint actions such as martial art exercises, hand-clapping games, and conversations. The import of this evidence is that emergent dynamic patterns such as synchronization are the behavioral order that any neural substrate supporting joint action (e.g., mirror systems) would have to sustain.
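
    The synchronization dynamics referred to can be illustrated with the standard two-oscillator Kuramoto model (an illustrative stand-in for the behavioral-dynamics models, not the paper's experiments): two coupled phase oscillators lock to a constant phase difference when the coupling K exceeds half their frequency detuning.

```python
import math

def kuramoto_pair(w1, w2, K, dt=0.01, steps=5000):
    """Euler-integrate two coupled Kuramoto phase oscillators,
    d(theta_i)/dt = w_i + K*sin(theta_j - theta_i), and return the
    final phase difference wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 2.0
    for _ in range(steps):
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += dt * d1
        th2 += dt * d2
    return (th1 - th2 + math.pi) % (2 * math.pi) - math.pi

# with |w1 - w2| < 2K the pair phase-locks; here detuning 0.2 < 2K = 1.0
diff = kuramoto_pair(1.0, 1.2, 0.5)
```

    In the locked regime the steady phase difference satisfies sin(Δ) = (ω1 − ω2)/(2K), a compact analogue of the stable relative phases observed in interpersonal coordination.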

  16. The curvature coordinate system

    DEFF Research Database (Denmark)

    Almegaard, Henrik

    2007-01-01

    The paper describes a concept for a curvature coordinate system on regular curved surfaces from which faceted surfaces with plane quadrangular facets can be designed. The lines of curvature are used as parametric lines for the curvature coordinate system on the surface. A new conjugate set of lines, called middle curvature lines, is introduced. These lines define the curvature coordinate system. Using the curvature coordinate system, the surface can be conformally mapped on the plane. In this mapping, elliptic sections are mapped as circles, and hyperbolic sections are mapped as equilateral hyperbolas. This means that when a plane orthogonal system of curves for which the vertices in a mesh always lie on a circle is mapped on a surface with positive Gaussian curvature using inverse mapping, and the mapped vertices are connected by straight lines, this network will form a faceted surface...

  17. The Automated Medical Office

    OpenAIRE

    Petreman, Mel

    1990-01-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation forces physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency.

  18. Coordinating Work with Groupware

    DEFF Research Database (Denmark)

    Pors, Jens Kaaber; Simonsen, Jesper

    2003-01-01

    One important goal of employing groupware is to make possible complex collaboration between geographically distributed groups. This requires a dual transformation of both technology and work practice. The challenge is to reduce the complexity of the coordination work by successfully inte.... Using the CSCW framework of coordination mechanisms, we have elicited six general factors influencing the integration of the groupware application in two situations.

  19. Attribute coordination in organizations

    OpenAIRE

    Yingyi Qian; Gerard Roland; Chenggang Xu

    2001-01-01

    We study coordination in organizations with a variety of organizational forms. Coordination in organization is modeled as the adjustment of attributes and capacities of tasks when facing external shocks. An M-form (U-form) organization groups complementary (substitutable) tasks together in one unit. In the presence of only attribute shocks, particularly when gains from specialization are small, communication is poor, or shocks are more likely, the expected payoff of the decentralized M-form i...

  20. Coordination under the Shadow of Career Concerns

    DEFF Research Database (Denmark)

    Koch, Alexander; Morgenstern, Albrecht

    To innovate, firms require their employees to develop novel ideas and to coordinate with each other to turn these ideas into products, services or business strategies. Because the quality of implemented designs that employees are associated with affects their labor market opportunities, career concerns arise that can both be ‘good’ (enhancing incentives for effort in developing ideas) and ‘bad’ (preventing voluntary coordination). Depending on the strength of career concerns, either group-based incentives or team production are optimal. This finding provides a possible link between the increased...

  1. Continuous parallel coordinates.

    Science.gov (United States)

    Heinrich, Julian; Weiskopf, Daniel

    2009-01-01

    Typical scientific data is represented on a grid with appropriate interpolation or approximation schemes,defined on a continuous domain. The visualization of such data in parallel coordinates may reveal patterns latently contained in the data and thus can improve the understanding of multidimensional relations. In this paper, we adopt the concept of continuous scatterplots for the visualization of spatially continuous input data to derive a density model for parallel coordinates. Based on the point-line duality between scatterplots and parallel coordinates, we propose a mathematical model that maps density from a continuous scatterplot to parallel coordinates and present different algorithms for both numerical and analytical computation of the resulting density field. In addition, we show how the 2-D model can be used to successively construct continuous parallel coordinates with an arbitrary number of dimensions. Since continuous parallel coordinates interpolate data values within grid cells, a scalable and dense visualization is achieved, which will be demonstrated for typical multi-variate scientific data.
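
    The point-line duality underlying the density model can be verified numerically: data points lying on the line v = m·u + c map to polyline segments that all intersect in a single dual point at (1/(1−m), c/(1−m)). The sketch below uses a two-axis layout at x = 0 and x = 1 and assumes m ≠ 1 (for m = 1 the dual point lies at infinity):

```python
def to_parallel_segment(u, v):
    """Data point (u, v) drawn between vertical axes at x = 0 and x = 1."""
    return (0.0, u), (1.0, v)

def segment_intersection(s1, s2):
    """Intersection point of the supporting lines of two segments."""
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    m1 = (y2 - y1) / (x2 - x1)
    m2 = (y4 - y3) / (x4 - x3)
    x = (y3 - y1 + m1 * x1 - m2 * x3) / (m1 - m2)
    return x, y1 + m1 * (x - x1)

# two data points on the line v = m*u + c; their parallel-coordinate
# segments meet in the dual point of that line
m, c = 2.0, 1.0
seg_a = to_parallel_segment(0.0, m * 0.0 + c)
seg_b = to_parallel_segment(1.0, m * 1.0 + c)
dual = segment_intersection(seg_a, seg_b)
```

    Continuous parallel coordinates extend exactly this pointwise picture to densities defined over the whole domain.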

  3. Magnetic Coordinate Systems

    Science.gov (United States)

    Laundal, K. M.; Richmond, A. D.

    2016-07-01

    Geospace phenomena such as the aurora, plasma motion, ionospheric currents and associated magnetic field disturbances are highly organized by Earth's main magnetic field. This is due to the fact that the charged particles that comprise space plasma can move almost freely along magnetic field lines, but not across them. For this reason it is sensible to present such phenomena relative to Earth's magnetic field. A large variety of magnetic coordinate systems exist, designed for different purposes and regions, ranging from the magnetopause to the ionosphere. In this paper we review the most common magnetic coordinate systems and describe how they are defined, where they are used, and how to convert between them. The definitions are presented based on the spherical harmonic expansion coefficients of the International Geomagnetic Reference Field (IGRF) and, in some of the coordinate systems, the position of the Sun which we show how to calculate from the time and date. The most detailed coordinate systems take the full IGRF into account and define magnetic latitude and longitude such that they are constant along field lines. These coordinate systems, which are useful at ionospheric altitudes, are non-orthogonal. We show how to handle vectors and vector calculus in such coordinates, and discuss how systematic errors may appear if this is not done correctly.
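
    As a concrete example of the simplest such system, the centered-dipole pole can be computed directly from the degree-1 spherical harmonic coefficients (a sketch using the IGRF-12 epoch-2015.0 values; the sign convention, with the north pole antipodal to the dipole moment, is the usual one but should be checked against the review for other uses):

```python
import math

# Degree-1 IGRF-12 coefficients for epoch 2015.0, in nT
# (illustrative inputs; any IGRF epoch can be substituted)
g10, g11, h11 = -29442.0, -1501.0, 4797.1

def centered_dipole_pole(g10, g11, h11):
    """Geocentric latitude and longitude (degrees) of the north
    centered-dipole pole implied by the degree-1 coefficients."""
    b0 = math.sqrt(g10 ** 2 + g11 ** 2 + h11 ** 2)   # dipole strength
    colat = math.acos(-g10 / b0)                     # pole colatitude
    lon = math.degrees(math.atan2(-h11, -g11)) % 360.0
    return 90.0 - math.degrees(colat), lon

lat, lon = centered_dipole_pole(g10, g11, h11)
```

    This reproduces the familiar 2015 centered-dipole north pole near 80.3°N, 287.4°E; the higher-degree terms of the full IGRF, as the paper discusses, shift the pole and bend the field lines that the more detailed, non-orthogonal coordinate systems follow.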

  4. An automated approach to magnetic divertor configuration design

    OpenAIRE

    Blommaert, Maarten; Dekeyser, Wouter; Baelmans, Martine; Gauger, Nicolas Ralph; Reiter, Detlev

    2015-01-01

    Automated methods based on optimization can greatly assist computational engineering design in many areas. In this paper an optimization approach to the magnetic design of a nuclear fusion reactor divertor is proposed and applied to a tokamak edge magnetic configuration in a first feasibility study. The approach is based on reduced models for magnetic field and plasma edge, which are integrated with a grid generator into one sensitivity code. The design objective chosen here for demonstrat...

  5. Bloch Coordinates-Based Quantum Particle Swarm Optimization Algorithm and Its Application

    Institute of Scientific and Technical Information of China (English)

    李盼池; 王海英

    2012-01-01

    To improve the efficiency of particle swarm optimization (PSO), a quantum particle swarm optimization algorithm encoded with Bloch spherical coordinates is proposed on the basis of analyzing the search process of the PSO algorithm. In the proposed algorithm, particles are encoded by qubits described on the Bloch sphere; each particle occupies three locations of the search space, and each location represents an optimization solution. By employing the search method of general PSO to adjust the two parameters of a qubit, rotations of the qubit are performed on the Bloch sphere, which simultaneously updates the three locations occupied by the qubit and quickly approaches the global optimal solution. Experimental results on standard test function extreme-value optimization and fuzzy controller parameter optimization show that the proposed algorithm is superior to similar algorithms in both optimization ability and optimization efficiency.
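
    A minimal sketch of the encoding described above (illustrative parameters and function names; the qubit "rotation" is realized here simply as PSO-style updates of the two angles, and each particle is scored by the best of its three decoded solutions):

```python
import math, random

def bloch_coords(theta, phi):
    """The three Bloch-sphere coordinates of a qubit with angles (theta, phi)."""
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def decode(c, lo, hi):
    """Map a Bloch coordinate in [-1, 1] onto the search interval [lo, hi]."""
    return lo + (c + 1.0) * (hi - lo) / 2.0

def solutions(angles, lo, hi):
    """One qubit per dimension -> three candidate solutions per particle."""
    per_dim = [bloch_coords(t, p) for t, p in angles]
    return [[decode(d[i], lo, hi) for d in per_dim] for i in range(3)]

def qpso_minimize(f, dim=2, lo=-5.0, hi=5.0, n=12, iters=200, seed=7):
    """PSO velocity updates applied to the qubit angles (a simplified sketch)."""
    rng = random.Random(seed)
    pos = [[[rng.uniform(0, math.pi), rng.uniform(0, 2 * math.pi)]
            for _ in range(dim)] for _ in range(n)]
    vel = [[[0.0, 0.0] for _ in range(dim)] for _ in range(n)]
    def fitness(p):
        return min(f(x) for x in solutions(p, lo, hi))
    pbest = [[q[:] for q in p] for p in pos]
    pval = [fitness(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = [q[:] for q in pbest[g]], pval[g]
    init_val = gval
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                for k in range(2):          # k = 0: theta, k = 1: phi
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d][k] = (0.7 * vel[i][d][k]
                                    + 1.5 * r1 * (pbest[i][d][k] - pos[i][d][k])
                                    + 1.5 * r2 * (gbest[d][k] - pos[i][d][k]))
                    pos[i][d][k] += vel[i][d][k]
            v = fitness(pos[i])
            if v < pval[i]:
                pval[i], pbest[i] = v, [q[:] for q in pos[i]]
                if v < gval:
                    gval, gbest = v, [q[:] for q in pos[i]]
    return init_val, gval

# demo: minimize the sphere function
init_val, best_val = qpso_minimize(lambda x: sum(c * c for c in x))
```

    Because every qubit carries three coordinates, each particle evaluates three candidate solutions per iteration, which is the source of the efficiency gain the abstract claims.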

  6. Optimal Real-time Dispatch for Integrated Energy Systems: An Ontology-Based Multi-Agent Approach

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Rahimi-Kian, Ashkan; Mirian, Maryam S.;

    2016-01-01

    With the emergence of small-scale integrated energy systems (IESs), there are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level distributed energy resources (DERs). By integrating DSM and DERs into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DERs management. The optimal dispatch problem is formulated as a mixed integer nonlinear programing problem (MINLP...

  7. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  8. Fully Automated Molecular Biology Routines on a Plasmid-Based Functional Proteomic Workcell: Evaluation and Characterization of Yeast Strains Optimized for Growth on Xylose and Engineered to Express an Insecticidal Peptide

    Science.gov (United States)

    Optimization of genes important to production of fuel ethanol from hemicellulosic biomass for use in engineering improved commercial yeast strains is necessary to meet the United States' rapidly expanding need for ethanol. United States Department of Agriculture, Agricultural Research Service, Nati...

  9. Automated carbon dioxide cleaning system

    Science.gov (United States)

    Hoppe, David T.

    1991-01-01

    Solidified CO2 pellets are an effective blast media for the cleaning of a variety of materials. CO2 is obtained from the waste gas streams generated from other manufacturing processes and therefore does not contribute to the greenhouse effect, depletion of the ozone layer, or the environmental burden of hazardous waste disposal. The system is capable of removing as much as 90 percent of the contamination from a surface in one pass or to a high cleanliness level after multiple passes. Although the system is packaged and designed for manual hand held cleaning processes, the nozzle can easily be attached to the end effector of a robot for automated cleaning of predefined and known geometries. Specific tailoring of cleaning parameters are required to optimize the process for each individual geometry. Using optimum cleaning parameters the CO2 systems were shown to be capable of cleaning to molecular levels below 0.7 mg/sq ft. The systems were effective for removing a variety of contaminants such as lubricating oils, cutting oils, grease, alcohol residue, biological films, and silicone. The system was effective on steel, aluminum, and carbon phenolic substrates.

  10. Optimizing elliptic curve scalar multiplication for small scalars

    Science.gov (United States)

    Giorgi, Pascal; Imbert, Laurent; Izard, Thomas

    2009-08-01

    On an elliptic curve, the multiplication of a point P by a scalar k is defined by a series of operations over the field of definition of the curve E, usually a finite field Fq. The computational cost of [k]P = P + P + ... + P (k times) is therefore expressed as the number of field operations (additions, multiplications, inversions). Scalar multiplication is usually computed using variants of the binary algorithm (double-and-add, NAF, wNAF, etc.). If s is a small integer, optimized formulas for [s]P can be used within an s-ary algorithm or with double-base methods with bases 2 and s. Optimized formulas exist for very small scalars, but the rapidly growing number of field operations makes finding them a very difficult task when s > 5. We present a generic method to automate transformations of formulas for elliptic curves over prime fields in various systems of coordinates. Our method uses a directed acyclic graph structure to find possible common subexpressions appearing in the formula and applies several arithmetic transformations. It produces efficient formulas to compute [s]P for a large set of small scalars s. In particular, we present a faster formula for [5]P in Jacobian coordinates. Moreover, our program can produce code for various mathematical software (Magma) and libraries (PACE).
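
    For context, [k]P is usually computed with the double-and-add algorithm into which the optimized [s]P formulas plug. A self-contained affine-coordinate sketch over a toy prime field (the curve parameters are illustrative, not the Jacobian-coordinate formulas the paper derives):

```python
# Toy curve y^2 = x^3 + A*x + B over F_p, affine coordinates
# (illustrative parameters, not a cryptographic curve)
P_MOD, A, B = 97, 2, 3

def ec_add(P, Q):
    """Add two affine points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                          # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return x3, (lam * (x1 - x3) - y1) % P_MOD

def scalar_mult(k, P):
    """Left-to-right double-and-add computation of [k]P."""
    R = None
    for bit in bin(k)[2:]:
        R = ec_add(R, R)                     # double
        if bit == "1":
            R = ec_add(R, P)                 # add
    return R

G = (0, 10)                                  # on the curve: 10^2 = 3 (mod 97)
Q5 = scalar_mult(5, G)
```

    Each field inversion here (the `pow(..., -1, P_MOD)` calls) is exactly the kind of expensive operation that projective or Jacobian coordinates, and dedicated [s]P formulas, are designed to avoid.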

  11. Automated Assessment, Face to Face

    OpenAIRE

    Rizik M. H. Al-Sayyed; Amjad Hudaib; Muhannad AL-Shboul; Yousef Majdalawi; Mohammed Bataineh

    2010-01-01

    This research paper evaluates the usability of automated exams and compares them with traditional paper-and-pencil ones. It presents the results of a detailed study conducted at The University of Jordan (UoJ) that comprised students from 15 faculties. A set of 613 students were asked about their opinions concerning automated exams, and their opinions were analyzed in depth. The results indicate that most students reported that they are satisfied with using automated exams but they have sugg...

  12. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades automation systems have become more common and are used today in everything from big industrial solutions to the homes of private customers. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  13. Test Automation of Online Games

    OpenAIRE

    Schoenfeldt, Alexander

    2015-01-01

    State-of-the-art browser games are increasingly complex pieces of software with extensive code bases. With increasing complexity, software becomes harder to maintain. Automated regression testing can simplify these maintenance processes and thereby enable developers as well as testers to spend their effort more efficiently. This thesis addresses the utilization of automated tests in web applications. As a use case, test automation is applied to an online-based strategy game for the bro...

  14. SafetyNet: Streamlining and Automating QA in radiotherapy.

    Science.gov (United States)

    Hadley, Scott W; Kessler, Marc L; Litzenberg, Dale W; Lee, Choonik; Irrer, Jim; Chen, Xiaoping; Acosta, Eduardo; Weyburne, Grant; Keranen, Wayne; Lam, Kwok; Covington, Elizabeth; Younge, Kelly C; Matuszak, Martha M; Moran, Jean M

    2016-01-01

    Proper quality assurance (QA) of the radiotherapy process can be time-consuming and expensive. Many QA efforts, such as data export and import, are inefficient when done by humans. Additionally, humans can be unreliable, lose attention, and fail to complete critical steps that are required for smooth operations. In our group we have sought to break down the QA tasks into separate steps and to automate those steps that are better done by software running autonomously or at the instigation of a human. A team of medical physicists and software engineers worked together to identify opportunities to streamline and automate QA. Development efforts follow a formal cycle of writing software requirements, developing software, testing and commissioning. The clinical release process is separated into clinical evaluation testing, training, and finally clinical release. We have improved six processes related to QA and safety. Steps that were previously performed by humans have been automated or streamlined to increase first-time quality, reduce time spent by humans doing low-level tasks, and expedite QA tests. Many of the gains came from automating data transfer, implementing computer-based checking, and automating systems with an event-driven framework. These coordinated efforts by software engineers and clinical physicists have resulted in speed improvements in expediting patient-sensitive QA tests. PMID:26894365

  15. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    will respond to this form of automation for CPP. (4) Evaluate what type of DR shifting and shedding strategies can be automated. (5) Explore how automation of control strategies can increase participation rates and DR saving levels with CPP. (6) Identify optimal demand response control strategies. (7) Determine occupant and tenant response.

  16. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    This book proposes a novel design method that combines both genetic programming (GP) to automatically explore the open-ended design space and bond graphs (BG) to unify design representations of multi-domain Mechatronic systems. Results show that the method, formally called the GPBG method, can successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  17. The automated medical office.

    Science.gov (United States)

    Petreman, M

    1990-08-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation forces physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency.

  18. AUTOMATED API TESTING APPROACH

    Directory of Open Access Journals (Sweden)

    SUNIL L. BANGARE

    2012-02-01

    Full Text Available Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. With the help of software testing we can verify or validate the software product. Normally testing is done after the development of the software, but it can also be performed during the development process. This paper gives a brief introduction to an Automated API Testing Tool. This type of testing can remove much of the burden that follows the full development of software, saving both time and money. Such testing is also helpful in industry and in colleges.

  19. World-wide distribution automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems.

  20. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    Language has been defined as a social coordination device (Clark 1996) enabling innovative modalities of joint action. However, the exact coordinative dynamics over time and their effects are still insufficiently investigated and quantified. Relying on the data produced in a collective decision-making task, we employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  1. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component dampening and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  2. Introduction to Coordination Chemistry

    CERN Document Server

    Lawrance, Geoffrey Alan

    2010-01-01

    Introduction to Coordination Chemistry examines and explains how metals and molecules that bind as ligands interact, and the consequences of this assembly process. This book describes the chemical and physical properties and behavior of the complex assemblies that form, and applications that may arise as a result of these properties. Coordination complexes are an important but often hidden part of our world, even part of us, and what they do is probed in this book. This book distills the essence of this topic for undergraduate students and for research scientists.

  3. Optimized Passive Target Tracking and Coordinated Control by Multi-UAV Cooperation

    Institute of Scientific and Technical Information of China (English)

    Li Feifei; Li Chao; Zhou Rui

    2014-01-01

    To improve the observability and locating precision of targets under electronic silence and tactical invisibility, optimal passive target tracking by multiple Unmanned Aerial Vehicles (UAVs) using bearings-only passive measurements and Receding Horizon Optimization (RHO) was studied. With respect to optimal tracking by dual cooperative UAVs, the spatial layout of the UAVs has a great effect on the quality of passive target locating and tracking. To realize optimal target tracking, the RHO was used to obtain the optimal control solution, and the headings of the UAVs were adjusted in real time. Simulation results demonstrate the effectiveness of this method.

  4. Automation of a Versatile Crane (the LSMS) for Lunar Outpost Construction, Maintenance and Inspection

    Science.gov (United States)

    Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.

    2009-01-01

    , greatly expanding the operational versatility of the LSMS. This paper develops the equations describing the forward and inverse relation between LSMS joint angles and Cartesian coordinates of the LSMS tip. These equations allow a variety of schemes to be used to maneuver the LSMS to optimize the maneuver. One such scheme will be described in detail that eliminates undesirable swinging of the payload at the conclusion of a maneuver, even when the payload is suspended from a passive rigid link. The swinging is undesirable when performing precision maneuvers, such as aligning an object for mating or positioning a camera. Use of the equations described here enables automated control of the LSMS greatly improving its operational versatility.
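The forward and inverse relations between joint angles and Cartesian tip coordinates described above can be sketched for a planar two-link arm; the link lengths, angles, and elbow-down convention here are illustrative assumptions, not the actual LSMS geometry.

```python
import math

def forward(theta1, theta2, l1=2.0, l2=1.5):
    """Tip (x, y) from joint angles for a planar two-link arm."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse(x, y, l1=2.0, l2=1.5):
    """Closed-form inverse kinematics (elbow-down branch)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for rounding
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2
```

A round trip `inverse(*forward(a, b))` recovers the original angles for the elbow-down branch; trajectory-shaping schemes such as swing-free maneuvering would be layered on top of these relations.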

  5. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code makes use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its unce...

  6. Automation from pictures

    International Nuclear Information System (INIS)

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
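The event-driven state-program idea described above (states, events, and actions driven by external events) can be sketched as a minimal transition table; the state and event names here are illustrative, not taken from the Los Alamos SNL tooling, and the actual system compiles to C rather than Python.

```python
# Transition table: (current state, event) -> (next state, action).
TRANSITIONS = {
    ("idle", "start"): ("running", lambda: print("starting")),
    ("running", "stop"): ("idle", lambda: print("stopping")),
    ("running", "fault"): ("error", lambda: print("entering error state")),
}

def step(state, event):
    """Advance the machine one event: look up the transition and run
    its action; unknown events leave the state unchanged."""
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    if action:
        action()
    return next_state
```

Because each `step` call is driven by an external event, several such state programs can be interleaved on one host processor, as the record describes.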

  7. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  8. Automated Postediting of Documents

    CERN Document Server

    Knight, K; Knight, Kevin; Chander, Ishwar

    1994-01-01

    Large amounts of low- to medium-quality English texts are now being produced by machine translation (MT) systems, optical character readers (OCR), and non-native speakers of English. Most of this text must be postedited by hand before it sees the light of day. Improving text quality is tedious work, but its automation has not received much research attention. Anyone who has postedited a technical report or thesis written by a non-native speaker of English knows the potential of an automated postediting system. For the case of MT-generated text, we argue for the construction of postediting modules that are portable across MT systems, as an alternative to hardcoding improvements inside any one system. As an example, we have built a complete self-contained postediting module for the task of article selection (a, an, the) for English noun phrases. This is a notoriously difficult problem for Japanese-English MT. Our system contains over 200,000 rules derived automatically from online text resources. We report on l...

  9. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  10. Quantitative determination of wine highly volatile sulfur compounds by using automated headspace solid-phase microextraction and gas chromatography-pulsed flame photometric detection. Critical study and optimization of a new procedure.

    Science.gov (United States)

    López, Ricardo; Lapeña, Ana Cristina; Cacho, Juan; Ferreira, Vicente

    2007-03-01

    The quantitative determination of wine volatile sulfur compounds by automated headspace solid-phase microextraction (HS-SPME) with a carboxen-polydimethylsiloxane (CAR-PDMS) fiber and subsequent gas chromatography-pulsed flame photometric detection (GC-PFPD) has been evaluated. The direct extraction of the sulfur compounds in 5 ml of wine was found to suffer from matrix effects and short linear ranges, problems which could not be solved by the use of different internal standards or by multiple headspace SPME. These problems were attributed to saturation of the fiber and to competitive effects between analytes, internal standards and other wine volatiles. Another problem was the oxidation of analytes during the procedure. The reduction in sample volume by a factor of 50 (0.1 ml diluted with water or brine) brought about a reduction in the amount of sulfur compounds taken up by the fiber by a factor of just 3.3. Consequently, a new procedure has been proposed. In a sealed vial containing 4.9 ml of saturated NaCl brine, the air is thoroughly displaced with nitrogen, and the wine (0.1 ml) and the internal standards (0.02 ml) are then introduced with a syringe through the vial septum. This sample is extracted at 35 degrees C for 20 min. This procedure makes possible a satisfactory determination of hydrogen sulfide, methanethiol, ethanethiol, dimethyl sulfide, diethyl sulfide and dimethyl disulfide. The linear dynamic ranges cover the normal ranges of occurrence of these analytes in wine, with typical r2 values between 0.9823 and 0.9980. Reproducibility in real samples ranges from 10 to 20% and repeatability is better than 10% in most cases. The method accuracy is satisfactory, with errors below 20% for hydrogen sulfide and mostly below 10% for the other compounds. The proposed method has been applied to the analysis of 34 Spanish wines.

  11. Advice for Coordination

    DEFF Research Database (Denmark)

    Hankin, Chris; Nielson, Flemming; Nielson, Hanne Riis;

    2008-01-01

    We show how to extend a coordination language with support for aspect oriented programming. The main challenge is how to properly deal with the trapping of actions before the actual data have been bound to the formal parameters. This necessitates dealing with open joinpoints – which is more...

  12. Block coordination copolymers

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2014-11-11

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  13. Recursive Advice for Coordination

    DEFF Research Database (Denmark)

    Terepeta, Michal Tomasz; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Aspect-oriented programming is a programming paradigm that is often praised for the ability to create modular software and separate cross-cutting concerns. Recently aspects have been also considered in the context of coordination languages, offering similar advantages. However, introducing aspects...

  14. Block coordination copolymers

    Science.gov (United States)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2012-11-13

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  15. 76 FR 70721 - Voltage Coordination on High Voltage Grids; Notice of Staff Workshop

    Science.gov (United States)

    2011-11-15

    ... Energy Regulatory Commission Voltage Coordination on High Voltage Grids; Notice of Staff Workshop Take notice that the Federal Energy Regulatory Commission will hold a Workshop on Voltage Coordination on High... improve coordination and optimization of transfer capability across the Bulk-Power System from...

  16. 76 FR 72203 - Voltage Coordination on High Voltage Grids; Notice of Reliability Workshop Agenda

    Science.gov (United States)

    2011-11-22

    ... Energy Regulatory Commission Voltage Coordination on High Voltage Grids; Notice of Reliability Workshop... existing and emerging software to improve coordination and optimization of the Bulk-Power System from a..., 2011. Kimberly D. Bose, Secretary. TN22NO11.003 Staff Workshop on Voltage Coordination on High...

  17. The Automated Logistics Element Planning System (ALEPS)

    Science.gov (United States)

    Schwaab, Douglas G.

    1992-01-01

    ALEPS, which is being developed to provide the SSF program with a computer system to automate logistics resupply/return cargo load planning and verification, is presented. ALEPS will make it possible to simultaneously optimize both the resupply flight load plan and the return flight reload plan for any of the logistics carriers. In the verification mode ALEPS will support the carrier's flight readiness reviews and control proper execution of the approved plans. It will also support the SSF inventory management system by providing electronic block updates to the inventory database on the cargo arriving at or departing the station aboard a logistics carrier. A prototype drawer packing algorithm is described which is capable of generating solutions for 3D packing of cargo items into a logistics carrier storage accommodation. It is concluded that ALEPS will provide the capability to generate and modify optimized loading plans for the logistics elements fleet.
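The drawer-packing idea above can be sketched as a greedy first-fit heuristic; real ALEPS packing is three-dimensional, while this illustrative version checks only total volume, and the function and item names are assumptions for the example.

```python
def pack(items, capacity):
    """Greedy first-fit by decreasing volume.

    items: {name: volume}; capacity: available volume in the carrier.
    Returns (list of packed item names, leftover capacity)."""
    packed, free = [], capacity
    for name, vol in sorted(items.items(), key=lambda kv: -kv[1]):
        if vol <= free:  # place the item if it still fits
            packed.append(name)
            free -= vol
    return packed, free
```

A full 3D packer would additionally track item dimensions and placement coordinates inside each storage accommodation; the greedy skeleton above only conveys the load-plan generation loop.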

  18. A New Approach to an Automated Air Traffic Control

    Institute of Scientific and Technical Information of China (English)

    Patchev Dragoljub

    2014-01-01

    This paper identifies areas of improvement of the air traffic control system and proposes a modification of the concept of automation by using available technologies. With the proposed modification, the current Europe-wide en route network structure can be modified in order to make routes more nearly optimal. For this new route network structure, a new concept of automation will be used to manage the air traffic. The first identified area of improvement is the implementation of an automation process that will enable decentralization of the air traffic control functionality to each individual aircraft; this will be achieved through automated routing of the aircraft and CD&R (conflict detection and resolution). The FMS (flight management system) on the aircraft will make decisions about the optimal flight route based on the sensor inputs and on information about the selection of routes, next hop points and flight levels, all received by ADS-B (Automatic Dependent Surveillance-Broadcast). The second area is processing the information about deviation from the optimal route in the flight plan due to traffic management (vectoring, level change) and taking it into consideration when further actions are undertaken. For each action, a cost factor will be calculated from the fuel burned for that action. This factor will be used to select the conflict resolution protocol. The proposed concept shall increase the capacity of the network and make air traffic more efficient and more environmentally friendly while maintaining safe separation.
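The cost-factor selection described above can be sketched as pricing each deviation action by its fuel burn and choosing the cheapest resolution; the action names and fuel figures below are illustrative assumptions, not values from the paper.

```python
# Assumed per-action fuel costs in kg (illustrative only).
FUEL_PER_ACTION = {"vector_10deg": 18.0, "level_change": 35.0}

def resolution_cost(actions):
    """Total fuel cost of one conflict-resolution manoeuvre sequence."""
    return sum(FUEL_PER_ACTION[a] for a in actions)

def choose_resolution(candidates):
    """Pick the candidate action sequence with the lowest fuel cost."""
    return min(candidates, key=resolution_cost)
```

For example, a sequence of two heading vectors (2 × 18 kg) would lose to a single level change (35 kg) under these assumed rates.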

  19. 23rd International Conference on Flexible Automation & Intelligent Manufacturing

    CERN Document Server

    2013-01-01

    The proceedings includes the set of revised papers from the 23rd International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2013). This conference aims to provide an international forum for the exchange of leading edge scientific knowledge and industrial experience regarding the development and integration of the various aspects of Flexible Automation and Intelligent Manufacturing Systems covering the complete life-cycle of a company’s Products and Processes. Contents will include topics such as: Product, Process and Factory Integrated Design, Manufacturing Technology and Intelligent Systems, Manufacturing Operations Management and Optimization and Manufacturing Networks and MicroFactories.

  20. PDB_REDO: automated re-refinement of X-ray structure models in the PDB

    OpenAIRE

    Joosten, R.P.; Salzemann, J.; Bloch, V.; Stockinger, H.; Berglund, A; Blanchet, C.; Bongcam-Rudloff, E.; Combet, C.; Da Costa, A.L.; Deleage, G.; Diarena, M.; Fabbretti, R.; Fettahi, G.; Flegel, V.; Gisel, A.

    2009-01-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimenta...

  1. Get smart! automate your house!

    NARCIS (Netherlands)

    Van Amstel, P.; Gorter, N.; De Rouw, J.

    2016-01-01

    This "designers' manual" is made during the TIDO-course AR0531 Innovation and Sustainability This manual will help you in reducing both energy usage and costs by automating your home. It gives an introduction to a number of home automation systems that every homeowner can install.

  2. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch;

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  3. Automated separation for heterogeneous immunoassays

    OpenAIRE

    Truchaud, A.; Barclay, J; Yvert, J. P.; Capolaghi, B.

    1991-01-01

    Besides the general requirements for modern automated systems, immunoassay automation involves specific requirements, such as a separation step for heterogeneous immunoassays. Systems are designed according to the solid phase selected: dedicated or open robots for coated tubes and wells, systems nearly similar to chemistry analysers in the case of magnetic particles, and a completely original design for those using porous and film materials.

  4. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  5. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  6. Simulation-based Dynamic Scheduling Optimization for Automated Material Handling Systems in Semiconductor Manufacturing

    Institute of Scientific and Technical Information of China (English)

    Shen Zhenghua; Lu Zhiqiang

    2011-01-01

    A discrete-event simulation model of the intrabay system in semiconductor manufacturing was developed in Arena to implement the dispatching rules. The status of the system was classified into three categories according to two variables, a critical factor and the percentage of waiting lots. A simulation-based genetic algorithm procedure was employed to search for the optimized thresholds of the two variables, as well as the dispatching rules to use under the three different system states, realizing dynamic scheduling of the system. Simulation experiments show that the optimized dynamic scheduling rules outperform the static longest-waiting-time rule.
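The threshold-based regime switching described above can be sketched as follows; the threshold values and rule names are illustrative assumptions, whereas in the study they are tuned by a genetic algorithm against the Arena simulation model.

```python
def classify(critical_factor, pct_waiting, t_cf=0.7, t_wait=0.3):
    """Map the two state variables to a dispatching rule.

    t_cf and t_wait are the (assumed) thresholds that a GA would tune
    by repeatedly evaluating candidate values in simulation."""
    if critical_factor > t_cf:
        return "earliest_due_date"       # heavily loaded regime
    if pct_waiting > t_wait:
        return "longest_waiting_time"    # congestion regime
    return "shortest_travel_distance"    # normal regime
```

The GA's fitness function would run the simulation under a candidate `(t_cf, t_wait, rules)` triple and score the resulting performance indicators.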

  7. Quantification of Aromaticity Based on Interaction Coordinates: A New Proposal.

    Science.gov (United States)

    Pandey, Sarvesh Kumar; Manogaran, Dhivya; Manogaran, Sadasivam; Schaefer, Henry F

    2016-05-12

    Attempts to establish degrees of aromaticity in molecules are legion. In the present study, we begin with a fictitious fragment arising from only those atoms contributing to the aromatic ring and having a force field projected from the original system. For example, in benzene, we adopt a fictitious C6 fragment with a force field projected from the full benzene force field. When one bond or angle is stretched and kept fixed, followed by a partial optimization for all other internal coordinates, structures change from their respective equilibria. These changes are the responses of all other internal coordinates for constraining the bond or angle by unit displacements and relaxing the forces on all other internal coordinates. The "interaction coordinate" derived from the redundant internal coordinate compliance constants measures how a bond (its electron density) responds for constrained optimization when another bond or angle is stretched by a specified unit (its electron density is perturbed by a finite amount). The sum of interaction coordinates (responses) of all bonded neighbors for all internal coordinates of the fictitious fragment is a measure of the strength of the σ and π electron interactions leading to aromatic stability. This sum, based on interaction coordinates, appears to be successful as an aromaticity index for a range of chemical systems. Since the concept involves analyzing a fragment rather than the whole molecule, this idea is more general and is likely to lead to new insights.
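The index described above can be written schematically as a double sum; the notation below (an interaction coordinate k and a bonded-neighbor set N) is assumed for illustration and is not taken from the paper.

```latex
% Schematic form of the proposed aromaticity index (notation assumed):
% k_{ij} = response of internal coordinate j when coordinate i of the
% fictitious fragment is displaced by one unit and all other internal
% coordinates are relaxed (constrained partial optimization).
A \;=\; \sum_{i} \; \sum_{j \in \mathcal{N}(i)} k_{ij}
```

Here the outer sum runs over all internal coordinates of the fictitious fragment and the inner sum over the coordinates of its bonded neighbors, so A aggregates the σ- and π-electron responses that the record associates with aromatic stability.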

  8. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...

  9. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  10. Automated Electrostatics Environmental Chamber

    Science.gov (United States)

    Calle, Carlos; Lewis, Dean C.; Buchanan, Randy K.; Buchanan, Aubri

    2005-01-01

    The Mars Electrostatics Chamber (MEC) is an environmental chamber designed primarily to create atmospheric conditions like those at the surface of Mars to support experiments on electrostatic effects in the Martian environment. The chamber is equipped with a vacuum system, a cryogenic cooling system, an atmospheric-gas replenishing and analysis system, and a computerized control system that can be programmed by the user and that provides both automation and options for manual control. The control system can be set to maintain steady Mars-like conditions or to impose temperature and pressure variations of a Mars diurnal cycle at any given season and latitude. In addition, the MEC can be used in other areas of research because it can create steady or varying atmospheric conditions anywhere within the wide temperature, pressure, and composition ranges between the extremes of Mars-like and Earth-like conditions.

  11. Automated Standard Hazard Tool

    Science.gov (United States)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  12. Automated synthetic scene generation

    Science.gov (United States)

    Givens, Ryan N.

    Physics-based simulations generate synthetic imagery to help organizations anticipate system performance of proposed remote sensing systems. However, manually constructing synthetic scenes which are sophisticated enough to capture the complexity of real-world sites can take days to months depending on the size of the site and desired fidelity of the scene. This research, sponsored by the Air Force Research Laboratory's Sensors Directorate, successfully developed an automated approach to fuse high-resolution RGB imagery, lidar data, and hyperspectral imagery and then extract the necessary scene components. The method greatly reduces the time and money required to generate realistic synthetic scenes and developed new approaches to improve material identification using information from all three of the input datasets.

  13. Automated Essay Scoring

    Directory of Open Access Journals (Sweden)

    Semire DIKLI

    2006-01-01

    Full Text Available Automated Essay Scoring Semire DIKLI Florida State University Tallahassee, FL, USA ABSTRACT The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali, 2004). AES is defined as the computer technology that evaluates and scores written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). Revision and feedback are essential aspects of the writing process. Students need to receive feedback in order to increase their writing quality. However, responding to student papers can be a burden for teachers. Particularly if they have a large number of students and assign frequent writing assignments, providing individual feedback to student essays can be quite time consuming. AES systems can be very useful because they can provide the student with a score as well as feedback within seconds (Page, 2003). Four types of AES systems are widely used by testing companies, universities, and public schools: Project Essay Grader (PEG), Intelligent Essay Assessor (IEA), E-rater, and IntelliMetric. AES is a developing technology. Many AES systems are used to overcome time, cost, and generalizability issues in writing assessment. The accuracy and reliability of these systems have been proven to be high. The search for excellence in machine scoring of essays is continuing and numerous studies are being conducted to improve the effectiveness of the AES systems.

  14. Principles of Coordination Polymerisation

    Science.gov (United States)

    Kuran, Witold

    2001-11-01

    The first all-inclusive text covering coordination polymerisation, including important classes of non-hydrocarbon monomers. Charting the achievements and progress in the field, in terms of both basic and industrial research, this book offers a unified and complete overview of coordination polymerisation. Provides detailed description of the historical development of the subject Presents a unified view of catalysis, mechanisms, structures and utility Encourages learning through a step-by-step progression from basic to in-depth text Features end-of-chapter exercises to reinforce understanding Offers a full bibliography and comprehensive literature review Requisite reading for research students taking introductory and advanced courses in polymer science, catalysis and polymerisation catalysis, and a valuable reference for researchers and technicians in industry.

  15. Motion Planning Based Coordinated Control for Hydraulic Excavators

    Institute of Scientific and Technical Information of China (English)

    GAO Yingjie; JIN Yanchao; ZHANG Qin

    2009-01-01

    The hydraulic excavator is one of the most widely applied types of construction equipment, mainly because of its versatility and mobility. Among the tasks performed by a hydraulic excavator, repeatable level digging or flat surface finishing may take a large percentage. Using automated functions to perform such repeatable and tedious jobs will not only greatly increase overall productivity but, more importantly, also improve operation safety. To investigate the technology without loss of generality, this research creates a coordinated control method for the boom, arm, and bucket cylinders on a hydraulic excavator to perform accurate and effective work. On the basis of a kinematic analysis of the excavator linkage system, the tip trajectory of the end-effector can be determined in terms of the coordinated motion of the three hydraulic cylinders with a visualized method. The coordination of these hydraulic cylinders is realized by coordinately controlling three electro-hydraulic proportional valves. Therefore, the complex control algorithm of a hydraulic excavator can be simplified into coordinated motion control of three individual systems. This coordinated control algorithm was validated on a wheeled hydraulic excavator, and the validation results indicated that the developed control method can satisfactorily accomplish the auto-digging function for level digging or flat surface finishing.
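
The coordinated-control idea above rests on linkage kinematics: a desired bucket-tip trajectory is sampled, and each point is converted into joint (and hence cylinder) commands. As a hedged illustration only, a planar two-link boom/arm approximation with hypothetical link lengths, not the paper's actual linkage model, the inverse kinematics can be sketched as:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics (boom + arm approximation).

    Returns joint angles (theta1, theta2) placing the arm tip at (x, y).
    Single (elbow) branch via acos; raises ValueError if unreachable.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                 # arm (elbow) angle
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # boom angle
    return theta1, theta2

# Level-digging pass: sample points along a flat line at constant depth.
trajectory = [(3.0 + 0.1 * i, -1.0) for i in range(5)]
angles = [two_link_ik(x, y, l1=2.5, l2=2.0) for x, y in trajectory]
```

In a real excavator each joint angle would further be mapped to a cylinder stroke through the linkage geometry; that mapping is omitted here.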

  16. Communication and interference coordination

    OpenAIRE

    Blasco-Serrano, Ricardo; Thobaben, Ragnar; Skoglund, Mikael

    2014-01-01

    We study the problem of controlling the interference created to an external observer by a communication process. We model the interference in terms of its type (empirical distribution), and we analyze the consequences of placing constraints on the admissible type. Considering a single interfering link, we characterize the communication-interference capacity region. Then, we look at a scenario where the interference is jointly created by two users allowed to coordinate their actions prior to...

  17. International Monetary Policy Coordination

    OpenAIRE

    Carlberg, Michael

    2005-01-01

    This paper studies the international coordination of monetary policies in the world economy. It carefully discusses the process of policy competition and the structure of policy cooperation. As to policy competition, the focus is on monetary competition between Europe and America. Similarly, as to policy cooperation, the focus is on monetary cooperation between Europe and America. The spillover effects of monetary policy are negative. The policy targets are price stability and full employment.

  18. Global coordination: weighted voting

    Directory of Open Access Journals (Sweden)

    Jan-Erik Lane

    2014-03-01

    Full Text Available In order to halt the depletion of global ecological capital, a number of different kinds of meetings between the governments of the world's countries have been scheduled. The need for global coordination of environmental policies has become ever more obvious, supported by more and more evidence of the running down of ecological capital. But no formal or binding arrangements are in sight, as global environmental coordination suffers from high transaction costs (qualitative voting). The CO2-equivalent emissions that result in global warming are driven by the unstoppable economic expansion of the global market economy, which employs mainly fossil-fuel-generated energy, although at the same time it is lifting sharply the GDP per capita of several emerging countries. Only global environmental coordination on the successful model of the World Bank and the IMF (quantitative voting) can stem the rising emissions numbers and stop further environmental degradation. However, the system of weighted voting in the WB and the IMF must be reformed by reducing the excessive voting power disparities, for instance by reducing all member country votes by the cube root expression.
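
The cube-root compression mentioned at the end of the abstract can be illustrated with a small numerical sketch. The vote counts below are hypothetical and do not correspond to actual World Bank or IMF quotas:

```python
# Hypothetical raw vote counts for three member countries.
votes = {"A": 831_000, "B": 64_000, "C": 8_000}

def cube_root_weights(raw):
    """Compress raw vote counts by the cube root and renormalise to shares."""
    compressed = {k: v ** (1.0 / 3.0) for k, v in raw.items()}
    total = sum(compressed.values())
    return {k: c / total for k, c in compressed.items()}

shares = cube_root_weights(votes)
# Country A's raw share is about 92%; after cube-root compression
# its share drops to roughly 61%, reducing the voting power disparity.
```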

  19. Improving Project Manufacturing Coordination

    Directory of Open Access Journals (Sweden)

    Korpivaara Ville

    2014-09-01

    Full Text Available The objective of this research is to develop firms’ project manufacturing coordination. The development is carried out by centralizing the manufacturing information flows in one system. To be able to centralize information, a deep user need assessment is required. After user needs have been identified, the existing system is developed to match these needs. The theoretical background is achieved through exploring the literature of project manufacturing, development project success factors, and frameworks and tools for development project execution. The focus of this research is on customer need assessment rather than on the system’s technical details. To ensure a deep understanding of customer needs, this study is executed using the action research method. As a result of this research, the information system for project manufacturing coordination was developed to respond to the revealed needs of the stakeholders. The new system improves the quality of the manufacturing information, eliminates waste in manufacturing coordination processes, and offers better visibility into project manufacturing. Hence it provides a solid base for the further development of project manufacturing.

  20. A fully-automated software pipeline for integrating breast density and parenchymal texture analysis for digital mammograms: parameter optimization in a case-control breast cancer risk assessment study

    Science.gov (United States)

    Zheng, Yuanjie; Wang, Yan; Keller, Brad M.; Conant, Emily; Gee, James C.; Kontos, Despina

    2013-02-01

    Estimating a woman's risk of breast cancer is becoming increasingly important in clinical practice. Mammographic density, estimated as the percent of dense (PD) tissue area within the breast, has been shown to be a strong risk factor. Studies also support a relationship between mammographic texture and breast cancer risk. We have developed a fully-automated software pipeline for computerized analysis of digital mammography parenchymal patterns by quantitatively measuring both breast density and texture properties. Our pipeline combines advanced computer algorithms of pattern recognition, computer vision, and machine learning and offers a standardized tool for breast cancer risk assessment studies. Unlike many existing methods that perform parenchymal texture analysis within specific breast subregions, our pipeline extracts texture descriptors for points on a regular spatial lattice, from a window surrounding each lattice point, to characterize the local mammographic appearance throughout the whole breast. To demonstrate the utility of our pipeline, and optimize its parameters, we perform a case-control study by retrospectively analyzing a total of 472 digital mammography studies. Specifically, we investigate the window size, which is a lattice-related parameter, and compare the performance of texture features to that of breast PD in classifying case-control status. Our results suggest that different window sizes may be optimal for raw (12.7 mm2) versus vendor post-processed images (6.3 mm2). We also show that the combination of PD and texture features outperforms PD alone. The improvement is significant (p=0.03) when raw images and a window size of 12.7 mm2 are used, with an ROC AUC of 0.66. The combination of PD and our texture features computed from post-processed images with a window size of 6.3 mm2 achieves an ROC AUC of 0.75.
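
To make the lattice-plus-window idea concrete, here is a hedged sketch in NumPy. Local variance stands in for the pipeline's actual texture descriptors, and a random array stands in for a mammogram; the spacing and window values are illustrative, not the study's optimized parameters:

```python
import numpy as np

def lattice_texture(img, spacing, window):
    """Compute a simple texture descriptor (local variance) on a regular
    lattice of points, each summarising a surrounding square window."""
    half = window // 2
    rows = range(half, img.shape[0] - half, spacing)
    cols = range(half, img.shape[1] - half, spacing)
    feats = np.array([
        [img[r - half:r + half + 1, c - half:c + half + 1].var() for c in cols]
        for r in rows
    ])
    return feats

rng = np.random.default_rng(0)
mammogram = rng.random((256, 256))   # stand-in for a breast image
features = lattice_texture(mammogram, spacing=16, window=15)
```

Each lattice point yields one feature value here; the real pipeline computes many descriptors per window and feeds them into a classifier alongside percent density.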

  1. Integrated Microreactors for Reaction Automation: New Approaches to Reaction Development

    Science.gov (United States)

    McMullen, Jonathan P.; Jensen, Klavs F.

    2010-07-01

    Applications of microsystems (microreactors) in continuous-flow chemistry have expanded rapidly over the past two decades, with numerous reports of higher conversions and yields compared to conventional batch benchtop equipment. Synthesis applications are enhanced by chemical information gained from integrating microreactor components with sensors, actuators, and automated fluid handling. Moreover, miniaturized systems allow experiments on well-defined samples at conditions not easily accessed by conventional means, such as reactions at high pressures and temperatures. The wealth of synthesis information that could potentially be acquired through use of microreactors integrated with physical sensors and analytical chemistry techniques for online reaction monitoring has not yet been well explored. The increased efficiency resulting from use of continuous-flow microreactor platforms to automate reaction screening and optimization encourages a shift from current batchwise chemical reaction development to this new approach. We review advances in this new area and provide application examples of online monitoring and automation.

  2. Automated Tuning of the Advanced Photon Source Booster Synchrotron

    Science.gov (United States)

    Biedron, S. G.; Carwardine, J. A.; Milton, S. V.

    1997-05-01

    The acceleration cycle of the Advanced Photon Source (APS) booster synchrotron is completed within 250 ms and is repeated at 2 Hz. Unless properly corrected, transverse and longitudinal injection errors can lead to inefficient booster performance. Ramped-magnet tracking errors can also lead to losses during the acceleration cycle. In order to simplify daily operation, automated tuning methods have been developed. Through the use of empirically determined response functions, transfer line corrector magnets, and beam position monitor readings, the injection process is optimized by correcting the first turn trajectory to the measured closed orbit. An automated version of this correction technique has been implemented using the feedback-based program sddscontrollaw. Further automation is used to adjust and minimize tracking errors between the five main ramped power supplies. These tuning algorithms and their implementation are described here along with an evaluation of their performance.
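
The response-function correction described above is, at its core, a least-squares problem: find corrector kicks whose predicted effect best cancels the measured trajectory error. A hedged toy sketch follows, with a random stand-in response matrix and error vector rather than APS measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical measured response matrix: change in BPM reading (mm) per
# unit corrector kick, for 12 BPMs and 4 transfer-line correctors.
R = rng.normal(size=(12, 4))

# Difference between the first-turn trajectory and the measured closed orbit.
orbit_error = rng.normal(scale=0.5, size=12)

# Least-squares corrector kicks that best cancel the trajectory error.
kicks, *_ = np.linalg.lstsq(R, -orbit_error, rcond=None)
residual = orbit_error + R @ kicks   # what remains after correction
```

With more BPMs than correctors the system is overdetermined, so the residual is generally nonzero but never larger than the uncorrected error.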

  3. Dynamic coordinated optimization of production-distribution under stage satisfaction degree

    Institute of Scientific and Technical Information of China (English)

    张雷; 孔艳岩

    2012-01-01

    This paper introduces the product stage satisfaction degree into a supply chain network optimization model as a decision variable. To capture the different production modes caused by differences in stage demand characteristics over a perishable electronic product's life cycle, and considering the dynamic character of the stage satisfaction degree, a 0-1 mixed-integer optimization model is built whose objective function maximizes profit over the single product's entire life cycle. A solution procedure is programmed using Lingo 9.0 software. Finally, the model is analyzed with examples; the results show that the proposed model is rational and effective.

  4. A Research on Area Coordinating and Optimizing of Amusement Park by Using Plant Simulation

    Institute of Scientific and Technical Information of China (English)

    曹欢欢; 许志沛

    2016-01-01

    In order to maximize customers' satisfaction and improve the operators' economic benefit, the layout plans and mutual coordination of the typical amusement equipment and visitor routes in an amusement park were analyzed and optimized. Visitor routes and the operation of the amusement equipment were simulated with Plant Simulation under different customer proportions, holiday periods, and amusement equipment configurations. The advantages and disadvantages of coordination among functional areas and between items of amusement equipment were analyzed, ways of optimizing the layout of the amusement equipment were explored, and targeted, reasonable suggestions were proposed as a reference for the planning and operational adjustment of similar parks.

  5. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 inch film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  6. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C₈MIM]NTf₂) is formed through the reaction between [C₈MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf₂) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion-exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL⁻¹. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL⁻¹. The proposed

  7. Automated Fluid Interface System (AFIS)

    Science.gov (United States)

    1990-01-01

    Automated remote fluid servicing will be necessary for future space missions, as future satellites will be designed for on-orbit consumable replenishment. In order to develop an on-orbit remote servicing capability, a standard interface between a tanker and the receiving satellite is needed. The objective of the Automated Fluid Interface System (AFIS) program is to design, fabricate, and functionally demonstrate compliance with all design requirements for an automated fluid interface system. A description and documentation of the Fairchild AFIS design is provided.

  8. Exploring the Lived Experiences of Program Managers Regarding an Automated Logistics Environment

    Science.gov (United States)

    Allen, Ronald Timothy

    2014-01-01

    Automated Logistics Environment (ALE) is a new term used by Navy and aerospace industry executives to describe the aggregate of logistics-related information systems that support modern aircraft weapon systems. The development of logistics information systems is not always well coordinated among programs, often resulting in solutions that cannot…

  9. Optimal Reorientation Of Spacecraft Orbit

    Directory of Open Access Journals (Sweden)

    Chelnokov Yuriy Nikolaevich

    2014-06-01

    Full Text Available The problem of the optimal reorientation of a spacecraft's orbit is considered. To solve the problem, we used quaternion equations of motion written in a rotating coordinate system. The use of quaternion variables makes this treatment more efficient. The optimal control problem is solved on the basis of the maximum principle. An example of a numerical solution of the problem is given.
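
As a hedged aside on why quaternion variables are convenient here: rotating any orbit-fixed vector is a single quaternion conjugation q v q*, free of the trigonometric singularities of Euler angles. The sketch below is purely illustrative and is not the paper's equations of motion:

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate vector v about a unit axis by `angle` via q v q*."""
    s, c = math.sin(angle / 2.0), math.cos(angle / 2.0)
    q = (c, axis[0]*s, axis[1]*s, axis[2]*s)
    qc = (q[0], -q[1], -q[2], -q[3])          # conjugate (inverse for unit q)
    return qmul(qmul(q, (0.0, *v)), qc)[1:]

# Tilt the orbit normal (z-axis) by 30 degrees about the x-axis.
normal = rotate((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), math.radians(30.0))
```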

  10. Information-driven coordination: experimental results with heterogeneous individuals

    OpenAIRE

    Semeshenko, Viktoriya; Garapin, Alexis; Ruffieux, Bernard; Mirta B. Gordon

    2009-01-01

    We study experimentally a coordination game with N heterogeneous individuals under different information treatments. We explore the effects of information on the emergence of Pareto-efficient outcomes, by means of a gradual decrease of the information content provided to the players in successive experiments. We observe that successful coordination is possible with private information alone, although not on a Pareto-optimal equilibrium. Reinforcement-based learning models reproduce the qualit...

  11. Contribution to MPC coordination of distributed and power generation systems

    OpenAIRE

    Sandoval Moreno, John Anderson

    2014-01-01

    This thesis is mainly about the coordination of distributed systems, with special attention to multi-energy electric power generation systems. For purposes of optimality, as well as constraint enforcement, Model Predictive Control (MPC) is chosen as the underlying tool, while wind turbines, fuel cells, photovoltaic panels, and hydroelectric plants are mostly considered as the power sources to be controlled and coordinated. In the first place, an application of MPC to a micro-grid system is proposed, i...

  12. Rowing Crew Coordination Dynamics at Increasing Stroke Rates

    OpenAIRE

    Cuijpers, Laura S.; Zaal, Frank T.J.M.; De Poel, Harjo J.

    2015-01-01

    In rowing, perfect synchronisation is important for optimal performance of a crew. Remarkably, a recent study on ergometers demonstrated that antiphase crew coordination might be mechanically more efficient by reducing the power lost to within-cycle velocity fluctuations of the boat. However, coupled oscillator dynamics predict the stability of the coordination to decrease with increasing stroke rate, which in case of antiphase may eventually yield breakdowns to in-phase. Therefore, this stud...

  13. Scalable Coordinated Beamforming for Dense Wireless Cooperative Networks

    OpenAIRE

    Shi, Yuanming; Zhang, Jun; Letaief, Khaled B.

    2014-01-01

    To meet the ever growing demand for both high throughput and uniform coverage in future wireless networks, dense network deployment will be ubiquitous, for which cooperation among the access points is critical. Considering the computational complexity of designing coordinated beamformers for dense networks, low-complexity and suboptimal precoding strategies are often adopted. However, it is not clear how much performance loss will be caused. To enable optimal coordinated beamforming, in thi...

  14. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analytical analysis, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs

  15. Symmetric two-coordinate photodiode

    Directory of Open Access Journals (Sweden)

    Dobrovolskiy Yu. G.

    2008-12-01

    Full Text Available A two-coordinate photodiode based on the longitudinal photoeffect is developed and investigated; it yields coordinate characteristics that are symmetric in steepness and longitudinal resistance to great exactness. It was shown that the best form of the coordinate characteristic is observed when the optical probe scans the central part of the photosensitive element. Ways of improving the steepness and linearity of its coordinate characteristic are analyzed.

  16. Invariant Manifolds and Collective Coordinates

    CERN Document Server

    Papenbrock, T

    2001-01-01

    We introduce suitable coordinate systems for interacting many-body systems with invariant manifolds. These are Cartesian in coordinate and momentum space and chosen such that several components are identically zero for motion on the invariant manifold. In this sense these coordinates are collective. We make a connection to Zickendraht's collective coordinates and present certain configurations of few-body systems where rotations and vibrations decouple from single-particle motion. These configurations do not depend on details of the interaction.

  17. Communication, leadership and coordination failure

    OpenAIRE

    Dong, Lu; Montero, Maria; Possajennikov, Alex

    2015-01-01

    Using experimental methods, this paper investigates the limits of communication and leadership in aiding group coordination in a minimum effort game. Choosing the highest effort is the payoff dominant Nash equilibrium in this game, and communication and leadership are expected to help in coordinating on such an equilibrium. We consider an environment in which the benefits of coordination are low compared to the cost of mis-coordination. In this environment, players converge to the most ineffi...

  18. AUTOMATION OF BUISNESS-PROCESSES OF A TRAINING CENTER

    Directory of Open Access Journals (Sweden)

    Kovalenko A. V.

    2015-06-01

    Full Text Available Modern Russian companies have realized the need for document flow automation not only as a means of keeping documents in order, but also as a tool for optimizing expenses and as an aid in making administrative decisions. For a long time the Russian market of information systems had no software products intended for educational institutions; the majority of automated systems are intended for enterprises active in trade and production. By comparison, the list of software products for commercial training centers is small, and even the existing product lines cannot be said to meet all the requirements of companies in this field. In creating the automated system for a training center, the existing software products intended for the automation of training centers and related institutions were analyzed, and a number of features of their activity were revealed. The article describes the developed automated document flow information system for a commercial educational institution, namely the "Training center" configuration developed on the "1C: Enterprise 8.2" platform. The developed program complex serves as an administrative tool for analyzing the economic activity of a training center, scheduling the training equipment and teaching staff, and calculating payroll taking into account the specifics of the industry.

  19. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
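
The "preprogrammed to take action based on a demand response signal" idea can be sketched as a lookup from signal level to control actions. This is a conceptual toy only: the event class, signal levels, and action fields below are simplified stand-ins and do not follow the actual OpenADR schema:

```python
from dataclasses import dataclass

@dataclass
class DemandResponseEvent:
    """Simplified stand-in for a DR signal (not the OpenADR data model)."""
    signal_level: str   # e.g. "NORMAL", "MODERATE", "HIGH"

# Preprogrammed shed strategy: each signal level maps to control actions,
# so an event can be handled with no manual intervention.
SHED_STRATEGY = {
    "NORMAL":   {"hvac_setpoint_offset_f": 0, "dim_lighting_pct": 0},
    "MODERATE": {"hvac_setpoint_offset_f": 2, "dim_lighting_pct": 20},
    "HIGH":     {"hvac_setpoint_offset_f": 4, "dim_lighting_pct": 40},
}

def respond(event):
    """Fully automated response: look up the preprogrammed action."""
    return SHED_STRATEGY[event.signal_level]

actions = respond(DemandResponseEvent(signal_level="MODERATE"))
```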

  20. Data Assimilation by delay-coordinate nudging

    Science.gov (United States)

    Pazo, Diego; Lopez, Juan Manuel; Carrassi, Alberto

    2016-04-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time-step. Numerical experiments with a low-order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, even when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, the intuitive functioning, and the reduced computational cost of standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit the use of more sophisticated data assimilation procedures.
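
A hedged sketch of the idea on the Lorenz-63 system, a common low-order chaotic testbed: standard nudging forces the model with the current observation mismatch, and the delay-coordinate term adds a forcing built from the observation and model state a few steps in the past. The coefficients below are un-optimized and the formulation is one simple variant, not necessarily the authors':

```python
import numpy as np

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 right-hand side."""
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(truth0, model0, steps=2000, dt=0.01, k_now=20.0, k_delay=5.0, delay=5):
    """Nudge a model copy toward a 'truth' trajectory: present-mismatch term
    plus a delayed term from `delay` steps ago (observations: x only)."""
    truth, model = truth0.astype(float), model0.astype(float)
    obs_hist, mod_hist = [], []
    for i in range(steps):
        obs_hist.append(truth[0])
        mod_hist.append(model[0])
        forcing = k_now * (obs_hist[-1] - mod_hist[-1])
        if i >= delay:
            forcing += k_delay * (obs_hist[-1 - delay] - mod_hist[-1 - delay])
        truth = truth + dt * lorenz(truth)                     # Euler step
        model = model + dt * (lorenz(model) + np.array([forcing, 0.0, 0.0]))
    return float(np.linalg.norm(truth - model))

# Start the model far from the truth and let the nudging pull it in.
err = run(np.array([1.0, 1.0, 1.0]), np.array([5.0, -5.0, 20.0]))
```

Both forcing terms vanish once the model tracks the observations, so the nudged model can synchronize with the truth.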

  1. Modeling Students' Units Coordinating Activity

    OpenAIRE

    Boyce, Steven James

    2014-01-01

    Primarily via constructivist teaching experiment methodology, units coordination (Steffe, 1992) has emerged as a useful construct for modeling students' psychological constructions pertaining to several mathematical domains, including counting sequences, whole number multiplicative conceptions, and fractions schemes. I describe how consideration of units coordination as a Piagetian (1970b) structure is useful for modeling units coordination across contexts. In this study, I extend teaching ...

  2. Coordination Hydrothermal Interconnection Java-Bali Using Simulated Annealing

    Science.gov (United States)

    Wicaksono, B.; Abdullah, A. G.; Saputra, W. S.

    2016-04-01

    Hydrothermal power plant coordination aims to minimize the total operating cost of the system, represented by fuel cost, subject to constraints during optimization. Several methods can be used to perform the optimization. Simulated Annealing (SA) is a method that can be used to solve such optimization problems; it was inspired by the annealing, or cooling, process in the manufacture of crystalline materials. The basic principle of hydrothermal power plant coordination is to use the hydro power plants to supply the base load while the thermal power plants supply the remaining load. This study used two hydro power plant units and six thermal power plant units on a 25-bus system, calculating transmission losses and considering the power limits of each unit, aided by MATLAB software. Hydrothermal power plant coordination using simulated annealing showed a total generation cost for 24 hours of 13,288,508.01.
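
A hedged toy version of the approach is sketched below. The quadratic fuel-cost coefficients and demand are hypothetical, and transmission losses plus the hydro schedule are collapsed into a fixed residual demand, unlike the 25-bus study above:

```python
import math
import random

random.seed(42)

# Hypothetical fuel-cost coefficients (a + b*P + c*P^2) and limits (MW)
# for three thermal units; hydro is assumed to cover a fixed base load.
UNITS = [  # (a, b, c, p_min, p_max)
    (200.0, 7.0, 0.008, 50.0, 300.0),
    (180.0, 6.3, 0.009, 50.0, 250.0),
    (140.0, 6.8, 0.007, 40.0, 200.0),
]
DEMAND = 450.0  # MW remaining after the hydro contribution

def cost(p):
    """Total fuel cost plus a heavy penalty for power imbalance."""
    fuel = sum(a + b * x + c * x * x for (a, b, c, _, _), x in zip(UNITS, p))
    return fuel + 1e4 * abs(sum(p) - DEMAND)

def anneal(steps=20000, t0=500.0):
    p = [u[3] + random.random() * (u[4] - u[3]) for u in UNITS]
    best, best_c = p[:], cost(p)
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-6            # linear cooling schedule
        q = p[:]
        j = random.randrange(len(UNITS))             # perturb one unit
        q[j] = min(UNITS[j][4], max(UNITS[j][3], q[j] + random.gauss(0, 5)))
        dc = cost(q) - cost(p)
        if dc < 0 or random.random() < math.exp(-dc / t):  # Metropolis rule
            p = q
            if cost(p) < best_c:
                best, best_c = p[:], cost(p)
    return best, best_c

dispatch, total_cost = anneal()
```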

  3. National Automated Conformity Inspection Process

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  4. Evolution of Home Automation Technology

    Directory of Open Access Journals (Sweden)

    Mohd. Rihan

    2009-01-01

    Full Text Available In modern society home and office automation has become increasingly important, providing ways to interconnect various home appliances. This interconnection results in faster transfer of information within homes/offices, leading to better home management and an improved user experience. Home automation, in essence, is a technology that integrates the various electrical systems of a home to provide enhanced comfort and security. Users are granted convenient and complete control over all the electrical home appliances and are relieved of the tasks that previously required manual control. This paper tracks the development of home automation technology over the last two decades. Various home automation technologies are explained briefly, giving a chronological account of the evolution of one of the most talked-about technologies of recent times.

  5. Automation of antimicrobial activity screening.

    Science.gov (United States)

    Forry, Samuel P; Madonna, Megan C; López-Pérez, Daneli; Lin, Nancy J; Pasco, Madeleine D

    2016-03-01

    Manual and automated methods were compared for routine screening of compounds for antimicrobial activity. Automation generally accelerated assays and required less user intervention while producing comparable results. Automated protocols were validated for planktonic, biofilm, and agar cultures of the oral microbe Streptococcus mutans that is commonly associated with tooth decay. Toxicity assays for the known antimicrobial compound cetylpyridinium chloride (CPC) were validated against planktonic, biofilm forming, and 24 h biofilm culture conditions, and several commonly reported toxicity/antimicrobial activity measures were evaluated: the 50 % inhibitory concentration (IC50), the minimum inhibitory concentration (MIC), and the minimum bactericidal concentration (MBC). Using automated methods, three halide salts of cetylpyridinium (CPC, CPB, CPI) were rapidly screened with no detectable effect of the counter ion on antimicrobial activity. PMID:26970766
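Of the toxicity measures listed in this record, the IC50 is the one most naturally obtained by curve fitting: it is the concentration at which the fitted dose-response curve crosses 50% inhibition. The sketch below fits a two-parameter Hill curve by a coarse grid search; the synthetic data points, grid ranges, and units are assumptions for the example, not values from the study.

```python
import math

def hill(conc, ic50, slope):
    """Fractional growth at a given concentration under a Hill model;
    equals 0.5 exactly when conc == ic50."""
    return 1.0 / (1.0 + (conc / ic50) ** slope)

def fit_ic50(concs, responses):
    """Coarse grid search for the (ic50, slope) pair minimizing squared error."""
    best = (None, None, float("inf"))
    for i in range(1, 200):
        ic50 = i * 0.5            # candidate IC50s: 0.5 .. 99.5 (arbitrary units)
        for j in range(1, 41):
            slope = j * 0.1       # candidate Hill slopes: 0.1 .. 4.0
            err = sum((hill(c, ic50, slope) - r) ** 2
                      for c, r in zip(concs, responses))
            if err < best[2]:
                best = (ic50, slope, err)
    return best[0], best[1]

# Synthetic dose-response data generated from IC50 = 10, slope = 1.5.
concs = [1.0, 3.0, 10.0, 30.0, 100.0]
responses = [hill(c, 10.0, 1.5) for c in concs]
```

A production pipeline would use a proper nonlinear least-squares routine rather than a grid, but the grid version keeps the definition of the IC50 readout visible.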

  6. Automating the Purple Crow Lidar

    Science.gov (United States)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short- and long-term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to characterize the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  7. Home automation with Intel Galileo

    CERN Document Server

    Dundar, Onur

    2015-01-01

    This book is for anyone who wants to learn Intel Galileo for home automation and cross-platform software development. No knowledge of programming with Intel Galileo is assumed, but knowledge of the C programming language is essential.

  8. Work Coordination Engine

    Science.gov (United States)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing to the SMDB persistent store. The software leverages the use of a relational database, Java Messaging System (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
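The dispatch-and-reschedule behavior described in this record can be sketched with a minimal in-memory queue. This is an illustrative toy, not the actual WCE (which uses JMS messaging, SOAP web services, and an SMDB persistent store): the worker registry, retry rule, and state names below are assumptions.

```python
from collections import deque

class Dispatcher:
    """Toy work-order dispatcher: tries each registered worker in turn and
    reschedules an order on the next worker if the current one fails."""

    def __init__(self, workers):
        self.workers = workers          # name -> callable(order) -> result
        self.states = {}                # order id -> lifecycle state
        self.results = {}               # order id -> stored result

    def run(self, orders):
        # Each queue entry carries the order plus the workers not yet tried.
        queue = deque((order, list(self.workers)) for order in orders)
        while queue:
            order, remaining = queue.popleft()
            if not remaining:
                self.states[order] = "FAILED"    # no worker could run it
                continue
            name = remaining.pop(0)
            self.states[order] = "DISPATCHED"
            try:
                self.results[order] = self.workers[name](order)
                self.states[order] = "COMPLETED"
            except Exception:
                # Reschedule on a different server, as WCE does automatically.
                queue.append((order, remaining))
        return self.states
```

In the real system the queue entries arrive as asynchronous messages and the worker set spans heterogeneous hosts, but the lifecycle (dispatched, completed, rescheduled on failure) follows the same pattern.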

  9. Towards automated traceability maintenance.

    Science.gov (United States)

    Mäder, Patrick; Gotel, Orlena

    2012-10-01

    Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
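The rule-driven update step described in this record, recognized development activities matched against predefined rules that direct changes to impacted trace links, can be sketched as a tiny rule table. The activity types, actions, and link representation below are invented for illustration; the actual approach operates on UML change events from a third-party modeling tool.

```python
# Hypothetical rule table: activity type -> action on impacted trace links.
RULES = {
    "element_renamed": "retarget",
    "element_deleted": "remove",
    "element_split":   "flag_for_review",
}

def update_links(links, activity):
    """links: list of (source, target) trace relations.
    activity: dict with 'type', the affected element 'old', and (for
    retargeting) its replacement 'new'."""
    out = []
    for src, dst in links:
        if dst != activity["old"]:
            out.append((src, dst))              # link not impacted
        elif RULES[activity["type"]] == "retarget":
            out.append((src, activity["new"]))  # follow the renamed element
        elif RULES[activity["type"]] == "remove":
            pass                                # drop the decayed link
        else:
            out.append((src, dst + "?"))        # mark for manual review
    return out
```

The "semi-automated" character of the approach shows up in the last branch: when no rule applies cleanly, the link is flagged for a human rather than silently changed.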

  10. Automated Gas Distribution System

    Science.gov (United States)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few, and prized, cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of ion sources that supplies the cyclotron with particles for acceleration. Using the machine involves a time-consuming and wasteful step-by-step process of switching gases, purging, and other important operations that must be performed manually to keep the system functioning properly while maintaining a safe working environment. The new gas distribution system for the ion source prevents many of the problems created by the older manual setup. The system can still be controlled manually, more easily than before, but like most of the technology and machines in the cyclotron it is mainly operated through software developed in the graphical programming environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, decreasing the amount of gas wasted when switching gases, and a port for the vacuum, decreasing the time spent purging the manifold. The LabVIEW software makes operation of the cyclotron and ion sources easier and safer for anyone to use.

  11. Memory Power Optimization Policy of Coordinating Page Allocation and Group Scheduling

    Institute of Scientific and Technical Information of China (English)

    贾刚勇; 万健; 李曦; 蒋从锋; 代栋

    2014-01-01

    Main memory accounts for a large and increasing fraction of the energy consumption in multi-core systems, so addressing the power consumed by the memory subsystem is critical to system-wide power optimization. This paper presents a solution that improves memory power efficiency by coordinating page allocation and thread group scheduling (CAS). Under the proposed page allocation, threads are partitioned into groups according to their memory address spaces and a load-balancing policy, and threads in the same group are allocated physical pages from the same memory rank. Scheduling is then performed group by group, implemented by adjusting the default Linux CFS scheduler. CAS periodically activates only the memory ranks currently needed, allowing temporarily unused ranks to be placed in a low-power state and prolonging the idle periods of those ranks. Simulation results show that, compared with similar approaches, CAS reduces memory power consumption by an average of 10% while improving performance by 8%.
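The page-allocation half of CAS, grouping threads so that each group's pages sit on a single memory rank, amounts to a balanced partition of threads across ranks. The sketch below uses a simple greedy heuristic; the thread weights and rank count are made-up inputs, and the real policy also accounts for address-space affinity.

```python
def group_threads(threads, n_ranks):
    """Greedy load-balanced partition: assign each thread (name, weight) to the
    currently lightest rank, so co-grouped threads share one memory rank and the
    remaining ranks can be put into a low-power state while their group is not
    scheduled."""
    groups = [[] for _ in range(n_ranks)]
    loads = [0.0] * n_ranks
    # Placing heavy threads first is the classic greedy balancing trick.
    for name, weight in sorted(threads, key=lambda t: -t[1]):
        rank = loads.index(min(loads))   # lightest rank so far
        groups[rank].append(name)
        loads[rank] += weight
    return groups, loads
```

The scheduler side of CAS then runs one group at a time, so at any instant only that group's rank needs to be in the active power state.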

  12. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    2006-01-01

    This book introduces the basic concepts of one of the most widely studied branches of artificial intelligence today: machine learning. It covers topics such as inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning, and theoretical approaches to machine learning.

  13. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  14. Technology modernization assessment flexible automation

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to revise its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on the results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  15. OA Research on Coordination Methods of Course Teaching and Practical Teaching Based on Critical Chain Optimization

    Institute of Scientific and Technical Information of China (English)

    朱卫未; 黄阳; 李彦东

    2012-01-01

    As the two major modules of the higher-education system, course teaching and practical teaching often conflict along the dimensions of time, logic, and cognitive coordination, which lowers teaching quality. Starting from these three dimensions, this paper first establishes a generally applicable method for evaluating teaching performance; it then proposes concrete teaching-optimization steps based on the critical chain method from multi-project management; finally, taking the Business Administration curriculum as a case, it verifies the effect of the teaching optimization using the aforementioned performance evaluation method.

  16. The European NEO Coordination Centre

    Science.gov (United States)

    Perozzi, E.; Borgia, B.; Micheli, M.

    An operational approach to NEO (Near-Earth Object) hazard monitoring has been developed at the European level within the framework of the Space Situational Awareness (SSA) Program of the European Space Agency (ESA). By federating European assets and profiting from the expertise developed in European universities and research centers, it has been possible to begin deploying the so-called SSA NEO Segment. This initiative aims to contribute significantly to the worldwide effort on the discovery, follow-up, and characterization of the near-Earth object population. A major achievement was the inauguration in May 2013 of the ESA NEO Coordination Centre, located at ESRIN (Frascati, Italy). The goal of the NEOCC Precursor Service operations is twofold: to make available updated information on the NEO population and the associated hazard, and to help optimize NEO observational efforts. This is done by maintaining and improving a Web portal publicly available at http://neo.ssa.esa.int and by performing follow-up observations through a network of collaborating telescopes and facilities. An overview of the SSA-NEO System and a summary of the first two years of NEOCC operations are presented.

  17. Coordination Processes in International Organisations

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    2008-01-01

    The EU is not a member of the International Labour Organisation (ILO), but relatively elaborate EU coordination takes place anyway. This paper addresses two research questions: 1) How is it possible to evaluate the coordination of the EU in its specific observable configuration in the ILO? and 2) … to coordinate relatively elaborate agreements due to the strength of its coordination as far as professional or technical and political activities (excepting the ILO budget) are concerned. In other, more clear-cut or 'simple' policy areas such as the ILO budget, EU coordination is weak: this contrast …

  18. Automatic generation of trajectory variants applied to the optimal design of water supply systems under multiple criteria

    Directory of Open Access Journals (Sweden)

    J. R. Hechavarría Hernández

    2007-05-01

    Full Text Available Determining the most efficient trajectories for networks, installations, or transport routes is a problem that motivates researchers across many engineering disciplines: computer science, civil, mechanical, hydraulic, etc. Solutions must be built on a high degree of information integration during the analysis and study of the task, on the application of modern methods of preparation and decision making, and on the rational organization of engineering calculation procedures. When laying out trajectories in engineering projects, specific conditions of the environment must be taken into account. These trajectories, despite their different designations, may coincide in certain zones and share limited spaces. For this reason, a system for the automatic generation of trajectory variants must consider the limitations of the available space when establishing the limiting types and dimensions. The article presents a procedure, supported by a software system, that automatically obtains variants of closed trajectories which, depending on their intended service, are optimized under efficiency criteria. Keywords: CAD system, trajectory generation, hydraulic network, optimal design.

  19. Optimization and Optimal Control

    CERN Document Server

    Chinchuluun, Altannar; Enkhbat, Rentsen; Tseveendorj, Ider

    2010-01-01

    During the last four decades there has been a remarkable development in optimization and optimal control. Due to their wide variety of applications, many scientists and researchers have paid attention to the fields of optimization and optimal control. A huge number of new theoretical, algorithmic, and computational results have been observed in the last few years. This book gives the latest advances, and due to the rapid development of these fields, there are no other recent publications on the same topics. Key features: Provides a collection of selected contributions giving a state-of-the-art account.

  20. Quantifying, Visualizing, and Monitoring Lead Optimization.

    Science.gov (United States)

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to more effectively drive LO process, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability. PMID:26262898

  1. Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry

    Science.gov (United States)

    Gerard, Libby F.; Linn, Marcia C.

    2016-02-01

    Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.

  2. Automation: the competitive edge for HMOs and other alternative delivery systems.

    Science.gov (United States)

    Prussin, J A

    1987-12-01

    Until recently, many, if not most, Health Maintenance Organizations (HMOs) were not automated, and those that were tended to be automated only on a limited basis. Recently, however, the highly competitive marketplace within which HMOs and other Alternative Delivery Systems (ADSs) exist has required that they operate at maximum effectiveness and efficiency. Given the complex nature of ADSs, the volume of transactions they handle, the large number of members they serve, and the numerous providers who are paid at different rates and on different bases, it is impossible for an ADS to operate effectively or efficiently, let alone perform optimally, without a sophisticated, comprehensive automated system. Reliable automated systems designed specifically to address ADS functions such as enrollment and premium billing, finance and accounting, medical information and patient management, and marketing have recently become available at a reasonable cost. PMID:3451941

  3. Coordinating complex decision support activities across distributed applications

    Science.gov (United States)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  4. Evaluating the Relational Coordination instrument

    DEFF Research Database (Denmark)

    Edwards, Kasper; Lundstrøm, Sanne Lykke

    2014-01-01

    Relational coordination rests on the idea that coordination is a central issue in all work and that coordination happens through communication, which in turn is shaped by relations. Relational coordination is quite interesting because it has been shown to correlate with on-time flight departures and surgical performance. This has prompted the attention of both practitioners and politicians, some of whom perceive relational coordination as a means to attain better performance. The relational coordination instrument has been validated as a measure of teamwork from the following perspectives: internal consistency, interrater agreement and reliability, structural validity, and content validity. However, as relational coordination is being used as a diagnostic tool, it is important to examine further whether the instrument can measure changes. Indeed, we need to know how precise and sensitive the instrument is when …

  5. Parmodel: a web server for automated comparative modeling of proteins.

    Science.gov (United States)

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

    Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users perform modeling, assessment, visualization, and optimization of protein models, as well as to help crystallographers evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  6. Evaluation of Automated Volumetric Cartilage Quantification for Hip Preservation Surgery.

    Science.gov (United States)

    Ramme, Austin J; Guss, Michael S; Vira, Shaleen; Vigdorchik, Jonathan M; Newe, Axel; Raithel, Esther; Chang, Gregory

    2016-01-01

    Automating the process of femoroacetabular cartilage identification from magnetic resonance imaging (MRI) images has important implications to guiding clinical care by providing a temporal metric that allows for optimizing the timing for joint preservation surgery. In this paper, we evaluate a new automated cartilage segmentation method using a time trial, segmented volume comparison, overlap metrics, and Euclidean distance mapping. We report interrater overlap metrics using the true fast imaging with steady-state precession MRI sequence of 0.874, 0.546, and 0.704 for the total overlap, union overlap, and mean overlap, respectively. This method was 3.28× faster than manual segmentation. This technique provides clinicians with volumetric cartilage information that is useful for optimizing the timing for joint preservation procedures. PMID:26377376
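The three interrater agreement figures quoted in this record correspond to standard set-overlap definitions on binary segmentations: total (target) overlap, union overlap (Jaccard), and mean overlap (Dice). A minimal sketch, assuming the segmentations are stored as sets of voxel indices:

```python
def overlap_metrics(seg_a, seg_b):
    """Agreement between two segmentations given as sets of voxel indices.
    total overlap = |A ∩ B| / |B|         (sensitivity against rater B)
    union overlap = |A ∩ B| / |A ∪ B|     (Jaccard coefficient)
    mean overlap  = 2|A ∩ B| / (|A| + |B|) (Dice coefficient)"""
    inter = len(seg_a & seg_b)
    return {
        "total": inter / len(seg_b),
        "union": inter / len(seg_a | seg_b),
        "mean": 2.0 * inter / (len(seg_a) + len(seg_b)),
    }
```

Note that Dice is always at least as large as Jaccard on the same pair, which matches the ordering of the reported values (0.704 vs. 0.546).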

  7. Automated Model Generation for Hybrid Vehicles Optimization and Control

    Directory of Open Access Journals (Sweden)

    Verdonck N.

    2010-01-01

    Full Text Available Systematic optimization of modern powertrains, and hybrids in particular, requires the representation of the system by means of Backward Quasistatic Models (BQM). In contrast, the models used in realistic powertrain simulators are often of the Forward Dynamic Model (FDM) type. The paper presents a methodology to derive BQMs of modern powertrain components as parametric, steady-state limits of their FDM counterparts. The parametric nature of this procedure means that changing the system modeled does not require relaunching a simulation campaign, but only adjusting the corresponding parameters in the BQM. The approach is illustrated with examples concerning turbocharged engines, electric motors, and electrochemical batteries, and the influence of a change in parameters on the supervisory control of a hybrid vehicle is then studied offline, in co-simulation, and on a hardware-in-the-loop test bench adapted to hybrid vehicles (HyHiL).

  8. molSimplify: A toolkit for automating discovery in inorganic chemistry.

    Science.gov (United States)

    Ioannidis, Efthymios I; Gani, Terry Z H; Kulik, Heather J

    2016-08-15

    We present an automated, open source toolkit for the first-principles screening and discovery of new inorganic molecules and intermolecular complexes. Challenges remain in the automatic generation of candidate inorganic molecule structures due to the high variability in coordination and bonding, which we overcome through a divide-and-conquer tactic that flexibly combines force-field preoptimization of organic fragments with alignment to first-principles-trained metal-ligand distances. Exploration of chemical space is enabled through random generation of ligands and intermolecular complexes from large chemical databases. We validate the generated structures with the root mean squared (RMS) gradients evaluated from density functional theory (DFT), which are around 0.02 Ha/au across a large 150 molecule test set. Comparison of molSimplify results to full optimization with the universal force field reveals that RMS DFT gradients are improved by 40%. Seamless generation of input files, preparation and execution of electronic structure calculations, and post-processing for each generated structure aids interpretation of underlying chemical and energetic trends. © 2016 Wiley Periodicals, Inc. PMID:27364957

  9. Automated microinjection system for adherent cells

    Science.gov (United States)

    Youoku, Sachihiro; Suto, Yoshinori; Ando, Moritoshi; Ito, Akio

    2007-07-01

    We have developed an automated microinjection system that can handle more than 500 cells an hour. Microinjection introduces foreign agents directly into cells using a micro-capillary; it can introduce agents such as DNA, proteins, and drugs into various types of cells. However, conventional methods require a skilled operator and suffer from low throughput. The new automated microinjection techniques we have developed consist of a Petri dish height measuring method and a capillary apex position measuring method. The dish surface height is measured by analyzing images of cells that adhere to the dish surface: the contrast between the cell images is minimized when the focal plane of the objective lens coincides with the dish surface. We have developed an optimized focus searching method with a height accuracy of ±0.2 µm. The capillary apex position detection method consists of three steps: rough, middle, and precise. These steps are employed sequentially to cover capillary displacements of up to ±2 mm and to ultimately accomplish an alignment accuracy of less than one micron. Experimental results show that the system can introduce fluorescent material (Alexa488) into adherent cells (HEK293) with a success rate of 88.5%.
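The dish-height search described in this record, finding the focus position where the cell-image contrast is minimal, can be sketched as a ternary search over a unimodal contrast curve. The quadratic contrast model and the search bounds below are stand-in assumptions; in the real system each evaluation of `contrast` would acquire and analyze an image at that stage height.

```python
def find_focus(contrast, lo, hi, tol=0.1):
    """Ternary search for the height minimizing contrast(z); assumes the
    contrast curve is unimodal around the dish surface."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if contrast(m1) < contrast(m2):
            hi = m2   # the minimum lies in [lo, m2]
        else:
            lo = m1   # the minimum lies in [m1, hi]
    return (lo + hi) / 2.0

# Stand-in contrast curve, minimal at a hypothetical surface height of 125.0.
surface = find_focus(lambda z: (z - 125.0) ** 2 + 3.0, 0.0, 500.0, tol=0.1)
```

Because each step shrinks the interval by a factor of 2/3, the number of image acquisitions grows only logarithmically with the ratio of search range to required accuracy.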

  10. An automated procedure for evaluating song imitation.

    Directory of Open Access Journals (Sweden)

    Yael Mandelblat-Cerf

Songbirds have emerged as an excellent model system for understanding the neural basis of vocal and motor learning. Like humans, songbirds learn to imitate the vocalizations of their parents or other conspecific "tutors." Young songbirds learn by comparing their own vocalizations to the memory of their tutor's song, slowly improving until, over the course of several weeks, they achieve an excellent imitation of the tutor. Because of the slow progression of vocal learning and the large amount of singing generated, automated algorithms for quantifying vocal imitation have become increasingly important for studying the mechanisms underlying this process. However, quantifying song imitation is complicated by the highly variable songs of juvenile birds and of birds that learn poorly because of experimental manipulations. Here we present a method for evaluating song imitation that incorporates two innovations: first, an automated procedure for selecting pupil song segments, and second, a new algorithm, implemented in Matlab, for computing both acoustic and sequence similarity. We tested our procedure using zebra finch song and determined a set of acoustic features for which the algorithm optimally differentiates between similar and non-similar songs.
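The acoustic-similarity part of such a pipeline can be illustrated by extracting a feature vector from each song segment and comparing the vectors. This sketch uses normalized band energies and cosine similarity purely as a stand-in; the paper's Matlab algorithm uses its own feature set and scoring, and none of the names below come from it:

```python
import numpy as np

def segment_features(signal, n_bands=8):
    """Crude acoustic feature vector: normalized energy in equal-width
    spectral bands. A stand-in for richer features (pitch, spectral
    entropy, continuity) that a real song-similarity measure would use."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([band.sum() for band in bands])
    total = energy.sum()
    return energy / total if total > 0 else energy

def acoustic_similarity(pupil_segment, tutor_segment):
    """Cosine similarity between the feature vectors of two song segments."""
    a = segment_features(pupil_segment)
    b = segment_features(tutor_segment)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0
```

Identical segments score 1.0; segments whose energy falls in different spectral bands score near 0.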

  11. A cost-effective intelligent robotic system with dual-arm dexterous coordination and real-time vision

    Science.gov (United States)

    Marzwell, Neville I.; Chen, Alexander Y. K.

    1991-01-01

Dexterous coordination of manipulators based on redundant degrees of freedom, multiple sensors, and built-in robot intelligence represents a critical breakthrough in the development of advanced manufacturing technology. A cost-effective approach to this new generation of robotics has been made possible by the unprecedented growth of microcomputer and network systems. The resulting flexible automation offers the opportunity to improve product quality, increase the reliability of the manufacturing process, and augment production procedures to optimize the utilization of the robotic system. Moreover, the Advanced Robotic System (ARS) is modular in design and can be upgraded to closely follow technological advancements as they occur in various fields. This approach to manufacturing automation enhances the financial justification and ensures the long-term profitability and efficient implementation of robotic technology. The system also addresses a broad spectrum of manufacturing demands and has the potential to handle both complex jobs and highly labor-intensive tasks. The ARS prototype employs a decomposed optimization technique in spatial planning, implemented within the framework of a sensor-actuator network to establish a general-purpose geometric reasoning system. The development computer system is a multiple-microcomputer network that provides the architecture for executing modular network computing algorithms. The knowledge-based approach used in both the robot vision subsystem and the manipulation control subsystem provides real-time, vision-based image processing. The vision-based task environment analysis capability and the responsive motion capability are under the command of local intelligence centers. An array of ultrasonic, proximity, and optoelectronic sensors is used for path planning. The ARS currently has 18 degrees of freedom made up by two…

  12. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

Breakers are Electric Power System equipment whose reliability greatly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent given the systematic increase in the maintenance and repair costs of oil and air-break circuit breakers. The main direction for solving this problem is improving diagnostic control methods and organizing on-condition maintenance. This, however, demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, their failures, testing, and repairs, as well as advanced software and a dedicated automated information system (AIS). A new AIS, with the AISV logo, was developed at the "Reliability of Power Equipment" department of AzRDSI of Energy. The main features of AISV are:
· to ensure the security and accuracy of the database;
· to carry out systematic control of breakers' conformity with operating conditions;
· to estimate individual reliability values and the characteristics of their change for a given combination of characteristics;
· to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  13. A Method of Automated Nonparametric Content Analysis for Social Science

    OpenAIRE

    Hopkins, Daniel J.; King, Gary

    2010-01-01

The increasing availability of digitized text presents enormous opportunities for social scientists. Yet hand coding many blogs, speeches, government records, newspapers, or other sources of unstructured text is infeasible. Although computer scientists have methods for automated content analysis, most are optimized to classify individual documents, whereas social scientists instead want generalizations about the population of documents, such as the proportion in a given category. Unfortunately…

  14. ASteCA: Automated Stellar Cluster Analysis

    Science.gov (United States)

    Perren, G. I.; Vázquez, R. A.; Piatti, A. E.

    2015-04-01

We present the Automated Stellar Cluster Analysis package (ASteCA), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code makes use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function, and integrated color magnitude, as well as characterizing, through a statistical estimator, its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process, based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm, is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction, and distance values along with their uncertainties. To validate the code we applied it to a large set of over 400 synthetic MASSCLEAN clusters with varying degrees of field star contamination, as well as to a smaller set of 20 observed Milky Way open clusters (Berkeley 7, Bochum 11, Czernik 26, Czernik 30, Haffner 11, Haffner 19, NGC 133, NGC 2236, NGC 2264, NGC 2324, NGC 2421, NGC 2627, NGC 6231, NGC 6383, NGC 6705, Ruprecht 1, Tombaugh 1, Trumpler 1, Trumpler 5, and Trumpler 14) studied in the literature. The results show that ASteCA is able to recover cluster parameters with acceptable precision even for clusters affected by substantial field star contamination. ASteCA is written in Python and is made available as open source code which can be downloaded ready to use from its official site.
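The Bayesian membership idea mentioned in the abstract can be illustrated with a toy two-population model: the probability that a star belongs to the cluster is the cluster likelihood's share of the total likelihood. ASteCA's actual decontamination algorithm is considerably more involved; the Gaussian populations, parameter names, and numbers below are purely illustrative assumptions:

```python
import numpy as np

def membership_probability(star, cluster_mu, field_mu,
                           cluster_sigma=0.5, field_sigma=2.0,
                           prior_cluster=0.5):
    """Toy Bayesian membership probability from photometric data alone.

    Models the cluster and field populations as isotropic 2-D Gaussians
    in color-magnitude space and applies Bayes' rule:
    P(member | star) = P(star | cluster) P(cluster) / P(star).
    """
    def gaussian_density(x, mu, sigma):
        d = np.asarray(x, dtype=float) - np.asarray(mu, dtype=float)
        return np.exp(-0.5 * (d @ d) / sigma**2) / (2.0 * np.pi * sigma**2)

    weight_cluster = prior_cluster * gaussian_density(star, cluster_mu, cluster_sigma)
    weight_field = (1.0 - prior_cluster) * gaussian_density(star, field_mu, field_sigma)
    return float(weight_cluster / (weight_cluster + weight_field))
```

A star photometrically close to the cluster locus gets a membership probability near 1; one sitting on the field locus gets a low probability.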

  15. [Coordination among healthcare levels: systematization of tools and measures].

    Science.gov (United States)

    Terraza Núñez, Rebeca; Vargas Lorenzo, Ingrid; Vázquez Navarrete, María Luisa

    2006-01-01

Improving healthcare coordination is a priority in many healthcare systems, particularly for chronic health problems in which numerous professionals and services intervene. There is an abundance of coordination strategies and mechanisms, and these should be systematized so that they can be used in the most appropriate contexts. The present article analyses healthcare coordination and its instruments using organisational theory. Coordination mechanisms can be classified according to the two basic processes used to coordinate activities: programming and feedback. The optimal combination of mechanisms depends on three factors: the degree to which healthcare activities are differentiated, the volume and type of interdependencies, and the level of uncertainty. Historically, healthcare services have based coordination on the standardization of skills and, most recently, on the standardization of processes, through clinical guidelines, maps, and plans. Their utilisation is unsatisfactory in chronic diseases involving intervention by several professionals with reciprocal interdependencies, variability in patients' responses to medical interventions, and a large volume of information to be processed. In this case, mechanisms based on feedback, such as working groups, linking professionals, and vertical information systems, are more effective. To date, evaluation of healthcare coordination has not been conducted systematically using structure, process, and outcome indicators. The different strategies and instruments have been applied mainly to long-term care and mental health, and one of the challenges for healthcare coordination is to extend and evaluate their use throughout the healthcare continuum.

  16. The market future of automated price parsing systems for the electric power

    Directory of Open Access Journals (Sweden)

    Elena Zhuravleva

    2013-03-01

Application of automated control systems provides better control of energy resource consumption, improves accountability, and optimizes energy costs. There is a need for an automated system that unifies all competing electric power service providers and creates an environment for monitoring their services on the basis of parsing. In such an integrated system, data collection is carried out on a uniform electronic platform (environment) based on the indicator "electric power service price".

  17. Challenges in Gaining Large Scale Carbon Reductions through Wireless Home Automation Systems

    OpenAIRE

    Larsen, Peter Gorm; Rovsing, Poul Ejnar; Toftegaard, Thomas Skjødeberg

    2010-01-01

Buildings account for more than 35% of the energy consumption in Europe. Therefore, a step towards a more sustainable lifestyle is to use home automation to optimize energy consumption "automatically". This paper reports on the usage, and some of the remaining challenges, of especially wireless but also powerline communication in a home automation setting. For many years, home automation has been visible to many but accessible to only a few, because of inadequate integration of systems. A vast …

  18. Coordination using Implicit Communication

    CERN Document Server

    Cuff, Paul

    2011-01-01

    We explore a basic noise-free signaling scenario where coordination and communication are naturally merged. A random signal X_1,...,X_n is processed to produce a control signal or action sequence A_1,...,A_n, which is observed and further processed (without access to X_1,...,X_n) to produce a third sequence B_1,...,B_n. The object of interest is the set of empirical joint distributions p(x,a,b) that can be achieved in this setting. We show that H(A) >= I(X;A,B) is the necessary and sufficient condition for achieving p(x,a,b) when no causality constraints are enforced on the encoders. We also give results for various causality constraints. This setting sheds light on the embedding of digital information in analog signals, a concept that is exploited in digital watermarking, steganography, cooperative communication, and strategic play in team games such as bridge.
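The condition H(A) >= I(X;A,B) can be evaluated numerically for any candidate joint distribution p(x,a,b). The sketch below computes both sides from an empirical joint array; the indexing convention `p_xab[x, a, b]` is an assumption of this illustration, and the function only checks the inequality, it does not prove achievability:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def coordination_condition(p_xab):
    """Evaluate H(A) and I(X; A,B) for a joint distribution p(x,a,b).

    Returns (H(A), I(X;A,B), condition_holds), where condition_holds
    tests H(A) >= I(X;A,B) up to floating-point tolerance.
    """
    p_a = p_xab.sum(axis=(0, 2))    # marginal of A
    p_x = p_xab.sum(axis=(1, 2))    # marginal of X
    p_ab = p_xab.sum(axis=0)        # joint of (A, B)
    h_a = entropy_bits(p_a)
    # I(X; A,B) = H(X) + H(A,B) - H(X,A,B)
    i_x_ab = entropy_bits(p_x) + entropy_bits(p_ab) - entropy_bits(p_xab)
    return h_a, i_x_ab, h_a >= i_x_ab - 1e-12
```

For example, with X uniform binary, A = X, and B = A, both sides equal 1 bit and the condition holds with equality.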

  19. Network Coordinator Report

    Science.gov (United States)

    Himwich, Ed; Strand, Richard

    2013-01-01

    This report includes an assessment of the network performance in terms of lost observing time for the 2012 calendar year. Overall, the observing time loss was about 12.3%, which is in-line with previous years. A table of relative incidence of problems with various subsystems is presented. The most significant identified causes of loss were electronics rack problems (accounting for about 21.8% of losses), antenna reliability (18.1%), RFI (11.8%), and receiver problems (11.7%). About 14.2% of the losses occurred for unknown reasons. New antennas are under development in the USA, Germany, and Spain. There are plans for new telescopes in Norway and Sweden. Other activities of the Network Coordinator are summarized.

  20. Multi-net optimization of VLSI interconnect

    CERN Document Server

    Moiseev, Konstantin; Wimer, Shmuel

    2015-01-01

    This book covers layout design and layout migration methodologies for optimizing multi-net wire structures in advanced VLSI interconnects. Scaling-dependent models for interconnect power, interconnect delay and crosstalk noise are covered in depth, and several design optimization problems are addressed, such as minimization of interconnect power under delay constraints, or design for minimal delay in wire bundles within a given routing area. A handy reference or a guide for design methodologies and layout automation techniques, this book provides a foundation for physical design challenges of interconnect in advanced integrated circuits.  • Describes the evolution of interconnect scaling and provides new techniques for layout migration and optimization, focusing on multi-net optimization; • Presents research results that provide a level of design optimization which does not exist in commercially-available design automation software tools; • Includes mathematical properties and conditions for optimal...