WorldWideScience

Sample records for automated optimal coordination

  1. Nonparametric variational optimization of reaction coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Banushkina, Polina V.; Krivov, Sergei V., E-mail: s.krivov@leeds.ac.uk [Astbury Center for Structural Molecular Biology, Faculty of Biological Sciences, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2015-11-14

State-of-the-art simulations of complex atomic processes commonly produce trajectories of large size, making the development of automated analysis tools very important. A popular approach to extracting dynamical information consists of projecting these trajectories onto optimally selected reaction coordinates or collective variables. For equilibrium dynamics between any two boundary states, the committor function, also known as the folding probability in protein folding studies, is often considered the optimal coordinate. To determine it, one selects a functional form with many parameters and trains it on the trajectories using various criteria. A major problem with such an approach is that a poor initial choice of the functional form may lead to sub-optimal results. Here, we describe an approach that allows one to optimize the reaction coordinate without selecting its functional form, thus avoiding this source of error.
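The committor this abstract refers to has a simple empirical counterpart that can be read directly off an equilibrium trajectory: for each frame, record whether the trajectory next reaches boundary state B or boundary state A. The sketch below is a toy illustration of that definition, not the authors' variational method; the boundary tests and the 1D random-walk data are invented for demonstration.

```python
import numpy as np

def empirical_committor(traj, in_A, in_B):
    """q[i] = 1.0 if the trajectory reaches B before A after frame i,
    0.0 if it reaches A first, NaN if it reaches neither before the end."""
    q = np.full(len(traj), np.nan)
    fate = np.nan
    for i in range(len(traj) - 1, -1, -1):  # scan backwards, propagating the next boundary hit
        if in_B(traj[i]):
            fate = 1.0
        elif in_A(traj[i]):
            fate = 0.0
        q[i] = fate
    return q

# Toy 1D random walk between boundary states A (x < -1) and B (x > 1)
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0.0, 0.1, 20000))
q = empirical_committor(x, lambda v: v < -1.0, lambda v: v > 1.0)
```

Binning q against any trial coordinate then shows how close that coordinate is to the committor.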

  2. Distributed optimization for systems design : an augmented Lagrangian coordination method

    NARCIS (Netherlands)

    Tosserams, S.

    2008-01-01

This thesis presents a coordination method for the distributed design optimization of engineering systems. The design of advanced engineering systems such as aircraft, automated distribution centers, and microelectromechanical systems (MEMS) involves multiple components that together realize the

  3. Optimal Control of Connected and Automated Vehicles at Roundabouts

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Liuhui [University of Delaware; Malikopoulos, Andreas [ORNL; Rios-Torres, Jackeline [ORNL

    2018-01-01

Connectivity and automation in vehicles provide the most intriguing opportunity for enabling users to better monitor transportation network conditions and make better operating decisions that improve safety and reduce pollution, energy consumption, and travel delays. This study investigates the implications of optimally coordinating vehicles that are wirelessly connected to each other and to the infrastructure at roundabouts to achieve smooth traffic flow without stop-and-go driving. We apply an optimization framework and an analytical solution that allows optimal coordination of vehicles for merging in such traffic scenarios. The effectiveness of the proposed approach is validated through simulation; it is shown that coordination of vehicles can reduce total travel time by 3%-49% and fuel consumption by 2%-27%, depending on the traffic level. In addition, network throughput is improved by up to 25% due to the elimination of stop-and-go driving behavior.
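As a rough illustration of the kind of analytical solution such coordination frameworks use (the cited study's details differ), minimizing the control effort (1/2)∫u² dt for a double-integrator vehicle model gives an acceleration that is linear in time and hence a cubic position profile; the four coefficients follow from a linear system in the boundary conditions. All numbers below are invented.

```python
import numpy as np

def energy_optimal_profile(x0, v0, xT, vT, T):
    """Minimizing (1/2) * integral of u(t)^2 for x'' = u yields u(t) = a*t + b,
    so x(t) = a*t^3/6 + b*t^2/2 + c*t + d.  Solve (a, b, c, d) from the
    boundary conditions on position and speed at t = 0 and t = T."""
    M = np.array([
        [0.0,     0.0,     0.0, 1.0],  # x(0) = d
        [0.0,     0.0,     1.0, 0.0],  # v(0) = c
        [T**3/6,  T**2/2,  T,   1.0],  # x(T)
        [T**2/2,  T,       1.0, 0.0],  # v(T)
    ])
    return np.linalg.solve(M, np.array([x0, v0, xT, vT]))

# Invented merging scenario: travel 120 m in 10 s, entering at 10 m/s, exiting at 15 m/s
a, b, c, d = energy_optimal_profile(0.0, 10.0, 120.0, 15.0, 10.0)
```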

  4. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors by enhancing operator and system performance. However, the excessive introduction of automation can deteriorate operator performance due to its side effects, referred to as the Out-of-the-Loop (OOTL) problem, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method for optimizing automation is proposed in this paper. To derive appropriate automation levels, the automation rate and ostracism rate, which quantitatively capture the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of Situation Awareness Recovery (SAR), under the premise that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  5. Optimal Coordination of Automatic Line Switches for Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jyh-Cherng Gu

    2012-04-01

For the Taiwan Power Company (Taipower), the coordination time margins between the lateral circuit breakers (LCB) of underground 4-way automatic line switches and the protection equipment of high voltage customers are often too small. This can lead to sympathetic tripping by the feeder circuit breaker (FCB) of the distribution feeder and create difficulties in protection coordination between upstream and downstream protection equipment, in fault identification, and in restoration operations. To solve the problem, it is necessary to reexamine the protection coordination between LCBs and high voltage customers' protection equipment, and between LCBs and FCBs, in order to bring forth new proposals for settings and operations. This paper applies linear programming to optimize the coordination of protection devices, and proposes new time-current curves (TCCs) for the overcurrent (CO) and low-energy overcurrent (LCO) relays used in normally open distribution systems by performing simulations in the Electrical Transient Analyzer Program (ETAP) environment. The simulation results show that the new TCCs solve the coordination problems among high voltage customer, lateral, feeder, bus-interconnection, and distribution transformer protection. The new proposals also satisfy Taipower's requirements for protection coordination of the distribution feeder automation system (DFAS). Finally, the authors believe that the system configuration, operating experience, and relevant criteria mentioned in this paper may serve as valuable references for other companies or utilities when building a DFAS of their own.
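Because an inverse-time relay characteristic is linear in the time dial setting (TDS) once the fault current is fixed, coordination can indeed be posed as a linear program, as the abstract describes. A minimal sketch with scipy, assuming a single primary/backup relay pair and invented characteristic constants:

```python
from scipy.optimize import linprog

# Two relays: R1 (primary) backed up by R2.  For a fixed fault current the
# inverse-time characteristic t = TDS * K(I) is linear in TDS, so the
# coordination problem is a linear program in the TDS values.
K1, K2 = 2.0, 2.5    # characteristic constants at the fault current (invented)
CTI = 0.3            # coordination time interval in seconds (typical value)

# Minimize total operating time t1 + t2 = K1*TDS1 + K2*TDS2
c = [K1, K2]
# Coordination: K2*TDS2 - K1*TDS1 >= CTI  ->  K1*TDS1 - K2*TDS2 <= -CTI
A_ub = [[K1, -K2]]
b_ub = [-CTI]
bounds = [(0.05, 1.0), (0.05, 1.0)]  # typical TDS range

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
tds1, tds2 = res.x
```

The primary relay ends up at its minimum TDS, and the backup is pushed just high enough to respect the coordination margin.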

  6. Optimization of strong and weak coordinates

    NARCIS (Netherlands)

    Swart, M.; Bickelhaupt, F.M.

    2006-01-01

    We present a new scheme for the geometry optimization of equilibrium and transition state structures that can be used for both strong and weak coordinates. We use a screening function that depends on atom-pair distances to differentiate strong coordinates from weak coordinates. This differentiation

  7. PARAMETER COORDINATION AND ROBUST OPTIMIZATION FOR MULTIDISCIPLINARY DESIGN

    Institute of Scientific and Technical Information of China (English)

    HU Jie; PENG Yinghong; XIONG Guangleng

    2006-01-01

A new parameter coordination and robust optimization approach for multidisciplinary design is presented. Firstly, a constraints network model is established to support engineering change, coordination, and optimization. In this model, interval boxes are adopted to describe the uncertainty of design parameters quantitatively to enhance design robustness. Secondly, a parameter coordination method is presented to solve the constraints network model, monitor potential conflicts due to engineering changes, and obtain the consistent solution space corresponding to the given product specifications. Finally, the robust parameter optimization model is established, and a genetic algorithm is used to obtain the robust optimal parameters. An example of bogie design is analyzed to show that the scheme is effective.

  8. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Xiong, D

    2001-01-01

Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating the transportation network databases needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction

  9. Coordinating decentralized optimization of truck and shovel mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, R.; Fraser Forbes, J. [Alberta Univ., Edmonton, AB (Canada). Dept. of Chemical and Materials Engineering; San Yip, W. [Suncor Energy, Fort McMurray, AB (Canada)

    2006-07-01

Canada's oil sands contain the largest known reserve of oil in the world. Oil sands mining involves three functional processes: ore hauling, overburden removal, and mechanical maintenance. The industry relies mainly on truck-and-shovel technology in its open-pit mining operations, which contributes greatly to the overall mining operation cost. Coordination between operating units is crucial for achieving enterprise-wide optimal operation. Some of the challenges facing the industry include multiple or conflicting objectives, such as minimizing the use of raw materials and energy while maximizing production. The large sets of constraints that define the feasible domain pose a challenge, as does the uncertainty in system parameters. One solution lies in assigning truck resources to various activities. This fully decentralized approach would treat the optimization of ore production, waste removal, and equipment maintenance independently. It was emphasized that mine-wide optimal operation can only be achieved by coordinating the ore hauling and overburden removal processes. For that reason, this presentation proposed a coordination approach for a decentralized optimization system. The approach is based on the Dantzig-Wolfe decomposition and auction-based methods that have previously been used to decompose large-scale optimization problems. The treatment of discrete variables and the coordinator design were described, and the method was illustrated with a simple truck-and-shovel mining simulation study. The approach can be applied to a wide range of applications such as coordinating decentralized optimal control systems and scheduling. 16 refs., 3 tabs., 2 figs.
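The auction- or price-based coordination the presentation mentions can be sketched in miniature: a coordinator prices a shared truck fleet, each unit independently requests trucks while its marginal value exceeds the price, and the price is updated by a subgradient step until the requests fit the fleet. The marginal values and fleet size below are invented; this is a toy stand-in for Dantzig-Wolfe or auction-based decomposition, not the authors' method.

```python
def unit_request(marginal_values, price, fleet_size):
    # A unit keeps taking trucks while the marginal value of one more truck
    # exceeds the coordinator's price.
    return sum(1 for v in marginal_values[:fleet_size] if v > price)

values_ore = [9, 8, 7, 6, 5, 4, 3, 2]  # invented diminishing marginal values (ore hauling)
values_ob = [8, 6, 5, 4, 3, 2, 1, 1]   # invented marginal values (overburden removal)
N = 8                                  # shared truck fleet size (invented)

price, step = 0.0, 0.5
for _ in range(200):
    total = unit_request(values_ore, price, N) + unit_request(values_ob, price, N)
    price = max(0.0, price + step * (total - N))  # subgradient price update
```

The price settles where the two units' independent requests exactly fill the fleet, which is the coordination effect the presentation describes at a larger scale.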

  10. Variationally optimal selection of slow coordinates and reaction coordinates in macromolecular systems

    Science.gov (United States)

    Noe, Frank

To efficiently simulate and generate understanding from simulations of complex macromolecular systems, the concept of slow collective coordinates or reaction coordinates is of fundamental importance. Here we will introduce variational approaches to approximate the slow coordinates and the reaction coordinates between selected end-states, given MD simulations of the macromolecular system and a (possibly large) basis set of candidate coordinates. We will then discuss how to select physically intuitive order parameters that are good surrogates of this variationally optimal result. These results can be used to construct Markov state models or other models of the stationary and kinetic properties, and to parametrize low-dimensional / coarse-grained models of the dynamics. Deutsche Forschungsgemeinschaft, European Research Council.

  11. Self-optimizing approach for automated laser resonator alignment

    Science.gov (United States)

    Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.

    2012-02-01

Nowadays, the assembly of laser systems is dominated by manual operations involving elaborate alignment by means of adjustable mountings. From a competitive standpoint, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition originating mainly in Asia. From an economic point of view, automated assembly of laser systems offers a better approach for producing more reliable units at lower cost. However, the step from today's manual solutions towards automated assembly requires parallel developments in product design, automation equipment, and assembly processes. This paper briefly introduces the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for robot-based precision assembly, as well as passive and active alignment methods based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of laser resonator assembly. These results as well as future development perspectives are discussed.

  12. Optimal coordination and control of posture and movements.

    Science.gov (United States)

    Johansson, Rolf; Fransson, Per-Anders; Magnusson, Måns

    2009-01-01

    This paper presents a theoretical model of stability and coordination of posture and locomotion, together with algorithms for continuous-time quadratic optimization of motion control. Explicit solutions to the Hamilton-Jacobi equation for optimal control of rigid-body motion are obtained by solving an algebraic matrix equation. The stability is investigated with Lyapunov function theory and it is shown that global asymptotic stability holds. It is also shown how optimal control and adaptive control may act in concert in the case of unknown or uncertain system parameters. The solution describes motion strategies of minimum effort and variance. The proposed optimal control is formulated to be suitable as a posture and movement model for experimental validation and verification. The combination of adaptive and optimal control makes this algorithm a candidate for coordination and control of functional neuromuscular stimulation as well as of prostheses. Validation examples with experimental data are provided.
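For linear rigid-body dynamics with a quadratic cost, the "algebraic matrix equation" mentioned here is the continuous algebraic Riccati equation. A minimal sketch, using a double integrator as an assumed stand-in for one degree of freedom of posture (the weights are invented):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator stand-in for one degree of freedom of posture:
# state [deviation, velocity], control u = normalized corrective torque.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])  # penalize posture deviation and sway velocity
R = np.array([[0.1]])     # penalize effort

P = solve_continuous_are(A, B, Q, R)  # the algebraic matrix (Riccati) equation
K = np.linalg.inv(R) @ B.T @ P        # optimal state feedback u = -K x
closed_loop_eigs = np.linalg.eigvals(A - B @ K)  # Hurwitz => asymptotic stability
```

The closed-loop eigenvalues all lie in the left half-plane, which is the global asymptotic stability the abstract establishes with Lyapunov arguments.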

  13. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  14. Optimizing a Drone Network to Deliver Automated External Defibrillators.

    Science.gov (United States)

    Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y

    2017-06-20

    Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest, but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an out-of-hospital cardiac arrest for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. We applied our model to 53 702 out-of-hospital cardiac arrests that occurred in the 8 regions of the Toronto Regional RescuNET between January 1, 2006, and December 31, 2014. Our primary analysis quantified the drone network size required to deliver an AED 1, 2, or 3 minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as a large coordinated region. The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by 3 minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an out-of-hospital cardiac arrest event. © 2017 American Heart Association, Inc.
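The paper's network design combines integer programming and queuing; as a much cruder, hypothetical stand-in, base placement under a pure coverage objective can be sketched as greedy set cover (all coordinates below are invented):

```python
def greedy_bases(sites, candidates, radius):
    """Repeatedly pick the candidate base covering the most still-uncovered
    sites within `radius` (a crude stand-in for the paper's integer program)."""
    def covers(b, s):
        return (b[0] - s[0]) ** 2 + (b[1] - s[1]) ** 2 <= radius ** 2
    uncovered = set(range(len(sites)))
    chosen = []
    while uncovered:
        best = max(candidates, key=lambda b: sum(covers(b, sites[i]) for i in uncovered))
        newly = {i for i in uncovered if covers(best, sites[i])}
        if not newly:
            break  # remaining sites unreachable from any candidate
        chosen.append(best)
        uncovered -= newly
    return chosen, uncovered

sites = [(0.0, 0.0), (0.2, 0.8), (9.5, 10.0)]   # toy historical arrest locations
cands = [(0.0, 0.5), (10.0, 10.0), (5.0, 5.0)]  # toy candidate base sites
bases, unreached = greedy_bases(sites, cands, 1.5)
```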

  15. Application of Advanced Particle Swarm Optimization Techniques to Wind-thermal Coordination

    DEFF Research Database (Denmark)

    Singh, Sri Niwas; Østergaard, Jacob; Yadagiri, J.

    2009-01-01

A wind-thermal coordination algorithm is necessary to determine the optimal proportion of wind and thermal generator capacity that can be integrated into the system. In this paper, four versions of Particle Swarm Optimization (PSO) techniques are proposed for solving the wind-thermal coordination problem...

  16. A New Hybrid Nelder-Mead Particle Swarm Optimization for Coordination Optimization of Directional Overcurrent Relays

    Directory of Open Access Journals (Sweden)

    An Liu

    2012-01-01

Coordination optimization of directional overcurrent relays (DOCRs) is an important part of an efficient distribution system. This optimization problem involves obtaining the time dial setting (TDS) and pickup current (Ip) values of each DOCR. The optimal results should have the shortest primary relay operating time for all fault lines. Recently, the particle swarm optimization (PSO) algorithm has been considered an effective tool for linear/nonlinear optimization problems with application in the protection and coordination of power systems. With a limited runtime, conventional PSO takes the best solution found as the final solution, and early convergence decreases overall performance and increases the risk of mistaking local optima for global optima. Therefore, this study proposes a new hybrid of the Nelder-Mead simplex search method and particle swarm optimization (the proposed NM-PSO algorithm) to solve the DOCR coordination optimization problem. PSO is the main optimizer, and the Nelder-Mead simplex search method is used to improve the efficiency of PSO due to its potential for rapid convergence. To validate the proposal, this study compared the performance of the proposed algorithm with that of PSO and the original NM-PSO. The findings demonstrate the outstanding performance of the proposed NM-PSO in terms of computation speed, rate of convergence, and feasibility.
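A generic hybrid in the spirit of the paper, though not its exact algorithm, can be sketched by running standard PSO and polishing the incumbent global best with a Nelder-Mead simplex search each iteration; the swarm hyperparameters and the quadratic test function are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def nm_pso(f, bounds, n_particles=20, iters=60, seed=1):
    """Standard PSO with a Nelder-Mead polish of the incumbent global best
    each iteration; a generic hybrid, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    p, pf = x.copy(), np.apply_along_axis(f, 1, x)
    g, gf = p[pf.argmin()].copy(), pf.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, len(lo)))
        v = 0.7 * v + 1.5 * r1 * (p - x) + 1.5 * r2 * (g - x)  # inertia, cognitive, social
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pf
        p[better], pf[better] = x[better], fx[better]
        if pf.min() < gf:
            g, gf = p[pf.argmin()].copy(), pf.min()
        res = minimize(f, g, method="Nelder-Mead", options={"maxiter": 50})  # simplex polish
        if res.fun < gf:
            g = np.clip(res.x, lo, hi)
            gf = f(g)
    return g, gf

best, best_f = nm_pso(lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2,
                      [(-5.0, 5.0), (-5.0, 5.0)])
```

The simplex step supplies the rapid local convergence that plain PSO lacks late in a run.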

  17. Vibrational self-consistent field theory using optimized curvilinear coordinates.

    Science.gov (United States)

    Bulik, Ireneusz W; Frisch, Michael J; Vaccaro, Patrick H

    2017-07-28

A vibrational SCF model is presented in which the single-mode functions in the product wavefunction are expressed in terms of internal coordinates, and the coordinates used for each mode are optimized variationally. This model involves no approximations to the kinetic energy operator and does not require a Taylor-series expansion of the potential. The non-linear optimization of coordinates is found to give much better product wavefunctions than the limited variations considered in most previous applications of SCF methods to vibrational problems. The approach is tested using published potential energy surfaces for water, ammonia, and formaldehyde. The variational flexibility allowed in the current ansätze results in excellent zero-point energies expressed through single-product states and accurate fundamental transition frequencies realized by short configuration-interaction expansions. Fully variational optimization of single-product states for excited vibrational levels is also discussed. The highlighted methodology constitutes an excellent starting point for more sophisticated treatments, as the bulk characteristics of many-mode coupling are accounted for efficiently in terms of compact wavefunctions (as evident from the accurate prediction of transition frequencies).

  18. Two-phase strategy of controlling motor coordination determined by task performance optimality.

    Science.gov (United States)

    Shimansky, Yury P; Rand, Miya K

    2013-02-01

    A quantitative model of optimal coordination between hand transport and grip aperture has been derived in our previous studies of reach-to-grasp movements without utilizing explicit knowledge of the optimality criterion or motor plant dynamics. The model's utility for experimental data analysis has been demonstrated. Here we show how to generalize this model for a broad class of reaching-type, goal-directed movements. The model allows for measuring the variability of motor coordination and studying its dependence on movement phase. The experimentally found characteristics of that dependence imply that execution noise is low and does not affect motor coordination significantly. From those characteristics it is inferred that the cost of neural computations required for information acquisition and processing is included in the criterion of task performance optimality as a function of precision demand for state estimation and decision making. The precision demand is an additional optimized control variable that regulates the amount of neurocomputational resources activated dynamically. It is shown that an optimal control strategy in this case comprises two different phases. During the initial phase, the cost of neural computations is significantly reduced at the expense of reducing the demand for their precision, which results in speed-accuracy tradeoff violation and significant inter-trial variability of motor coordination. During the final phase, neural computations and thus motor coordination are considerably more precise to reduce the cost of errors in making a contact with the target object. The generality of the optimal coordination model and the two-phase control strategy is illustrated on several diverse examples.

  19. Optimal Coordination of Directional Overcurrent Relays Using PSO-TVAC Considering Series Compensation

    Directory of Open Access Journals (Sweden)

    Nabil Mancer

    2015-01-01

The integration of series compensation, such as a Series Compensator (SC), into the transmission line makes the coordination of directional overcurrent relays in a practical power system both important and complex. This article presents an efficient variant of the Particle Swarm Optimization (PSO) algorithm based on Time-Varying Acceleration Coefficients (PSO-TVAC) for the optimal coordination of directional overcurrent relays (DOCRs) considering the integration of series compensation. Simulation results are compared with those of other methods to confirm the efficiency of the proposed PSO variant in solving the optimal coordination of directional overcurrent relays in the presence of series compensation.
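The time-varying acceleration coefficients behind PSO-TVAC follow a simple linear schedule: the cognitive coefficient decays while the social one grows over the run, shifting the swarm from exploration to exploitation. A sketch with commonly used endpoint values (the exact values in the paper may differ):

```python
def tvac(it, max_it, c1_start=2.5, c1_end=0.5, c2_start=0.5, c2_end=2.5):
    """Linearly interpolate the cognitive (c1) and social (c2) acceleration
    coefficients over the run; endpoint values are typical, not the paper's."""
    frac = it / max_it
    c1 = (c1_end - c1_start) * frac + c1_start
    c2 = (c2_end - c2_start) * frac + c2_start
    return c1, c2
```

Each PSO iteration then uses the scheduled c1, c2 in the usual velocity update v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x).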

  20. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

We propose an optimal electric energy management scheme for a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute the computational burden and yet make optimal 24-hour energy management of multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operating conditions and illustrated with a physical interpretation of how to achieve optimal energy management in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by ancillary internal trading between the microgrids, which reduces the extra cost of unnecessary external trading by adjusting the electric energy production of combined heat and power (CHP) generators and the amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.

  1. Predictive Analytics for Coordinated Optimization in Distribution Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-13

This talk will present NREL's work on developing predictive analytics that enable the optimal coordination of all available resources in distribution systems to achieve the control objectives of system operators. Two projects will be presented: one focuses on developing short-term state-forecasting-based optimal voltage regulation in distribution systems; the other focuses on actively engaging electricity consumers to benefit distribution system operations.

  2. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  3. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy that minimizes the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy clearly saves procurement costs; that the strategy appears more cooperative when the quality requirement is less strict; and that the coordination costs have a strong effect on the coordinated procurement strategy.
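A generic simulated annealing loop of the kind the abstract describes accepts cost-increasing moves with probability exp(-delta/T) under a geometrically cooled temperature. The cost function and neighborhood below are invented one-dimensional stand-ins, not the paper's procurement model:

```python
import math
import random

def anneal(cost, neighbor, x0, T0=1.0, alpha=0.995, steps=4000, seed=7):
    """Accept worse moves with probability exp(-delta/T); cool geometrically."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    T = T0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy < c or rng.random() < math.exp(-(cy - c) / T):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        T *= alpha
    return best, best_c

# Toy stand-in: integer order quantity with an assumed convex cost
best, best_c = anneal(lambda q: (q - 37) ** 2,
                      lambda q, r: q + r.choice([-1, 1]), 20)
```

In the paper the state would encode supplier assignments and order allocations; only the acceptance rule and cooling schedule carry over from this sketch.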

  4. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    Science.gov (United States)

    Axdahl, Erik L.

    2015-01-01

Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  5. Protein Folding Free Energy Landscape along the Committor - the Optimal Folding Coordinate.

    Science.gov (United States)

    Krivov, Sergei V

    2018-06-06

Recent advances in simulation and experiment have led to dramatic increases in the quantity and complexity of produced data, which makes the development of automated analysis tools very important. A powerful approach to analyzing the dynamics contained in such data sets is to describe/approximate it by diffusion on a free energy landscape: free energy as a function of reaction coordinates (RCs). For the description to be quantitatively accurate, RCs should be chosen in an optimal way. Recent theoretical results show that such an optimal RC exists; however, determining it for practical systems is a very difficult unsolved problem. Here we describe a solution to this problem. We describe an adaptive nonparametric approach to accurately determine the optimal RC (the committor) for an equilibrium trajectory of a realistic system. In contrast to alternative approaches, which require a functional form with many parameters to approximate an RC and thus extensive expertise with the system, the suggested approach is nonparametric and can approximate any RC with high accuracy without system-specific information. To avoid overfitting for a realistically sampled system, the approach performs RC optimization in an adaptive manner by focusing optimization on less optimized spatiotemporal regions of the RC. The power of the approach is illustrated on a long equilibrium atomistic folding simulation of the HP35 protein. We have determined the optimal folding RC, the committor, which was confirmed by passing a stringent committor validation test. It allowed us to determine a first quantitatively accurate protein folding free energy landscape. We have confirmed the recent theoretical results that diffusion on such a free energy profile can be used to compute exactly the equilibrium flux, the mean first passage times, and the mean transition path times between any two points on the profile. We have shown that the mean squared displacement along the optimal RC grows linearly with time as for

  6. Automation and Optimization of Multipulse Laser Zona Drilling of Mouse Embryos During Embryo Biopsy.

    Science.gov (United States)

    Wong, Christopher Yee; Mills, James K

    2017-03-01

Laser zona drilling (LZD) is a required step in many embryonic surgical procedures, for example, assisted hatching and preimplantation genetic diagnosis. LZD involves the ablation of the zona pellucida (ZP) using a laser while minimizing potentially harmful thermal effects on critical internal cell structures. The objective of this work was to develop a method for the automation and optimization of multipulse LZD, applied to cleavage-stage embryos. A two-stage optimization is used. The first stage uses computer vision algorithms to identify embryonic structures and determines the optimal ablation zone farthest away from critical structures such as blastomeres. The second stage combines a genetic algorithm with a previously reported thermal analysis of LZD to optimize the combination of laser pulse locations and pulse durations. The goal is to minimize the peak temperature experienced by the blastomeres while creating the desired opening in the ZP. A proof of concept of the proposed LZD automation and optimization method is demonstrated through experiments on mouse embryos, with positive results, as adequately sized openings are created. Automation of LZD is feasible and is a viable step toward the automation of embryo biopsy procedures. LZD is a common but delicate procedure performed by human operators using subjective methods to gauge proper LZD procedure. Automation of LZD removes human error and increases the success rate of LZD. Although the proposed methods are developed for cleavage-stage embryos, the same methods may be applied to most types of LZD procedures, embryos at different developmental stages, or nonembryonic cells.

  7. Distributed optimal coordination for distributed energy resources in power systems

    DEFF Research Database (Denmark)

    Wu, Di; Yang, Tao; Stoorvogel, A.

    2017-01-01

    Driven by smart grid technologies, distributed energy resources (DERs) have been rapidly developing in recent years for improving reliability and efficiency of distribution systems. Emerging DERs require effective and efficient coordination in order to reap their potential benefits. In this paper, we consider an optimal DER coordination problem over multiple time periods subject to constraints at both system and device levels. Fully distributed algorithms are proposed to dynamically and automatically coordinate distributed generators with multiple/single storages. With the proposed algorithms...

  8. Hybrid optimal online-overnight charging coordination of plug-in electric vehicles in smart grid

    Science.gov (United States)

    Masoum, Mohammad A. S.; Nabavi, Seyed M. H.

    2016-10-01

    Optimal coordinated charging of plugged-in electric vehicles (PEVs) in smart grid (SG) can be beneficial for both consumers and utilities. This paper proposes a hybrid optimal online followed by overnight charging coordination of high and low priority PEVs using discrete particle swarm optimization (DPSO) that considers the benefits of both consumers and electric utilities. Objective functions are online minimization of total cost (associated with grid losses and energy generation) and overnight valley filling through minimization of the total load levels. The constraints include substation transformer loading, node voltage regulations and the requested final battery state of charge levels (SOCreq). The main challenge is optimal selection of the overnight starting time (toptimal-overnight,start) to guarantee charging of all vehicle batteries to the SOCreq levels before the requested plug-out times (treq) which is done by simultaneously solving the online and overnight objective functions. The online-overnight PEV coordination approach is implemented on a 449-node SG; results are compared for uncoordinated and coordinated battery charging as well as a modified strategy using cost minimizations for both online and overnight coordination. The impact of toptimal-overnight,start on performance of the proposed PEV coordination is investigated.

  9. Application of a Continuous Particle Swarm Optimization (CPSO) for the Optimal Coordination of Overcurrent Relays Considering a Penalty Method

    Directory of Open Access Journals (Sweden)

    Abdul Wadood

    2018-04-01

    Full Text Available In an electrical power system, the coordination of overcurrent relays plays an important role in protecting the electrical system by providing primary as well as backup protection. To reduce power outages, the coordination between these relays should be kept at the optimum value to minimize the total operating time and ensure that the least damage occurs under fault conditions. It is also imperative to ensure that the relay settings do not create unintentional operation and consecutive sympathy trips. In a power system protection coordination problem, the objective function to be optimized is the sum of the total operating times of all main relays. In this paper, the coordination of overcurrent relays in a ring-fed distribution system is formulated as an optimization problem. Coordination is performed using the proposed continuous particle swarm optimization. In order to enhance and improve the quality of the solution, a local search algorithm (LSA) is embedded into the original particle swarm optimization (PSO) algorithm and, in addition, the constraints are incorporated into the fitness function via the penalty method. The results achieved with the continuous particle swarm optimization algorithm (CPSO) are compared with those of other evolutionary optimization algorithms (EAs), and this comparison shows that the proposed scheme is competent in dealing with the relevant problems. Further analysis of the obtained results shows that the continuous particle swarm approach provides the most globally optimum solution.
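The penalty-method idea in the abstract, folding the coordination time interval (CTI) constraint into the fitness function, can be sketched with a tiny particle swarm optimizer. The two-relay model below is a toy: the time factors, CTI value, TDS bounds, and PSO coefficients are illustrative assumptions rather than values from the paper, and a real study would use inverse-time relay characteristics over many primary/backup relay pairs.

```python
import random

def pso_relay_coordination(seed=1, n_particles=30, iters=200, cti=0.3):
    """Minimize total relay operating time subject to a CTI constraint,
    handled via a quadratic penalty added to the fitness function."""
    random.seed(seed)
    K = (2.97, 3.50)       # hypothetical time factors: t_i = K_i * TDS_i
    lo, hi = 0.05, 1.0     # TDS bounds

    def fitness(tds):
        t = [k * x for k, x in zip(K, tds)]
        violation = max(0.0, cti - (t[1] - t[0]))  # backup must lag primary
        return sum(t) + 1000.0 * violation ** 2

    # Initialise the swarm with random TDS settings.
    pos = [[random.uniform(lo, hi) for _ in K] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=fitness)
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.72 * vel[i][d]
                             + 1.49 * r1 * (pbest[i][d] - p[d])
                             + 1.49 * r2 * (gbest[d] - p[d]))
                p[d] = min(hi, max(lo, p[d] + vel[i][d]))  # clamp to bounds
            if fitness(p) < fitness(pbest[i]):
                pbest[i] = p[:]
                if fitness(p) < fitness(gbest):
                    gbest = p[:]
    return gbest
```

With a large penalty weight, the converged solution respects the CTI almost exactly while driving both operating times toward their lower bounds.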

  10. Optimal coordination of distance and over-current relays in series compensated systems based on MAPSO

    International Nuclear Information System (INIS)

    Moravej, Zahra; Jazaeri, Mostafa; Gholamzadeh, Mehdi

    2012-01-01

    Highlights: ► The optimal coordination problem between distance relays and Directional Over-Current Relays (DOCRs) is studied. ► A new problem formulation for both uncompensated and series compensated systems is proposed. ► In order to solve the coordination problem, a Modified Adaptive Particle Swarm Optimization (MAPSO) is employed. ► Optimum results are found in both uncompensated and series compensated systems. - Abstract: In this paper, a novel problem formulation for optimal coordination between distance relays and Directional Over-Current Relays (DOCRs) in series compensated systems is proposed. The integration of the series capacitor (SC) into the transmission line makes the coordination problem more complex. The main contribution of this paper is a new systematic method for computing the optimal second-zone timing of distance relays and optimal settings of DOCRs, in series compensated and uncompensated transmission systems that have a combined protection scheme with DOCRs and distance relays. In order to solve this coordination problem, which is nonlinear and non-convex, a Modified Adaptive Particle Swarm Optimization (MAPSO) is employed. The proposed method is validated with results obtained from a typical test case and a real power system network.

  11. Optimal Protection Coordination for Microgrid under Different Operating Modes

    Directory of Open Access Journals (Sweden)

    Ming-Ta Yang

    2013-01-01

    Full Text Available Significant consequences result when a microgrid is connected to a distribution system. This study discusses the impacts of bolted three-phase faults and bolted single line-to-ground faults on the protection coordination of a distribution system connected to a microgrid which operates in utility-only mode or in grid-connected mode. Power system simulation software is used to build the test system. The linear programming method is applied to optimize the coordination of relays, and relay coordination simulation software is used to verify whether the coordination time intervals (CTIs) of the primary/backup relay pairs are adequate. In addition, this study also proposes a relay protection coordination strategy for when the microgrid operates in islanding mode during a utility power outage. Because conventional CO/LCO relays are not capable of detecting high impedance faults, an intelligent electronic device (IED) combined with wavelet transform and neural network is proposed to accurately detect high impedance faults and identify the fault phase.

  12. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized transformations improve visualization, reduce

  13. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized

  14. Simulation-Based Optimization for Storage Allocation Problem of Outbound Containers in Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    Ning Zhao

    2015-01-01

    Full Text Available Storage allocation of outbound containers is a key factor in the performance of the container handling system in automated container terminals. Improper storage plans of outbound containers make quay crane (QC) waiting inevitable; hence, the vessel handling time is lengthened. A simulation-based optimization method is proposed in this paper for the storage allocation problem of outbound containers in automated container terminals (SAPOBA). A simulation model is built with a Timed Colored Petri Net (TCPN) and used to evaluate the QC waiting time of storage plans. Two optimization approaches, based on Particle Swarm Optimization (PSO) and Genetic Algorithm (GA), are proposed to form the complete simulation-based optimization method. The effectiveness of this method is verified by experiments comparing the two optimization approaches.

  15. Dynamic modeling and optimal joint torque coordination of advanced robotic systems

    Science.gov (United States)

    Kang, Hee-Jun

    The development is documented of an efficient dynamic modeling algorithm and the subsequent optimal joint input load coordination of advanced robotic systems for industrial application. A closed-form dynamic modeling algorithm for general closed-chain robotic linkage systems is presented. The algorithm is based on the transfer of system dependence from a set of open-chain Lagrangian coordinates to any desired system generalized coordinate set of the closed chain. Three different techniques for evaluation of the kinematic closed-chain constraints allow the representation of the dynamic modeling parameters in terms of system generalized coordinates and have no restriction with regard to kinematic redundancy. The total computational requirement of the closed-chain system model is largely dependent on the computation required for the dynamic model of an open kinematic chain. In order to improve computational efficiency, modification of an existing open-chain KIC based dynamic formulation is made by the introduction of the generalized augmented body concept. This algorithm allows a 44% computational saving over the current optimized one (O(N^4), 5995 when N = 6). As a means of resolving redundancies in advanced robotic systems, local joint torque optimization is applied for effectively using actuator power while avoiding joint torque limits. The stability problem in local joint torque optimization schemes is eliminated by using fictitious dissipating forces which act in the necessary null space. The performance index representing the global torque norm is shown to be satisfactory. In addition, the resulting joint motion trajectory becomes conservative, after a transient stage, for repetitive cyclic end-effector trajectories. The effectiveness of the null space damping method is shown. The modular robot, which is built of well-defined structural modules from a finite-size inventory and is controlled by one general computer system, is another class of evolving
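The null-space idea behind local joint torque optimization can be shown in a minimal planar case: a one-dimensional task with two joints leaves a one-dimensional null space in which a secondary term (analogous to the fictitious dissipating forces mentioned above) can act without disturbing the task. The function below is a toy analogue under that assumption, not the KIC-based formulation of the thesis.

```python
def resolve_redundancy(J, xdot, z):
    """Least-norm joint rates for a 1-DOF task with 2 joints, plus a
    null-space term scaled by z that leaves the task unchanged.
    J is the 1x2 task Jacobian given as a pair (j1, j2)."""
    j1, j2 = J
    jj = j1 * j1 + j2 * j2
    base = [j1 * xdot / jj, j2 * xdot / jj]  # pseudoinverse (least-norm) part
    null = [-j2, j1]                         # spans the null space of J
    return [b + z * n for b, n in zip(base, null)]
```

Any choice of z satisfies the task exactly; z = 0 gives the minimum-norm solution, and nonzero z can instead be chosen to minimize a torque norm or add damping, which is the redundancy-resolution freedom the abstract exploits.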

  16. Automated Planning of Tangential Breast Intensity-Modulated Radiotherapy Using Heuristic Optimization

    International Nuclear Information System (INIS)

    Purdie, Thomas G.; Dinniwell, Robert E.; Letourneau, Daniel; Hill, Christine; Sharpe, Michael B.

    2011-01-01

    Purpose: To present an automated technique for two-field tangential breast intensity-modulated radiotherapy (IMRT) treatment planning. Method and Materials: A total of 158 patients with Stage 0, I, and II breast cancer treated using whole-breast IMRT were retrospectively replanned using automated treatment planning tools. The tools developed are integrated into the existing clinical treatment planning system (Pinnacle 3) and are designed to perform the manual volume delineation, beam placement, and IMRT treatment planning steps carried out by the treatment planning radiation therapist. The automated algorithm, using only the radio-opaque markers placed at CT simulation as inputs, optimizes the tangential beam parameters to geometrically minimize the amount of lung and heart treated while covering the whole-breast volume. The IMRT parameters are optimized according to the automatically delineated whole-breast volume. Results: The mean time to generate a complete treatment plan was 6 min 50 s ± 1 min 12 s. Of the automated plans, 157 of 158 (99%) were deemed clinically acceptable, and 138 of 158 (87%) were deemed clinically improved or equal to the corresponding clinical plan when reviewed in a randomized, double-blinded study by one experienced breast radiation oncologist. In addition, overall the automated plans were dosimetrically equivalent to the clinical plans when scored for target coverage and lung and heart doses. Conclusion: We have developed robust and efficient automated tools for fully inverse-planned tangential breast IMRT that can be readily integrated into clinical practice. The tools produce clinically acceptable plans using only the common anatomic landmarks from the CT simulation process as an input. 
We anticipate the tools will improve patient access to high-quality IMRT treatment by simplifying the planning process and will reduce the effort and cost of incorporating more advanced planning into clinical practice.

  17. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a simulation-based method using Autodesk TruPlan and TruFiber software is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool-path generation to achieve a high manufacturing success rate.

  18. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun

    2014-01-01

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, and a negative effect defined as out-of-the-loop (OOTL). Thus, before introducing automation into the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested estimation method. This approach is expected to derive an appropriate proportion of automation that avoids the OOTL problem while retaining maximum efficacy.

  19. Controller Design Automation for Aeroservoelastic Design Optimization of Wind Turbines

    NARCIS (Netherlands)

    Ashuri, T.; Van Bussel, G.J.W.; Zaayer, M.B.; Van Kuik, G.A.M.

    2010-01-01

    The purpose of this paper is to integrate the controller design of wind turbines with structure and aerodynamic analysis and use the final product in the design optimization process (DOP) of wind turbines. To do that, the controller design is automated and integrated with an aeroelastic simulation

  20. On the use of PGD for optimal control applied to automated fibre placement

    Science.gov (United States)

    Bur, N.; Joyot, P.

    2017-10-01

    Automated Fibre Placement (AFP) is an incipient manufacturing process for composite structures. Despite its conceptual simplicity, it involves many complexities related to the necessity of melting the thermoplastic at the tape-substrate interface, ensuring the consolidation that requires the diffusion of molecules, and controlling the build-up of residual stresses responsible for the residual deformations of the formed parts. The optimisation of the process and the determination of the process window cannot be achieved in a traditional way, since that would require a plethora of trials/errors or numerical simulations, because many parameters are involved in the characterisation of the material and the process. Using reduced order modelling, such as the so-called Proper Generalised Decomposition method, allows the construction of multi-parametric solutions taking into account many parameters. This leads to virtual charts that can be explored on-line in real time in order to perform process optimisation or on-line simulation-based control. Thus, for a given set of parameters, determining the power leading to an optimal temperature becomes easy. However, instead of controlling the power knowing the temperature field by particularizing an abacus, we propose here an approach based on optimal control: we solve by PGD a dual problem from the heat equation and optimality criteria. To circumvent numerical issues due to the ill-conditioned system, we propose an algorithm based on Uzawa's method. That way, we are able to solve the dual problem, setting the desired state as an extra-coordinate in the PGD framework. In a single computation, we get both the temperature field and the required heat flux to reach a parametric optimal temperature on a given zone.

  1. Coordinated Optimal Operation Method of the Regional Energy Internet

    Directory of Open Access Journals (Sweden)

    Rishang Long

    2017-05-01

    Full Text Available The development of the energy internet has become one of the key ways to solve the energy crisis. This paper studies the system architecture, energy flow characteristics and coordinated optimization method of the regional energy internet. Considering the heat-to-electric ratio of a combined cooling, heating and power unit, energy storage life and real-time electricity price, a double-layer optimal scheduling model is proposed, which includes economic and environmental benefit in the upper layer and energy efficiency in the lower layer. A particle swarm optimizer with individual-variation ant colony optimization is used to improve computational efficiency and accuracy. Simulation of the modelled system shows that energy savings, environmental protection and an economically optimal dispatching scheme are achieved.

  2. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaptation, and fold-change detection, respectively.
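Extracting the set of best trade-offs mentioned above reduces to computing the non-dominated subset of candidate designs. A minimal sketch, assuming every objective is minimized and the design set is small (the quadratic scan below would be replaced by a smarter algorithm at scale):

```python
def pareto_front(points):
    """Return the non-dominated points (minimization in all objectives).
    A point p is dominated if some other point q is <= p in every
    objective and differs from p in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

Note that exact duplicates would eliminate each other under this strict definition; deduplicate the input first if ties are expected.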

  3. Simple heuristics: A bridge between manual core design and automated optimization methods

    International Nuclear Information System (INIS)

    White, J.R.; Delmolino, P.M.

    1993-01-01

    The primary function of RESCUE is to serve as an aid in the analysis and identification of feasible loading patterns for LWR reload cores. The unique feature of RESCUE is that its physics model is based on some recent advances in generalized perturbation theory (GPT) methods. The high order GPT techniques offer the accuracy, computational efficiency, and flexibility needed for the implementation of a full range of capabilities within a set of compatible interactive (manual and semi-automated) and automated design tools. The basic design philosophy and current features within RESCUE are reviewed, and the new semi-automated capability is highlighted. The online advisor facility appears quite promising and it provides a natural bridge between the traditional trial-and-error manual process and the recent progress towards fully automated optimization sequences. (orig.)

  4. Optimization of an auto-thermal ammonia synthesis reactor using cyclic coordinate method

    Science.gov (United States)

    A-N Nguyen, T.; Nguyen, T.-A.; Vu, T.-D.; Nguyen, K.-T.; K-T Dao, T.; P-H Huynh, K.

    2017-06-01

    The ammonia synthesis system is an important chemical process used in the manufacture of fertilizers, chemicals, explosives, fibers and plastics, and in refrigeration. In the literature, many works approaching the modeling, simulation and optimization of an auto-thermal ammonia synthesis reactor can be found. However, they focus only on the optimization of the reactor length while keeping the other parameters constant. In this study, further parameters are included in the optimization problem, such as the temperature at which the feed gas enters the catalyst zone and the initial nitrogen proportion. The optimal problem requires the maximization of an objective function which is a multivariable function, subject to a number of equality constraints involving the solution of coupled differential equations, as well as inequality constraints. The cyclic coordinate search was applied to solve the multivariable optimization problem. Along each coordinate, the golden section method was applied to find the maximum value. The inequality constraints were treated using the penalty method. The coupled differential equation system was solved using the 4th-order Runge-Kutta method. The results obtained from this study are also compared to results from the literature.
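The cyclic coordinate search with golden-section line searches described above can be sketched in a few lines. The example minimizes rather than maximizes (the two differ only by a sign on the objective); the bounds, sweep count, and tolerance are illustrative assumptions.

```python
import math

def golden_min(f, a, b, tol=1e-6):
    """Golden-section search for the minimum of a unimodal f on [a, b]."""
    g = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

def cyclic_coordinate_min(f, x0, bounds, sweeps=20):
    """Minimize f by optimizing one coordinate at a time (cyclic
    coordinate descent), each via a golden-section line search."""
    x = list(x0)
    for _ in range(sweeps):
        for i, (lo, hi) in enumerate(bounds):
            x[i] = golden_min(lambda t: f(x[:i] + [t] + x[i + 1:]), lo, hi)
    return x
```

On a smooth coupled quadratic such as f(x, y) = (x-1)^2 + (y-2)^2 + 0.5xy, the sweeps converge geometrically to the stationary point (8/15, 28/15); a penalty term for inequality constraints, as in the abstract, would simply be added to f.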

  5. Coordinated Platoon Routing in a Metropolitan Network

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Jeffrey; Munson, Todd; Sokolov, Vadim

    2016-10-10

    Platooning vehicles—connected and automated vehicles traveling with small intervehicle distances—use less fuel because of reduced aerodynamic drag. Given a network defined by vertex and edge sets and a set of vehicles with origin/destination nodes/times, we model and solve the combinatorial optimization problem of coordinated routing of vehicles in a manner that routes them to their destination on time while using the least amount of fuel. Common approaches decompose the platoon coordination and vehicle routing into separate problems. Our model addresses both problems simultaneously to obtain the best solution. We use modern modeling techniques and constraints implied from analyzing the platoon routing problem to address larger numbers of vehicles and larger networks than previously considered. While the numerical method used is unable to certify optimality for candidate solutions to all networks and parameters considered, we obtain excellent solutions in approximately one minute for much larger networks and vehicle sets than previously considered in the literature.

  6. A new hybrid optimization algorithm CRO-DE for optimal coordination of overcurrent relays in complex power systems

    Directory of Open Access Journals (Sweden)

    Mohamed Zellagui

    2017-09-01

    Full Text Available The paper presents a new hybrid global optimization algorithm based on Chemical Reaction based Optimization (CRO) and Differential Evolution (DE) for nonlinear constrained optimization problems. This approach is proposed for the optimal coordination and settings of directional overcurrent relays in complex power systems. In the protection coordination problem, the objective function to be minimized is the sum of the operating times of all main relays. The optimization problem is subject to a number of constraints, mainly concerning the operation of the backup relay, which should operate if a primary relay fails to respond to a fault near it, the Time Dial Setting (TDS), the Plug Setting (PS), and the minimum operating time of a relay. The proposed hybrid global optimization algorithm aims to minimize the total operating time of each protection relay. Two systems, the IEEE 4-bus and IEEE 6-bus models, are used as case studies to check the efficiency of the optimization algorithm. Results are obtained and presented for the CRO, DE, and hybrid CRO-DE algorithms. The results for the studied cases are compared with those obtained using other optimization algorithms, namely Teaching Learning-Based Optimization (TLBO), the Chaotic Differential Evolution Algorithm (CDEA), the Modified Differential Evolution Algorithm (MDEA), and hybrid optimization algorithms (PSO-DE, IA-PSO, and BFOA-PSO). From the analysis of the obtained results, it is concluded that the hybrid CRO-DE algorithm provides the most optimal solution with the best convergence rate.

  7. Coordinated Optimization of Distributed Energy Resources and Smart Loads in Distribution Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui; Zhang, Yingchen

    2016-08-01

    Distributed energy resources (DERs) and smart loads have the potential to provide flexibility to the distribution system operation. A coordinated optimization approach is proposed in this paper to actively manage DERs and smart loads in distribution systems to achieve the optimal operation status. A three-phase unbalanced Optimal Power Flow (OPF) problem is developed to determine the output from DERs and smart loads with respect to the system operator's control objective. This paper focuses on coordinating PV systems and smart loads to improve the overall voltage profile in distribution systems. Simulations have been carried out in a 12-bus distribution feeder and results illustrate the superior control performance of the proposed approach.

  8. Study on Electricity Purchase Optimization in Coordination of Electricity and Carbon Trading

    Science.gov (United States)

    Liu, Dunnan; Meng, Yaru; Zhang, Shuo

    2017-07-01

    With the establishment of the carbon emissions trading market in China, the power industry has become an important market participant. Power grid enterprises need to optimize their strategies in the new environment of coordinated electricity and carbon markets. First, the influence of coordinated electricity and carbon trading on the electricity purchase strategy of grid enterprises is analysed in the paper. Then a power purchase optimization model is presented, which takes the minimum cost of low-carbon, energy-saving and environmentally friendly power purchase as the goal, and power generation capacity, installed capacity and pollutant emissions as the constraints. Finally, a provincial power grid is taken as an example to analyse the model, and the optimal order of power purchase is obtained, which provides a new idea for the low-carbon development of power grid enterprises.

  9. Optimal Multiuser Zero Forcing with Per-Antenna Power Constraints for Network MIMO Coordination

    Directory of Open Access Journals (Sweden)

    Kaviani Saeed

    2011-01-01

    Full Text Available We consider multicell multiple-input multiple-output (MIMO) coordinated downlink transmission, also known as network MIMO, under per-antenna power constraints. We investigate a simple multiuser zero-forcing (ZF) linear precoding technique known as block diagonalization (BD) for network MIMO. The optimal form of BD with per-antenna power constraints is proposed. It involves a novel approach of optimizing the precoding matrices over the entire null space of other users' transmissions. An iterative gradient descent method is derived by solving the dual of the throughput maximization problem, which finds the optimal precoding matrices globally and efficiently. Comprehensive simulations illustrate several network MIMO coordination advantages when the optimal BD scheme is used. Its achievable throughput is compared with the capacity region obtained through the recently established duality concept under per-antenna power constraints.
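Zero-forcing precoding can be illustrated in its simplest scalar form: with two single-antenna users and a two-antenna transmitter, inverting the channel matrix removes inter-user interference entirely. This is a much simpler special case than the block diagonalization and per-antenna power handling studied in the paper; the real-valued 2x2 setting and the absence of power normalization are assumptions of the sketch.

```python
def zf_precoder_2x2(H):
    """Zero-forcing precoder for two single-antenna users and a
    2-antenna transmitter: W = H^{-1}, so the effective channel H @ W
    is the identity and each user receives only its own stream."""
    (a, b), (c, d) = H
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("channel matrix is singular; ZF not possible")
    return [[d / det, -b / det], [-c / det, a / det]]
```

In practice the columns of W would then be scaled to meet sum-power or, as in the paper, per-antenna power constraints, which is exactly where the optimization over the null space becomes nontrivial.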

  10. Systems integration (automation system). System integration (automation system)

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K; Komori, T; Fukuma, Y; Oikawa, M [Nippon Steel Corp., Tokyo (Japan)]

    1991-09-26

    This paper introduces the automation systems integration (SI) business started by the company in July 1988 and describes the SI concepts. With CIM (unified production control by computer) and AMENITY (living environment) as the mainstays, the business covers single-responsibility contracting, ranging from consultation on structuring optimal systems for processing and assembling industries and for intelligent buildings to system design, installation and after-sales services. Taking the user's position as paramount, the business starts from planning and consultation under close coordination. Built on the concept of structuring optimal systems using the company's extensive know-how and tools, and of adapting and applying multi-vendor equipment, open networks, and centralized and distributed systems, the business is promoted with accumulated technologies capable of realizing artificial intelligence and neural networks, and is supported by highly valuable past business results. 10 figs., 1 tab.

  11. Automated gamma knife radiosurgery treatment planning with image registration, data-mining, and Nelder-Mead simplex optimization

    International Nuclear Information System (INIS)

    Lee, Kuan J.; Barber, David C.; Walton, Lee

    2006-01-01

    Gamma knife treatments are usually planned manually, requiring much expertise and time. We describe a new, fully automatic method of treatment planning. The treatment volume to be planned is first compared with a database of past treatments to find volumes closely matching in size and shape. The treatment parameters of the closest matches are used as starting points for the new treatment plan. Further optimization is performed with the Nelder-Mead simplex method: the coordinates and weights of the isocenters are allowed to vary until a maximally conformal plan specific to the new treatment volume is found. The method was tested on a randomly selected set of 10 acoustic neuromas and 10 meningiomas. Typically, matching a new volume took under 30 seconds. The time for simplex optimization, on a 3 GHz Xeon processor, ranged from under a minute for small volumes to considerably longer for large volumes (>30 000 cubic mm, >20 isocenters). In 8/10 acoustic neuromas and 8/10 meningiomas, the automatic method found plans with a conformation number equal to or better than that of the manual plan. In 4/10 acoustic neuromas and 5/10 meningiomas, both overtreatment and undertreatment ratios were equal or better in automated plans. In conclusion, data-mining of past treatments can be used to derive starting parameters for treatment planning. These parameters can then be computer-optimized to give good plans automatically.
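    The simplex stage can be illustrated with SciPy's Nelder-Mead optimizer on a toy one-dimensional analogue, in which Gaussian "shots" are tuned to cover a box-shaped target; all numbers here are hypothetical, not treatment data:

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1-D analogue of shot packing: two Gaussian "shots" whose centres and
# weights are tuned so the summed dose matches a box-shaped target volume.
x = np.linspace(0.0, 10.0, 200)
target = ((x > 3.0) & (x < 7.0)).astype(float)

def dose(params):
    c1, c2, w1, w2 = params
    return w1 * np.exp(-(x - c1) ** 2) + w2 * np.exp(-(x - c2) ** 2)

def conformity_cost(params):
    # Penalizes both undertreatment and overtreatment
    return float(np.sum((dose(params) - target) ** 2))

# Starting point playing the role of parameters mined from a similar past plan
start = np.array([3.5, 6.5, 0.8, 0.8])
res = minimize(conformity_cost, start, method="Nelder-Mead")
```

    As in the paper, the derivative-free simplex search only needs a cost function and a reasonable starting point, which is exactly what the database matching supplies.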

  12. Design of an optimal automation system : Finding a balance between a human's task engagement and exhaustion

    NARCIS (Netherlands)

    Klein, Michel; van Lambalgen, Rianne

    2011-01-01

    In demanding tasks, human performance can seriously degrade as a consequence of increased workload and limited resources. In such tasks it is very important to maintain optimal performance quality; therefore, automation assistance is required. On the other hand, automation can also impose ...

  13. THE METHOD OF FORMING THE PIGGYBACK TECHNOLOGIES USING THE AUTOMATED HEURISTIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Ye. Nahornyi

    2015-07-01

    Full Text Available To choose a rational piggyback technology, a method is offered that improves the automated system by giving it a heuristic character. The automated system is based on a set of methods, techniques and strategies aimed at creating optimal resource-saving technologies, which makes it possible to take into account, with maximum efficiency, the interests of all participants in the delivery process. When organizing piggyback traffic, coordination of operations between the piggyback traffic participants is presupposed in order to minimize the cargo travel time.

  14. Optimization Method of Intersection Signal Coordinated Control Based on Vehicle Actuated Model

    Directory of Open Access Journals (Sweden)

    Chen Zhao-Meng

    2015-01-01

    Full Text Available Traditional fixed-time green wave control with a predetermined cycle, split, and offset cannot adapt to dynamic real-time traffic flow. This paper proposes a coordinated control method for variable-cycle green wave bandwidth optimization integrated with traffic-actuated control. In the coordinated control, the green split is optimized in real time from the measured presence of arriving and/or standing vehicles at each intersection, while green waves along arterials are simultaneously guaranteed. Specifically, the dynamic bounds of the green wave are first determined, and then green early-start and green late-start algorithms are presented to accommodate fluctuations in vehicle arrival rates in each phase. Numerical examples show that the proposed method improves green time, expands the green wave bandwidth, and reduces queuing.
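    The fixed part of a green wave, before any actuated adjustment, is just the set of signal offsets implied by the design progression speed; a minimal sketch with hypothetical arterial data:

```python
# Ideal green-wave offsets: each downstream intersection starts its green
# later by the travel time from the first intersection, modulo the cycle.
cycle = 90                        # shared cycle length, seconds (hypothetical)
speed = 15.0                      # design progression speed, m/s
positions = [0, 600, 1050, 1800]  # intersection positions along the arterial, m

offsets = [round((x / speed) % cycle) for x in positions]
# The actuated green early-start/late-start logic would then shift each
# green within bounds around these offsets without breaking the band.
```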

  15. Energy Coordinative Optimization of Wind-Storage-Load Microgrids Based on Short-Term Prediction

    Directory of Open Access Journals (Sweden)

    Changbin Hu

    2015-02-01

    Full Text Available According to the topological structure of wind-storage-load complementation microgrids, this paper proposes a method for energy coordinative optimization which focuses on improving the economic benefits of microgrids within a prediction framework. First, the external-characteristic mathematical models of distributed generation (DG) units, including wind turbines and storage batteries, are established according to the actual constraints. Meanwhile, using the minimum consumption cost from the external grid as the objective function, a grey prediction model with residual modification is introduced to output the predicted wind turbine power and load at specific periods. Second, based on the framework of receding horizon optimization, an intelligent genetic algorithm (GA) is applied to find the optimum solution over the predictive horizon for the complex non-linear coordination control model of microgrids. The optimum results of the GA are compared with the receding solution of mixed integer linear programming (MILP). The obtained results show that the method is a viable approach for energy coordinative optimization of microgrid systems with respect to energy flow and reasonable scheduling. The effectiveness and feasibility of the proposed method are verified by examples.
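    A grey prediction model of the GM(1,1) type (without the residual-modification step described in the paper) can be sketched as follows; the wind-power series is hypothetical:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Grey GM(1,1) forecast of a short positive time series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # accumulated generating series
    z1 = 0.5 * (x1[:-1] + x1[1:])         # background (mean) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # fit x0[k] = -a*z1[k] + b
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # accumulated prediction
    x0_hat = np.diff(x1_hat, prepend=0.0)              # de-accumulate
    return x0_hat[n:]                                  # the future steps only

# Hypothetical hourly wind-power readings (MW)
wind = [12.0, 12.8, 13.5, 14.6, 15.4]
forecast = gm11_forecast(wind, steps=2)
```

    GM(1,1) suits this setting because it needs only a handful of recent samples, which is why grey models are popular for short-term microgrid prediction.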

  16. Automated electrochemical assembly of the protected potential TMG-chitotriomycin precursor based on rational optimization of the carbohydrate building block.

    Science.gov (United States)

    Nokami, Toshiki; Isoda, Yuta; Sasaki, Norihiko; Takaiso, Aki; Hayase, Shuichi; Itoh, Toshiyuki; Hayashi, Ryutaro; Shimizu, Akihiro; Yoshida, Jun-ichi

    2015-03-20

    The anomeric arylthio group and the hydroxyl-protecting groups of thioglycosides were optimized to construct carbohydrate building blocks for automated electrochemical solution-phase synthesis of oligoglucosamines having 1,4-β-glycosidic linkages. The optimization study included density functional theory calculations, measurements of the oxidation potentials, and the trial synthesis of the chitotriose trisaccharide. The automated synthesis of the protected potential N,N,N-trimethyl-d-glucosaminylchitotriomycin precursor was accomplished by using the optimized building block.

  17. Geometry Based Design Automation : Applied to Aircraft Modelling and Optimization

    OpenAIRE

    Amadori, Kristian

    2012-01-01

    Product development processes are continuously challenged by demands for increased efficiency. As engineering products become more and more complex, efficient tools and methods for integrated and automated design are needed throughout the development process. Multidisciplinary Design Optimization (MDO) is one promising technique that has the potential to drastically improve concurrent design. MDO frameworks combine several disciplinary models with the aim of gaining a holistic perspective of ...

  18. Development of an Integrated Approach to Routine Automation of Neutron Activation Analysis. Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2018-04-01

    Neutron activation analysis (NAA) is a powerful technique for determining bulk composition of major and trace elements. Automation may contribute significantly to keep NAA competitive for end-users. It provides opportunities for a larger analytical capacity and a shorter overall turnaround time if large series of samples have to be analysed. This publication documents and disseminates the expertise generated on automation in NAA during a coordinated research project (CRP). The CRP participants presented different cost-effective designs of sample changers for gamma-ray spectrometry as well as irradiation devices, and were able to construct and successfully test these systems. They also implemented, expanded and improved quality control and quality assurance as cross-cutting topical area of their automated NAA procedures. The publication serves as a reference of interest to NAA practitioners, experts, and research reactor personnel, but also to various stakeholders and users interested in basic research and/or services provided by NAA. The individual country reports are available on the CD-ROM attached to this publication.

  19. Optimal Coordination of Distance and Directional Overcurrent Relays Considering Different Network Topologies

    Directory of Open Access Journals (Sweden)

    Y. Damchi

    2015-09-01

    Full Text Available Most studies on relay coordination have focused solely on the coordination of overcurrent relays, while distance relays are used as the main protection of transmission lines. Since simultaneous coordination of these two types of relays can provide better protection, in this paper a new approach is proposed for the simultaneous coordination of distance and directional overcurrent relays (D&DOCRs). Moreover, in most previously published studies, the settings of D&DOCRs are determined based on a main network topology, which may result in mis-coordination of relays when the network topology changes. In the proposed method, in order to obtain a robust coordination, network topology changes are taken into account in the coordination problem: coordination constraints for the different network topologies are added to those of the main topology. A complex nonlinear optimization problem is derived to find the desirable relay settings, and is then solved using a genetic algorithm hybridized with linear programming (HGA). The proposed method is evaluated on the IEEE 14-bus test system. According to the results, a feasible and robust solution is obtained for D&DOCR coordination while all constraints arising from the different network topologies are satisfied.
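    For fixed pickup currents the inverse-time characteristic is linear in the time-dial setting (TDS), so a single primary/backup pair can be coordinated with a small linear program. This sketch uses the IEC standard-inverse curve with hypothetical fault-current multiples; it stands in for, but is far simpler than, the paper's hybrid GA+LP over many topologies:

```python
import numpy as np
from scipy.optimize import linprog

# IEC standard-inverse operating time: t = TDS * 0.14 / (M**0.02 - 1),
# where M is the fault current in multiples of pickup. With M fixed,
# the operating time is linear in TDS.
def k(M):
    return 0.14 / (M ** 0.02 - 1.0)

M_primary, M_backup = 10.0, 5.0   # hypothetical fault-current multiples
CTI = 0.3                         # coordination time interval, seconds

# Minimize total operating time k1*TDS1 + k2*TDS2 subject to the
# selectivity constraint  t_backup - t_primary >= CTI.
c = [k(M_primary), k(M_backup)]
A_ub = [[k(M_primary), -k(M_backup)]]   # k1*TDS1 - k2*TDS2 <= -CTI
b_ub = [-CTI]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.1, 1.1), (0.1, 1.1)])
tds1, tds2 = res.x
```

    With many relays, topologies, and nonlinear distance-relay constraints, the problem stops being a pure LP, which is what motivates the hybrid GA+LP in the paper.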

  20. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-01-01

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLMs) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAMs). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM to ready them for transport operations. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and the required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a VME rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  1. Novel Particle Swarm Optimization and Its Application in Calibrating the Underwater Transponder Coordinates

    OpenAIRE

    Zheping Yan; Chao Deng; Benyin Li; Jiajia Zhou

    2014-01-01

    A novel improved particle swarm algorithm named competition particle swarm optimization (CPSO) is proposed to calibrate the underwater transponder coordinates. To improve the performance of the algorithm, the TVAC (time-varying acceleration coefficients) scheme is introduced into CPSO to present an extended competition particle swarm optimization (ECPSO). The proposed method is tested with a set of 10 standard optimization benchmark problems and the results are compared with those obtained through existing PSO algorithms, basic par...
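    A plain global-best PSO, the baseline that the CPSO/ECPSO variants improve on, can be sketched on a toy transponder-calibration problem: recovering a position from noise-free ranges to hypothetical beacon locations.

```python
import numpy as np

rng = np.random.default_rng(1)

def pso(f, lo, hi, n_particles=30, iters=300, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best PSO (baseline; not the paper's CPSO/ECPSO)."""
    dim = len(lo)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Toy calibration: recover a transponder position from ranges measured
# at four hypothetical beacon positions (metres).
true_pos = np.array([40.0, -25.0, 60.0])
beacons = np.array([[0, 0, 0], [100, 0, 10], [0, 100, 10], [100, 100, 0]], float)
ranges = np.linalg.norm(beacons - true_pos, axis=1)

def cost(p):
    return float(np.sum((np.linalg.norm(beacons - p, axis=1) - ranges) ** 2))

est, err = pso(cost, np.array([-100.0, -100.0, 0.0]), np.array([150.0, 150.0, 150.0]))
```

    The competition and TVAC mechanisms of the paper replace the fixed `w`, `c1`, `c2` used here with adaptive schedules.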

  2. Coordinated trajectory planning of dual-arm space robot using constrained particle swarm optimization

    Science.gov (United States)

    Wang, Mingming; Luo, Jianjun; Yuan, Jianping; Walter, Ulrich

    2018-05-01

    A multi-arm space robot is more effective than a single-arm one, especially when the target is tumbling. This paper investigates the application of a particle swarm optimization (PSO) strategy to coordinated trajectory planning of a dual-arm space robot in free-floating mode. In order to overcome the dynamic singularity issue, the direct kinematics equations in conjunction with constrained PSO are employed for coordinated trajectory planning of the dual-arm space robot. The joint trajectories are parametrized with Bézier curves to simplify the calculation. A constrained PSO scheme with adaptive inertia weight is implemented to find the optimal joint trajectories while the specified objectives and imposed constraints are satisfied. The proposed method is not sensitive to the singularity issue due to the application of forward kinematic equations. Simulation results are presented for coordinated trajectory planning of two kinematically redundant manipulators mounted on a free-floating spacecraft and demonstrate the effectiveness of the proposed method.
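    The Bézier parametrization mentioned above keeps each joint trajectory smooth and pins its endpoints; a minimal evaluation routine in Bernstein form, with hypothetical joint angles as control points:

```python
import numpy as np
from math import comb

def bezier(control_points, t):
    """Evaluate a Bézier curve (Bernstein form) at parameters t in [0, 1]."""
    P = np.asarray(control_points, dtype=float)
    n = len(P) - 1
    t = np.atleast_1d(np.asarray(t, dtype=float))
    # Bernstein basis, shape (n+1, len(t))
    B = np.array([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)])
    return B.T @ P

# One joint trajectory: start/end angles fixed, interior control points free
# (the free points are what a constrained PSO would optimize).
ctrl = [0.0, 0.4, 1.1, 1.5]          # radians, hypothetical
ts = np.linspace(0.0, 1.0, 50)
traj = bezier(ctrl, ts)
assert np.isclose(traj[0], ctrl[0]) and np.isclose(traj[-1], ctrl[-1])
```

    Because only the interior control points are searched, each PSO particle encodes a handful of numbers per joint rather than a full time series.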

  3. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  4. Automated calculation of point A coordinates for CT-based high-dose-rate brachytherapy of cervical cancer

    Directory of Open Access Journals (Sweden)

    Hyejoo Kang

    2017-07-01

    Full Text Available Purpose: The goal is to develop a stand-alone application which automatically and consistently computes the coordinates of the dose calculation point recommended by the American Brachytherapy Society (i.e., point A), based solely on the implanted applicator geometry, for cervical cancer brachytherapy. Material and methods: The application calculates point A coordinates from the source dwell geometries in the computed tomography (CT) scans and outputs the 3D coordinates in the left and right directions. The algorithm was tested on 34 CT scans of 7 patients treated with high-dose-rate (HDR) brachytherapy using tandem and ovoid applicators. A single experienced user retrospectively and manually inserted point A into each CT scan, whose coordinates were used as the "gold standard" for all comparisons. The gold standard was subtracted from the automatically calculated points, from a second manual placement by the same experienced user, and from the clinically used point coordinates inserted by multiple planners. Coordinate differences and corresponding variances were compared using nonparametric tests. Results: Automatically calculated, manually placed, and clinically used points agree with the gold standard to < 1 mm, 1 mm, and 2 mm, respectively. When compared to the gold standard, the average and standard deviation of the 3D coordinate differences were 0.35 ± 0.14 mm for the automatically calculated points, 0.38 ± 0.21 mm for the second manual placement, and 0.71 ± 0.44 mm for the clinically used point coordinates. Both the means and standard deviations of the 3D coordinate differences were statistically significantly different from the gold standard when point A was placed by multiple users (p < 0.05), but not when placed repeatedly by a single user or when calculated automatically. There were no statistical differences in doses, which agree to within 1-2% on average for all three groups. Conclusions: The study demonstrates that the automated algorithm ...
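    The classical Manchester-style construction of points A, 2 cm along the tandem axis from the cervical os and 2 cm perpendicular to it in the patient left/right direction, can be sketched with simple vector arithmetic. This is a generic illustration with made-up coordinates, not the paper's dwell-geometry algorithm:

```python
import numpy as np

def point_a(os_center, tandem_tip, lateral_dir):
    """Left/right points A: 2 cm up the tandem axis from the os, then
    2 cm perpendicular in the patient left/right direction (millimetres)."""
    axis = tandem_tip - os_center
    axis = axis / np.linalg.norm(axis)
    # Project out any axis component so the lateral step is truly perpendicular
    lateral = lateral_dir - np.dot(lateral_dir, axis) * axis
    lateral = lateral / np.linalg.norm(lateral)
    up = os_center + 20.0 * axis
    return up + 20.0 * lateral, up - 20.0 * lateral

# Hypothetical geometry: os at the origin, tandem pointing along +z
os_c = np.array([0.0, 0.0, 0.0])
tip = np.array([0.0, 0.0, 60.0])
left, right = point_a(os_c, tip, np.array([1.0, 0.0, 0.0]))
```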

  5. Optimal Control of Wind Farms for Coordinated TSO-DSO Reactive Power Management

    Directory of Open Access Journals (Sweden)

    David Sebastian Stock

    2018-01-01

    Full Text Available The growing importance of renewable generation connected to distribution grids requires increased coordination between transmission system operators (TSOs) and distribution system operators (DSOs) for reactive power management. This work proposes a practical and effective interaction method based on sequential optimizations to evaluate the reactive flexibility potential of distribution networks and to dispatch them along with traditional synchronous generators, keeping the information exchange to a minimum. A modular optimal power flow (OPF) tool featuring multi-objective optimization is developed for this purpose. The proposed method is evaluated on a model of a real German 110 kV grid with 1.6 GW of installed wind power capacity and a reduced-order model of the surrounding transmission system. Simulations show the benefit of involving wind farms in reactive power support, reducing losses at both distribution and transmission level. Different types of setpoints are investigated, showing that it is feasible for the DSO to fulfill individual voltage and reactive power targets over multiple connection points. Finally, some suggestions are presented to achieve a fair coordination combining both TSO and DSO requirements.

  6. Optimizing wireless LAN for longwall coal mine automation

    Energy Technology Data Exchange (ETDEWEB)

    Hargrave, C.O.; Ralston, J.C.; Hainsworth, D.W. [Exploration & Mining Commonwealth Science & Industrial Research Organisation, Pullenvale, Qld. (Australia)

    2007-01-15

    A significant development in underground longwall coal mining automation has been achieved with the successful implementation of wireless LAN (WLAN) technology for communication on a longwall shearer. Wireless Fidelity (Wi-Fi) was selected to meet the bandwidth requirements of the underground data network, and several configurations were installed on operating longwalls to evaluate their performance. Although these efforts demonstrated the feasibility of using WLAN technology in longwall operation, it was clear that new research and development was required in order to establish optimal full-face coverage. By undertaking an accurate characterization of the target environment, it has been possible to achieve great improvements in WLAN performance over a nominal Wi-Fi installation. This paper discusses the impact of Fresnel zone obstructions and multipath effects on radio frequency propagation and reports an optimal antenna and system configuration. Many of the lessons learned in the longwall case are immediately applicable to other underground mining operations, particularly wherever there is a high degree of obstruction from mining equipment.

  7. Optimal number of stimulation contacts for coordinated reset neuromodulation

    Science.gov (United States)

    Lysyansky, Borys; Popovych, Oleksandr V.; Tass, Peter A.

    2013-01-01

    In this computational study we investigate coordinated reset (CR) neuromodulation designed for an effective control of synchronization by multi-site stimulation of neuronal target populations. This method was suggested to effectively counteract pathological neuronal synchrony characteristic for several neurological disorders. We study how many stimulation sites are required for optimal CR-induced desynchronization. We found that a moderate increase of the number of stimulation sites may significantly prolong the post-stimulation desynchronized transient after the stimulation is completely switched off. This can, in turn, reduce the amount of the administered stimulation current for the intermittent ON–OFF CR stimulation protocol, where time intervals with stimulation ON are recurrently followed by time intervals with stimulation OFF. In addition, we found that the optimal number of stimulation sites essentially depends on how strongly the administered current decays within the neuronal tissue with increasing distance from the stimulation site. In particular, for a broad spatial stimulation profile, i.e., for a weak spatial decay rate of the stimulation current, CR stimulation can optimally be delivered via a small number of stimulation sites. Our findings may contribute to an optimization of therapeutic applications of CR neuromodulation. PMID:23885239

  8. Context-awareness in task automation services by distributed event processing

    OpenAIRE

    Coronado Barrios, Miguel; Bruns, Ralf; Dunkel, Jürgen; Stipković, Sebastian

    2014-01-01

    Everybody has to coordinate several tasks every day, usually in a manual manner. Recently, the concept of Task Automation Services has been introduced to automate and personalize the task coordination problem. Several user-centered platforms and applications have arisen in recent years that let their users configure their very own automations based on third-party services. In this paper, we propose a new system architecture for Task Automation Services in a heterogeneous mobile, smart devic...

  9. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    ... Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized ...

  10. The optimal number, type and location of devices in automation of electrical distribution networks

    Directory of Open Access Journals (Sweden)

    Popović Željko N.

    2015-01-01

    Full Text Available This paper presents a mixed integer linear programming based model for determining the optimal number, type and location of remotely controlled and supervised devices in distribution networks in the presence of distributed generators. The proposed model considers a number of different devices simultaneously (remotely controlled circuit breakers/reclosers, sectionalizing switches, and remotely supervised and local fault passage indicators), along with the following: the expected outage cost to consumers and producers due to momentary and long-term interruptions; automated device expenses (capital investment, installation, and annual operation and maintenance costs); and the number and expenses of crews involved in the isolation and restoration process. Furthermore, other possible benefits of each automated device are also taken into account (e.g., benefits due to decreasing the cost of switching operations in normal conditions). The obtained numerical results emphasize the importance of considering different types of automation devices simultaneously. They also show that the proposed approach has the potential to improve the process of determining the best automation strategy in real-life distribution networks.

  11. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the development of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With the aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and a real-time implementation on a laboratory brushless DC motor is presented. Contents: PnP Process Monitoring and Control Architecture; Real-Time Configuration Techniques for PnP Process Monitoring; Real-Time Configuration Techniques for PnP Performance Optimization; Benchmark Study and Real-Time Implementation. Target Groups: Researchers and students of Automation and Control Engineering; Practitioners in the area of Industrial and Production Engineering. The Author: Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  12. MAS-based Distributed Coordinated Control and Optimization in Microgrid and Microgrid Clusters: A Comprehensive Overview

    DEFF Research Database (Denmark)

    Han, Yang; Zhang, Ke; Hong, Li

    2018-01-01

    The increasing integration of distributed renewable energy sources highlights the need to design various control strategies for microgrids (MGs) and microgrid clusters (MGCs). Multi-agent system (MAS)-based distributed coordinated control strategies show benefits in balancing power and energy, stabilizing voltage and frequency, and achieving economic and coordinated operation among MGs and MGCs. However, the complex and diverse combinations of distributed generations in a multi-agent system increase the complexity of system control and operation. In order to design the optimized configuration and control strategy using MAS, topology models and mathematical models such as the graph topology model, the non-cooperative game model, the genetic algorithm and the particle swarm optimization algorithm are summarized. The merits and drawbacks of these control methods are compared. Moreover, since ...

  13. Optimal Solution for VLSI Physical Design Automation Using Hybrid Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    I. Hameem Shanavas

    2014-01-01

    Full Text Available In the optimization of VLSI physical design, area minimization and interconnect length minimization are important objectives in the physical design automation of very large scale integration chips. Minimizing area and interconnect length scales down the size of integrated chips. To meet this objective, it is necessary to find an optimal solution for physical design components such as partitioning, floorplanning, placement, and routing. This work performs the optimization of benchmark circuits across these physical design stages using a hierarchical approach of evolutionary algorithms. The goals of minimizing delay in partitioning, silicon area in floorplanning, layout area in placement, and wirelength in routing have an influence on other criteria like power, clock, speed, cost, and so forth. A hybrid evolutionary algorithm, which includes one or more local search steps within each evolutionary cycle, is applied at each phase to achieve the objective of minimizing area and interconnect length. This approach combines a genetic algorithm and simulated annealing in a hierarchical design and can quickly produce optimal solutions for the popular benchmarks.
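    The simulated-annealing half of such a hybrid can be sketched on a toy linear-placement instance (a 6-cell ring netlist, hypothetical); a GA would supply the starting population instead of the single random start used here:

```python
import math, random

random.seed(0)

# Toy placement: assign 6 cells to 6 slots on a line; the cost is the total
# wirelength of the netlist (pairs of cells that must be connected).
nets = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5)]

def wirelength(order):
    slot = {cell: s for s, cell in enumerate(order)}
    return sum(abs(slot[a] - slot[b]) for a, b in nets)

def anneal(order, t0=10.0, cooling=0.995, steps=5000):
    """Swap-move simulated annealing with geometric cooling."""
    best = cur = list(order)
    t = t0
    for _ in range(steps):
        i, j = random.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]       # swap two cells
        delta = wirelength(cand) - wirelength(cur)
        # Accept improvements always, uphill moves with Boltzmann probability
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand
            if wirelength(cur) < wirelength(best):
                best = cur
        t *= cooling
    return best

result = anneal([3, 1, 4, 0, 5, 2])
```

    In the hybrid scheme, annealing plays the role of the local search step embedded inside each genetic-algorithm generation.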

  14. Vibrational quasi-degenerate perturbation theory with optimized coordinates: applications to ethylene and trans-1,3-butadiene.

    Science.gov (United States)

    Yagi, Kiyoshi; Otaki, Hiroki

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of the perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O-H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P-space configuration (p) and the complementary Q-space configuration (q) based on a difference in their quantum numbers (λ_pq = Σ_s |p_s − q_s|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of the pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix, as distinct from others that may remain unchanged, based on the magnitude of the harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses the intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields coordinates similar in character to the conventional ones, such that the final vibrational energy is affected very little while gaining an order-of-magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and coupled-cluster with singles ...

  15. Vibrational quasi-degenerate perturbation theory with optimized coordinates: Applications to ethylene and trans-1,3-butadiene

    Energy Technology Data Exchange (ETDEWEB)

    Yagi, Kiyoshi, E-mail: kiyoshi.yagi@riken.jp; Otaki, Hiroki [Theoretical Molecular Science Laboratory, RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan)

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O–H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P space configuration (p) and the complementary Q space configuration (q) based on a difference in their quantum numbers (λ_pq = ∑_s |p_s − q_s|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix from others that may remain unchanged based on the magnitude of harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields the coordinates similar in character to the conventional ones such that the final vibrational energy is affected very little while gaining an order of magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and
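The configuration-screening rule quoted above is simple enough to sketch directly. The helper names below are hypothetical (the oc-VQDPT2 code itself is not shown in the record); the sketch only illustrates the quantum-number distance λ_pq = ∑_s |p_s − q_s| and the screening it enables.

```python
# Hypothetical sketch of the quantum-number screening step described above.

def lambda_pq(p, q):
    """Distance between two vibrational configurations: lambda_pq = sum_s |p_s - q_s|."""
    return sum(abs(ps - qs) for ps, qs in zip(p, q))

def screen_pairs(p_space, q_space, lam_max):
    """Keep only (p, q) pairs whose quantum-number distance is <= lam_max."""
    return [(p, q) for p in p_space for q in q_space
            if lambda_pq(p, q) <= lam_max]

# Example: three-mode configurations given as tuples of quantum numbers.
p_space = [(1, 0, 0), (0, 1, 0)]
q_space = [(2, 0, 0), (1, 1, 1), (0, 0, 3)]
kept = screen_pairs(p_space, q_space, lam_max=2)
```

Only Hamiltonian matrix elements between the surviving pairs would then be evaluated, which is where the claimed savings come from.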

  16. An optimized routing algorithm for the automated assembly of standard multimode ribbon fibers in a full-mesh optical backplane

    Science.gov (United States)

    Basile, Vito; Guadagno, Gianluca; Ferrario, Maddalena; Fassi, Irene

    2018-03-01

    In this paper a parametric, modular and scalable algorithm allowing a fully automated assembly of a backplane fiber-optic interconnection circuit is presented. This approach guarantees the optimization of the optical fiber routing inside the backplane with respect to specific criteria (e.g., bending power losses), addressing both transmission performance and overall cost issues. Graph theory has been exploited to simplify the complexity of the NxN full-mesh backplane interconnection topology, firstly into N independent sub-circuits and then, recursively, into a limited number of loops that are easier to generate. Afterwards, the proposed algorithm selects a set of geometrical and architectural parameters whose optimization makes it possible to identify the optimal fiber-optic routing for each sub-circuit of the backplane. The topological and numerical information provided by the algorithm is then exploited to control a robot which performs the automated assembly of the backplane sub-circuits. The proposed routing algorithm can be extended to any array architecture and number of connections thanks to its modularity and scalability. Finally, the algorithm has been exploited for the automated assembly of an 8x8 optical backplane realized with standard multimode (MM) 12-fiber ribbons.
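As a rough illustration of the graph-theoretic step described above, the sketch below enumerates the N(N−1)/2 links of a full-mesh backplane and groups them into sub-circuits. The grouping rule (by lower-indexed board) is an assumption for illustration; the paper's actual recursive decomposition into N sub-circuits and loops is more elaborate.

```python
from itertools import combinations

def full_mesh_links(n):
    """All point-to-point links of an N x N full-mesh backplane (one per board pair)."""
    return list(combinations(range(n), 2))

def split_into_subcircuits(links):
    """Group links by their lower-indexed board -- a simplification of the
    graph decomposition sketched in the abstract."""
    sub = {}
    for a, b in links:
        sub.setdefault(a, []).append((a, b))
    return sub

links = full_mesh_links(8)          # the 8x8 backplane of the paper's demo
sub = split_into_subcircuits(links)
```

Each group would then be routed independently, which is what makes the per-sub-circuit parameter optimization tractable.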

  17. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  18. Automation of P-3 Simulations to Improve Operator Workload

    Science.gov (United States)

    2012-09-01

    ...this thesis and because they each have a unique approach to solving the problem of entity behavior automation. A. DISCOVERY MACHINE The United States...from the operators and can be automated in JSAF using the mental simulation approach. Two trips were conducted to visit the Naval Warfare...

  19. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    The Spiral Discovery Method (SDM) was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN), and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  20. Research on ISFLA-Based Optimal Control Strategy for the Coordinated Charging of EV Battery Swap Station

    Directory of Open Access Journals (Sweden)

    Xueliang Huang

    2013-01-01

    As an important component of the smart grid, electric vehicles (EVs) could be a good measure against energy shortages and environmental pollution. A main way of supplying energy to EVs is to swap batteries at a swap station. Based on the characteristics of an EV battery swap station, a coordinated charging optimal control strategy is investigated to smooth the load fluctuation. The shuffled frog leaping algorithm (SFLA) is an optimization method inspired by the memetic evolution of a group of frogs seeking food. An improved shuffled frog leaping algorithm (ISFLA) with a reflecting method to deal with the boundary constraint is proposed to obtain the solution of the optimal control strategy for coordinated charging. Based on the daily load of a certain area, numerical simulations including a comparison of particle swarm optimization (PSO) and ISFLA are carried out; the results show that the presented ISFLA can effectively lower the peak-valley difference and smooth the load profile with a faster convergence rate and higher convergence precision.
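A minimal sketch of the SFLA/ISFLA idea described above, assuming a one-dimensional decision variable and a toy quadratic objective in place of the real load-smoothing objective; the reflecting boundary rule is the one feature taken directly from the abstract, and all other parameter values are illustrative.

```python
import random

def reflect(x, lo, hi):
    """Reflecting method for boundary constraints: fold a violated value back inside."""
    while x < lo or x > hi:
        x = 2 * lo - x if x < lo else 2 * hi - x
    return x

def sfla_minimize(f, lo, hi, n_frogs=30, n_memeplexes=3, iters=200, seed=1):
    """Simplified shuffled frog leaping: after sorting, the worst frog of each
    memeplex leaps toward the memeplex best; leaps outside [lo, hi] are
    reflected, and failed leaps trigger a random reset."""
    rng = random.Random(seed)
    frogs = [rng.uniform(lo, hi) for _ in range(n_frogs)]
    for _ in range(iters):
        frogs.sort(key=f)
        for m in range(n_memeplexes):
            plex = frogs[m::n_memeplexes]          # shuffled partition
            best, worst = plex[0], plex[-1]
            cand = reflect(worst + rng.random() * (best - worst), lo, hi)
            if f(cand) < f(worst):
                frogs[frogs.index(worst)] = cand
            else:
                frogs[frogs.index(worst)] = rng.uniform(lo, hi)
    return min(frogs, key=f)

# Toy stand-in for the charging objective: quadratic with minimum at 2.0.
best = sfla_minimize(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

In the paper the decision variable is the charging schedule and the objective penalizes the peak-valley difference of the aggregate load.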

  1. Automated optimal coordination of multiple-DOF neuromuscular actions in feedforward neuroprostheses.

    Science.gov (United States)

    Lujan, J Luis; Crago, Patrick E

    2009-01-01

    This paper describes a new method for designing feedforward controllers for multiple-muscle, multiple-DOF, motor system neural prostheses. The design process is based on experimental measurement of the forward input/output properties of the neuromechanical system and numerical optimization of stimulation patterns to meet muscle coactivation criteria, thus resolving the muscle redundancy (i.e., overcontrol) and the coupled DOF problems inherent in neuromechanical systems. We designed feedforward controllers to control the isometric forces at the tip of the thumb in two directions during stimulation of three thumb muscles as a model system. We tested the method experimentally in ten able-bodied individuals and one patient with spinal cord injury. Good control of isometric force in both DOFs was observed, with rms errors less than 10% of the force range in seven experiments and statistically significant correlations between the actual and target forces in all ten experiments. Systematic bias and slope errors were observed in a few experiments, likely due to neuromuscular fatigue. Overall, the tests demonstrated the ability of a general design approach to satisfy both control and coactivation criteria in multiple-muscle, multiple-axis neuromechanical systems, which is applicable to a wide range of neuromechanical systems and stimulation electrodes.
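The muscle-redundancy resolution described above can be sketched as a constrained least-squares problem. The 2x3 gain matrix below is hypothetical (the paper identifies the real input/output map experimentally), and non-negativity stands in for the coactivation criteria: three muscles, two force DOFs, stimulation that cannot produce negative activation.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical linearized 2-DOF, 3-muscle model: each column maps one muscle's
# normalized activation to thumb-tip force along the two directions.
A = np.array([[1.0, 0.4, 0.1],
              [0.2, 0.9, 0.8]])
target = np.array([0.8, 0.7])   # desired force in the two DOFs

# Non-negative least squares resolves the redundancy (3 muscles, 2 DOFs)
# under the physical constraint activation >= 0.
activation, residual = nnls(A, target)
force = A @ activation
```

The paper's optimizer additionally enforces coactivation patterns across muscles, which this sketch does not attempt.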

  2. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. The inclusion proportion of automation is straightforward to express, and it has been assumed to indicate the degree of enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rates are proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  3. A Bi-Level Particle Swarm Optimization Algorithm for Solving Unit Commitment Problems with Wind-EVs Coordinated Dispatch

    Science.gov (United States)

    Song, Lei; Zhang, Bo

    2017-07-01

    Nowadays, the grid faces many challenges caused by wind power and the connection of electric vehicles (EVs). Based on the potential of coordinated dispatch, a model of wind-EVs coordinated dispatch was developed. Then, a bi-level particle swarm optimization algorithm for solving the model is proposed in this paper. Application of this algorithm to a 10-unit test system showed that coordinated dispatch can benefit the power system in the following ways: (1) reducing operating costs; (2) improving the utilization of wind power; (3) stabilizing the peak-valley difference.
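For readers unfamiliar with the inner machinery, a single-level PSO sketch on a toy objective is shown below; the paper nests two such layers (unit commitment in the outer level, dispatch in the inner), which is not reproduced here, and all parameter values are illustrative.

```python
import random

def pso_minimize(f, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain one-dimensional particle swarm optimization on [lo, hi]."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n)]   # positions
    v = [0.0] * n                                 # velocities
    pbest = x[:]                                  # per-particle best positions
    gbest = min(x, key=f)                         # swarm best position
    for _ in range(iters):
        for i in range(n):
            v[i] = (w * v[i] + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(max(x[i] + v[i], lo), hi)  # clamp to bounds
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=f)
    return gbest

# Toy objective with minimum at 3.0 standing in for the dispatch cost.
best = pso_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```

In the bi-level variant, each evaluation of the outer (commitment) particle would itself run an inner PSO over the dispatch variables.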

  4. Improving the automated optimization of profile extrusion dies by applying appropriate optimization areas and strategies

    Science.gov (United States)

    Hopmann, Ch.; Windeck, C.; Kurth, K.; Behr, M.; Siegbert, R.; Elgeti, S.

    2014-05-01

    The rheological design of profile extrusion dies is one of the most challenging tasks in die design. As no analytical solution is available, the quality and the development time for a new design highly depend on the empirical knowledge of the die manufacturer. Usually, prior to starting production, several time-consuming, iterative running-in trials need to be performed to check the profile accuracy, and the die geometry is reworked accordingly. An alternative is numerical flow simulation. Such simulations make it possible to calculate the melt flow through a die so that the quality of the flow distribution can be analyzed. The objective of a current research project is to improve the automated optimization of profile extrusion dies. Special emphasis is put on choosing a convenient starting geometry and parameterization, which allow for the necessary deformations. In this work, three commonly used design features are examined with regard to their influence on the optimization results. Based on the results, a strategy is derived to select the most relevant areas of the flow channels for the optimization. For these characteristic areas recommendations are given concerning an efficient parameterization setup that still enables adequate deformations of the flow channel geometry. As an example, this approach is applied to an L-shaped profile with different wall thicknesses. The die is optimized automatically and simulation results are qualitatively compared with experimental results. Furthermore, the strategy is applied to a complex extrusion die of a floor skirting profile to prove its universal adaptability.

  5. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Gurcan, Metin N.; Sahiner, Berkman; Chan Heangping; Hadjiiski, Lubomir; Petrick, Nicholas

    2001-01-01

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area A_z under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal numbers of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost
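A toy version of the simulated-annealing search over the discrete architecture grid can be sketched as follows. The synthetic cost function stands in for 1 − A_z, which in the paper requires training and evaluating a CNN for every candidate architecture; the grid values and schedule parameters are illustrative.

```python
import math
import random

# Four discrete architecture parameters, loosely mirroring the paper's search
# space (node groups and kernel sizes in two hidden layers).
GRID = {"groups1": [2, 4, 6], "groups2": [2, 4, 6],
        "kernel1": [3, 5, 7, 9], "kernel2": [3, 5, 7, 9]}

def cost(arch):
    # Synthetic stand-in for 1 - Az, with a unique minimum at (4, 4, 5, 7).
    target = {"groups1": 4, "groups2": 4, "kernel1": 5, "kernel2": 7}
    return sum((arch[k] - target[k]) ** 2 for k in GRID)

def anneal(cost, grid, t0=10.0, alpha=0.97, iters=500, seed=0):
    """Simulated annealing over the discrete grid: perturb one parameter at a
    time; always accept improvements, accept worse moves with Boltzmann prob."""
    rng = random.Random(seed)
    arch = {k: rng.choice(v) for k, v in grid.items()}
    best, t = dict(arch), t0
    for _ in range(iters):
        cand = dict(arch)
        k = rng.choice(list(grid))
        cand[k] = rng.choice(grid[k])
        d = cost(cand) - cost(arch)
        if d <= 0 or rng.random() < math.exp(-d / t):
            arch = cand
            if cost(arch) < cost(best):
                best = dict(arch)
        t = max(t * alpha, 1e-6)
    return best

best_arch = anneal(cost, GRID)
```

The real expense in the paper is not the annealer but the per-architecture CNN training hidden inside the cost evaluation, which is why evaluation counts (167 vs. 391) matter.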

  6. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    Science.gov (United States)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
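The baseline-correction building block, the Whittaker smoother, is compact enough to sketch. The version below is the standard smoother (the paper uses a modified one, and couples it with the phase correction in a joint Pareto optimization not shown here); λ and the synthetic test spectrum are illustrative.

```python
import numpy as np

def whittaker_smooth(y, lam=1e4, order=2):
    """Whittaker smoother: minimize ||y - z||^2 + lam * ||D z||^2,
    where D is the order-th finite-difference matrix."""
    n = len(y)
    D = np.diff(np.eye(n), n=order, axis=0)          # (n-order) x n
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

# Synthetic 'spectrum': slow linear baseline plus one sharp peak.
x = np.linspace(0.0, 1.0, 200)
signal = 0.5 * x + np.exp(-((x - 0.5) / 0.01) ** 2)
smooth = whittaker_smooth(signal, lam=1e5)
```

With a large λ the smoother tracks the slow baseline and suppresses the sharp peak, which is what makes it usable as a baseline estimator; sparse solvers replace the dense solve for realistic spectrum lengths.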

  7. Optimal Real-time Dispatch for Integrated Energy Systems

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Guerrero, Josep M.; Rahimi-Kian, Ashkan

    2016-01-01

    With the emergence of small-scale integrated energy systems (IESs), there is significant potential to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level distributed energy resources (DERs). By integrating DSM and DERs into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DERs management. The optimal dispatch problem is formulated as a mixed integer nonlinear programming problem (MINLP...

  8. Automated Portfolio Optimization Based on a New Test for Structural Breaks

    Directory of Open Access Journals (Sweden)

    Tobias Berens

    2014-04-01

    We present a completely automated optimization strategy which combines the classical Markowitz mean-variance portfolio theory with a recently proposed test for structural breaks in covariance matrices. With respect to equity portfolios, global minimum-variance optimizations, which are based solely on the covariance matrix, yielded considerable results in previous studies. However, financial assets cannot be assumed to have a constant covariance matrix over longer periods of time. Hence, we estimate the covariance matrix of the assets by respecting potential change points. The resulting approach resolves the issue of determining a sample for parameter estimation. Moreover, we investigate whether this approach is also appropriate for timing the reoptimizations. Finally, we apply the approach to two datasets and compare the results to relevant benchmark techniques by means of an out-of-sample study. It is shown that the new approach outperforms equally weighted portfolios and plain minimum-variance portfolios on average.
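The portfolio layer of the approach reduces to the closed-form global minimum-variance weights, sketched below under an illustrative covariance matrix; the structural-break test that selects the estimation window (and triggers reoptimization) is not reproduced.

```python
import numpy as np

def gmv_weights(cov):
    """Global minimum-variance weights: w = C^{-1} 1 / (1' C^{-1} 1).
    In the paper, cov is re-estimated only over the window since the last
    detected structural break."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Illustrative 3-asset covariance matrix (annualized variances on the diagonal).
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = gmv_weights(cov)
```

By construction these weights minimize portfolio variance among all fully invested portfolios, so they can never do worse (in-sample) than equal weighting.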

  9. Optimization of an NLEO-based algorithm for automated detection of spontaneous activity transients in early preterm EEG

    International Nuclear Information System (INIS)

    Palmu, Kirsi; Vanhatalo, Sampsa; Stevenson, Nathan; Wikström, Sverre; Hellström-Westas, Lena; Palva, J Matias

    2010-01-01

    We propose here a simple algorithm for automated detection of spontaneous activity transients (SATs) in early preterm electroencephalography (EEG). The parameters of the algorithm were optimized by supervised learning using a gold standard created from visual classification data obtained from three human raters. The generalization performance of the algorithm was estimated by leave-one-out cross-validation. The mean sensitivity of the optimized algorithm was 97% (range 91–100%) and specificity 95% (76–100%). The optimized algorithm makes it possible to systematically study brain state fluctuations of preterm infants. (note)
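The core of the detector, the nonlinear energy operator (NLEO), is easy to sketch. The Teager-Kaiser form, threshold, and synthetic trace below are illustrative; the paper optimizes the detector parameters by supervised learning against human-rated EEG, which is not reproduced.

```python
import numpy as np

def nleo(x):
    """Nonlinear energy operator psi[n] = x[n]^2 - x[n-1]*x[n+1]
    (Teager-Kaiser form); large values flag bursts of oscillatory energy."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

# Synthetic trace: low-amplitude background with one oscillatory burst
# standing in for a spontaneous activity transient (SAT).
n = np.arange(1000)
x = 0.01 * np.sin(2 * np.pi * n / 200)
x[400:500] += np.sin(2 * np.pi * n[400:500] / 10)
e = nleo(x)
detected = np.where(e > 0.1)[0]   # illustrative threshold
```

In practice the operator output is smoothed before thresholding, and the threshold itself is one of the parameters the paper tunes.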

  10. Considering Pilot Protection in the Optimal Coordination of Distance and Directional Overcurrent Relays

    Directory of Open Access Journals (Sweden)

    Y. Damchi

    2015-06-01

    The aim of relay coordination is for protection systems to detect and isolate the faulted part as quickly and selectively as possible. On the other hand, in order to reduce the fault clearing time, distance protection relays are usually equipped with pilot protection schemes. Such schemes can be considered in the distance and directional overcurrent relays (D&DOCRs) coordination to achieve faster protection systems while selectivity is maintained. Therefore, in this paper, a new formulation is presented for the relay coordination problem considering pilot protection. In the proposed formulation, the selectivity constraints for the primary distance and backup overcurrent relays are defined based on the fault at the end of the transmission lines, rather than at the end of the first zone of the primary distance relay. To solve this nonlinear optimization problem, a combination of a genetic algorithm (GA) and linear programming (LP) is used as a hybrid genetic algorithm (HGA). The proposed approach is tested on an 8-bus and the IEEE 14-bus test systems. Simulation results indicate that considering pilot protection in the D&DOCRs coordination not only yields feasible and effective relay settings but also reduces the overall operating time of the protection system.
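A hypothetical sketch of the LP half of such a hybrid GA+LP scheme: once the GA fixes the pickup currents, each overcurrent relay's operating time becomes linear in its time dial setting, t_i = K_i · TDS_i (the K_i below are made-up constants), so coordinating the TDS values subject to coordination-time-interval (CTI) constraints is a linear program.

```python
from scipy.optimize import linprog

K = [2.0, 2.2, 1.8]        # hypothetical multipliers: t_i = K_i * TDS_i (s)
CTI = 0.3                   # required coordination time interval (s)
pairs = [(1, 0), (2, 1)]    # (backup, primary) relay index pairs

# Minimize total operating time subject to K_b*TDS_b - K_p*TDS_p >= CTI,
# rewritten as -K_b*TDS_b + K_p*TDS_p <= -CTI for linprog's A_ub form.
A_ub, b_ub = [], []
for b, p in pairs:
    row = [0.0] * len(K)
    row[b], row[p] = -K[b], K[p]
    A_ub.append(row)
    b_ub.append(-CTI)
res = linprog(K, A_ub=A_ub, b_ub=b_ub, bounds=[(0.05, 1.1)] * len(K))
tds = res.x
```

The GA's role in the paper is to search the nonlinear settings (including the distance/pilot interactions) around such fast inner solves.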

  11. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation must be considered at the same time to determine the appropriate level of its introduction. However, the conventional automation-rate concept is limited in that it does not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested to overcome this problem

  12. Optimal coordinated scheduling of combined heat and power fuel cell, wind, and photovoltaic units in micro grids considering uncertainties

    International Nuclear Information System (INIS)

    Bornapour, Mosayeb; Hooshmand, Rahmat-Allah; Khodabakhshian, Amin; Parastegari, Moein

    2016-01-01

    In this paper, a stochastic model is proposed for the coordinated scheduling of combined heat and power units in a micro grid considering wind turbine and photovoltaic units. Uncertainties in the electricity market price, wind speed, and solar radiation are considered using a scenario-based method. In this method, scenarios are generated using a roulette wheel mechanism based on the probability distribution functions of the input random variables. In this way, the probabilistic nature of the problem is captured and the problem is converted to a deterministic one. The form of the objective function and the coordinated scheduling of combined heat and power, wind turbine, and photovoltaic units make this a mixed-integer nonlinear problem; therefore, a modified particle swarm optimization algorithm is employed to solve it. The mentioned uncertainties lead to an increase in profit. Moreover, the optimal coordinated scheduling of renewable energy resources and thermal units in micro grids increases the total profit. In order to evaluate the performance of the proposed method, it is executed on a modified 33-bus distribution system serving as a micro grid. - Highlights: • A stochastic model is proposed for coordinated scheduling of renewable energy sources. • The effect of combined heat and power is considered. • Maximizing the profit of the micro grid is considered as the objective function. • Considering the uncertainties of the problem leads to increased profit. • Optimal scheduling of renewable energy sources and thermal units increases profit.
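The roulette-wheel scenario-generation step can be sketched as follows; the bin weights are hypothetical stand-ins for the discretized probability distribution functions of price, wind speed, and solar radiation.

```python
import random

def roulette(weights, rng):
    """Pick one bin index with probability proportional to its weight."""
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1   # guard against floating-point round-off

def generate_scenarios(price_pdf, wind_pdf, solar_pdf, n, seed=0):
    """Each scenario draws one bin from each uncertain quantity's
    discretized PDF via the roulette-wheel mechanism."""
    rng = random.Random(seed)
    return [(roulette(price_pdf, rng), roulette(wind_pdf, rng),
             roulette(solar_pdf, rng)) for _ in range(n)]

# Hypothetical bin weights for the three uncertain quantities.
scenarios = generate_scenarios([1, 2, 4, 2, 1], [3, 1], [1, 1, 2], 500)
```

Each sampled triple would then be treated as one deterministic scheduling problem, with scenario reduction typically applied before optimization.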

  13. Automated magnetic divertor design for optimal power exhaust

    Energy Technology Data Exchange (ETDEWEB)

    Blommaert, Maarten

    2017-07-01

    The so-called divertor is the standard particle and power exhaust system of nuclear fusion tokamaks. In essence, the magnetic configuration hereby 'diverts' the plasma to a specific divertor structure. The design of this divertor is still a key issue to be resolved to evolve from experimental fusion tokamaks to commercial power plants. The focus of this dissertation is on one particular design requirement: avoiding excessive heat loads on the divertor structure. The divertor design process is assisted by plasma edge transport codes that simulate the plasma and neutral particle transport in the edge of the reactor. These codes are computationally extremely demanding, not least due to the complex collisional processes between plasma and neutrals that lead to strong radiation sinks and macroscopic heat convection near the vessel walls. One way of improving the heat exhaust is by modifying the magnetic confinement that governs the plasma flow. In this dissertation, automated design of the magnetic configuration is pursued using adjoint based optimization methods. A simple and fast perturbation model is used to compute the magnetic field in the vacuum vessel. A stable optimal design method of the nested type is then elaborated that strictly accounts for several nonlinear design constraints and code limitations. Using appropriate cost function definitions, the heat is spread more uniformly over the high-heat load plasma-facing components in a practical design example. Furthermore, practical in-parts adjoint sensitivity calculations are presented that provide a way to an efficient optimization procedure. Results are elaborated for a fictitious JET (Joint European Torus) case. The heat load is strongly reduced by exploiting an expansion of the magnetic flux towards the solid divertor structure. Subsequently, shortcomings of the perturbation model for magnetic field calculations are discussed in comparison to a free boundary equilibrium (FBE) simulation

  14. Automated magnetic divertor design for optimal power exhaust

    International Nuclear Information System (INIS)

    Blommaert, Maarten

    2017-01-01

    The so-called divertor is the standard particle and power exhaust system of nuclear fusion tokamaks. In essence, the magnetic configuration hereby 'diverts' the plasma to a specific divertor structure. The design of this divertor is still a key issue to be resolved to evolve from experimental fusion tokamaks to commercial power plants. The focus of this dissertation is on one particular design requirement: avoiding excessive heat loads on the divertor structure. The divertor design process is assisted by plasma edge transport codes that simulate the plasma and neutral particle transport in the edge of the reactor. These codes are computationally extremely demanding, not in the least due to the complex collisional processes between plasma and neutrals that lead to strong radiation sinks and macroscopic heat convection near the vessel walls. One way of improving the heat exhaust is by modifying the magnetic confinement that governs the plasma flow. In this dissertation, automated design of the magnetic configuration is pursued using adjoint based optimization methods. A simple and fast perturbation model is used to compute the magnetic field in the vacuum vessel. A stable optimal design method of the nested type is then elaborated that strictly accounts for several nonlinear design constraints and code limitations. Using appropriate cost function definitions, the heat is spread more uniformly over the high-heat load plasma-facing components in a practical design example. Furthermore, practical in-parts adjoint sensitivity calculations are presented that provide a way to an efficient optimization procedure. Results are elaborated for a fictituous JET (Joint European Torus) case. The heat load is strongly reduced by exploiting an expansion of the magnetic flux towards the solid divertor structure. Subsequently, shortcomings of the perturbation model for magnetic field calculations are discussed in comparison to a free boundary equilibrium (FBE) simulation. 
These flaws
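The adjoint-based sensitivities central to this dissertation can be illustrated on a tiny linear model: for a parameterized system A(p)u = b and a cost J = cᵀu, a single adjoint solve Aᵀλ = c gives the sensitivity dJ/dp = -λᵀ(∂A/∂p)u at a cost independent of the number of design parameters. A minimal numpy sketch, where the matrix, load and cost vector are hypothetical stand-ins for the plasma edge model:

```python
import numpy as np

def system(p):
    # hypothetical parameterized model A(p) u = b standing in for the field solver
    A = np.array([[2.0 + p, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    return A, np.linalg.solve(A, b)

c = np.array([1.0, 1.0])  # cost functional J(p) = c^T u(p)

def cost(p):
    _, u = system(p)
    return c @ u

def adjoint_gradient(p):
    A, u = system(p)
    lam = np.linalg.solve(A.T, c)                # one adjoint solve: A^T lam = c
    dA_dp = np.array([[1.0, 0.0], [0.0, 0.0]])   # only A[0,0] depends on p here
    return -lam @ dA_dp @ u                      # dJ/dp = -lam^T (dA/dp) u

p0, h = 0.5, 1e-6
fd = (cost(p0 + h) - cost(p0 - h)) / (2.0 * h)   # finite-difference check
print(abs(adjoint_gradient(p0) - fd) < 1e-6)
```

The finite-difference check needs one extra forward solve per parameter, while the adjoint gradient needs one extra solve in total, which is the reason the approach scales to many design variables.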

  15. Microseismic event location using global optimization algorithms: An integrated and automated workflow

    Science.gov (United States)

    Lagos, Soledad R.; Velis, Danilo R.

    2018-02-01

    We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the steps leading from raw 3C data to the microseismic event locations. First, we carry out the automatic detection, denoising and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for usual 2D and 3D scenarios of hydraulic fracturing processes. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
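The event-location step can be sketched with a plain simulated annealing search over candidate source coordinates that minimizes squared travel-time residuals; VFSA differs mainly in its temperature-dependent proposal distribution. The geometry, velocity and travel-time picks below are made up for illustration:

```python
import math
import random

random.seed(0)
v = 3.0  # assumed homogeneous velocity (km/s); real workflows use velocity models
receivers = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0), (2.0, 5.0)]
true_src = (1.3, 2.7)
t_obs = [math.dist(r, true_src) / v for r in receivers]  # noise-free picks

def misfit(x, y):
    # sum of squared travel-time residuals for a trial source position
    return sum((math.dist(r, (x, y)) / v - t) ** 2
               for r, t in zip(receivers, t_obs))

def anneal(n_iter=20000, T0=1.0, step=0.3):
    x, y = 2.0, 2.0                    # initial guess inside the array
    e = misfit(x, y)
    best, best_e = (x, y), e
    for k in range(n_iter):
        T = T0 * 0.999 ** k            # geometric cooling schedule
        nx, ny = x + random.gauss(0, step), y + random.gauss(0, step)
        ne = misfit(nx, ny)
        # accept improvements always, uphill moves with Boltzmann probability
        if ne < e or random.random() < math.exp(-(ne - e) / max(T, 1e-12)):
            x, y, e = nx, ny, ne
            if e < best_e:
                best, best_e = (x, y), e
    return best

est = anneal()
print(est)
```

Restricting the search region with the backazimuth estimates, as the paper does, shrinks the area this sampler has to explore and is what makes the global methods competitive with grid search.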

  16. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    Science.gov (United States)

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

    The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H has been transformed for the expression of potential malaria vaccines. This approach has allowed a doubling of intact protein secretion productivity due to the DoE optimization procedure compared with initial cultivation results. In a next step, robustness regarding the sensitivity to process parameter variability has been proven around the determined optimum. Thereby, a significantly improved pharmaceutical production process was established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands pointed out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Improved decomposition–coordination and discrete differential dynamic programming for optimization of large-scale hydropower system

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Ouyang, Shuo; Ding, Xiaoling; Chen, Lu

    2014-01-01

    Highlights: • Optimization of large-scale hydropower system in the Yangtze River basin. • Improved decomposition–coordination and discrete differential dynamic programming. • Generating initial solution randomly to reduce generation time. • Proposing relative coefficient for more power generation. • Proposing adaptive bias corridor technology to enhance convergence speed. - Abstract: With the construction of major hydro plants, more and more large-scale hydropower systems are gradually taking shape, which poses the challenge of optimizing these systems. Optimization of large-scale hydropower system (OLHS), which is to determine the water discharges or water levels of all hydro plants that maximize total power generation subject to numerous constraints, is a high-dimensional, nonlinear and strongly coupled problem. In order to solve the OLHS problem effectively, an improved decomposition–coordination and discrete differential dynamic programming (IDC–DDDP) method is proposed in this paper. A strategy of generating the initial solution randomly is adopted to reduce generation time. Meanwhile, a relative coefficient based on maximum output capacity is proposed for more power generation. Moreover, an adaptive bias corridor technology is proposed to enhance convergence speed. The proposed method is applied to long-term optimal dispatch of a large-scale hydropower system (LHS) in the Yangtze River basin. Compared to other methods, IDC–DDDP has competitive performance in both total power generation and convergence speed, which provides a new method to solve the OLHS problem.
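The dynamic-programming core of such hydropower dispatch can be sketched for a single toy reservoir: discretize storage, step backward through the stages, and pick the release that maximizes immediate generation plus the value of the resulting storage. The inflows, capacities and head model below are hypothetical, and the paper's IDC-DDDP additionally decomposes a multi-reservoir system:

```python
# Toy single-reservoir hydropower dispatch by backward dynamic programming
# over discretized storage states (hypothetical data, integer discretization).
T, S_MAX, R_MAX = 4, 6, 4     # stages, storage capacity, max release per stage
inflow = [3, 2, 4, 1]

def power(release, storage):
    return release * (1.0 + 0.1 * storage)   # generation grows with head

# value[t][s]: best total generation achievable from stage t with storage s
value = [[0.0] * (S_MAX + 1) for _ in range(T + 1)]
policy = [[0] * (S_MAX + 1) for _ in range(T)]
for t in range(T - 1, -1, -1):
    for s in range(S_MAX + 1):
        best, best_r = float("-inf"), 0
        for r in range(min(R_MAX, s + inflow[t]) + 1):
            s_next = min(s + inflow[t] - r, S_MAX)   # spill above capacity
            val = power(r, s) + value[t + 1][s_next]
            if val > best:
                best, best_r = val, r
        value[t][s], policy[t][s] = best, best_r

print(value[0][2])   # optimal total generation starting from storage 2
```

The discrete differential refinement in DDDP then re-runs this recursion only inside a narrow corridor around the current best trajectory, which is what keeps the state space tractable for real systems.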

  18. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi, E-mail: orimoto.yuuichi.888@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Aoki, Yuriko [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
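The finite-field extraction of the (hyper-)polarizabilities described above amounts to numerical differentiation of the energy with respect to the applied field. A sketch using a model energy with known coefficients in place of the ab initio ELG-FF energies (all values hypothetical); the low-order central differences below also show why γ, a fourth derivative, is numerically the most fragile quantity:

```python
# Model field-dependent energy with known (hyper-)polarizabilities:
#   E(F) = -mu0*F - (1/2)*alpha*F**2 - (1/6)*beta*F**3
mu0, alpha, beta = 1.0, 10.0, 100.0

def E(F):
    return -mu0 * F - 0.5 * alpha * F**2 - beta * F**3 / 6.0

h = 1e-2  # field step; too small a step amplifies round-off in the differences
# central-difference estimates of the derivatives of E at F = 0
alpha_ff = -(E(h) - 2.0 * E(0.0) + E(-h)) / h**2
beta_ff = -(E(2 * h) - 2.0 * E(h) + 2.0 * E(-h) - E(-2 * h)) / (2.0 * h**3)
print(round(alpha_ff, 6), round(beta_ff, 6))  # recovers 10.0 and 100.0
```

In the actual method the energies come from the O(N) elongation calculation at each field value, so the quality of the differentiated properties hinges directly on the self-consistent field convergence the abstract emphasizes.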

  19. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    International Nuclear Information System (INIS)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-01-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.

  20. Ontology-based composition and matching for dynamic service coordination

    OpenAIRE

    Pahl, Claus; Gacitua-Decar, Veronica; Wang, MingXue; Yapa Bandara, Kosala

    2011-01-01

    Service engineering needs to address integration problems allowing services to collaborate and coordinate. The need to address dynamic automated changes - caused by on-demand environments and changing requirements - can be addressed through service coordination based on ontology-based composition and matching techniques. Our solution to composition and matching utilises a service coordination space that acts as a passive infrastructure for collaboration. We discuss the information models an...

  1. Development of an Integrated Approach to Routine Automation of Neutron Activation Analysis. Results of a Coordinated Research Project. Companion CD-ROM. Annex II: Country Reports

    International Nuclear Information System (INIS)

    2018-04-01

    Neutron activation analysis (NAA) is a powerful technique for determining bulk composition of major and trace elements. Automation may contribute significantly to keep NAA competitive for end-users. It provides opportunities for a larger analytical capacity and a shorter overall turnaround time if large series of samples have to be analysed. This publication documents and disseminates the expertise generated on automation in NAA during a coordinated research project (CRP). The CRP participants presented different cost-effective designs of sample changers for gamma-ray spectrometry as well as irradiation devices, and were able to construct and successfully test these systems. They also implemented, expanded and improved quality control and quality assurance as cross-cutting topical area of their automated NAA procedures. The publication serves as a reference of interest to NAA practitioners, experts, and research reactor personnel, but also to various stakeholders and users interested in basic research and/or services provided by NAA. This CD-ROM contains the individual country reports

  2. Automated curved planar reformation of 3D spine images

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2005-01-01

    Traditional techniques for visualizing anatomical structures are based on planar cross-sections from volume images, such as images obtained by computed tomography (CT) or magnetic resonance imaging (MRI). However, planar cross-sections taken in the coordinate system of the 3D image often do not provide sufficient or qualitative enough diagnostic information, because planar cross-sections cannot follow curved anatomical structures (e.g. arteries, colon, spine, etc). Therefore, not all of the important details can be shown simultaneously in any planar cross-section. To overcome this problem, reformatted images in the coordinate system of the inspected structure must be created. This operation is usually referred to as curved planar reformation (CPR). In this paper we propose an automated method for CPR of 3D spine images, which is based on the image transformation from the standard image-based to a novel spine-based coordinate system. The axes of the proposed spine-based coordinate system are determined on the curve that represents the vertebral column, and the rotation of the vertebrae around the spine curve, both of which are described by polynomial models. The optimal polynomial parameters are obtained in an image analysis based optimization framework. The proposed method was qualitatively and quantitatively evaluated on five CT spine images. The method performed well on both normal and pathological cases and was consistent with manually obtained ground truth data. The proposed spine-based CPR benefits from reduced structural complexity in favour of improved feature perception of the spine. The reformatted images are diagnostically valuable and enable easier navigation, manipulation and orientation in 3D space. Moreover, reformatted images may prove useful for segmentation and other image analysis tasks
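The core modelling idea above, describing the vertebral column by low-order polynomial curves whose parameters are optimized against the image, can be sketched on synthetic centerline data (all coordinates and noise levels hypothetical):

```python
import numpy as np

# Synthetic vertebral-column centerline: (x, y) position for each axial slice z.
z = np.linspace(0.0, 100.0, 21)
x_true = 2.0 + 0.03 * z - 0.0004 * z**2
y_true = 5.0 + 0.05 * z
rng = np.random.default_rng(1)
x_obs = x_true + rng.normal(0.0, 0.05, z.size)   # noisy detections
y_obs = y_true + rng.normal(0.0, 0.05, z.size)

# Fit low-order polynomial models of the spine curve, mirroring the paper's
# polynomial description of the curve (vertebral rotation is modelled likewise).
px = np.polyfit(z, x_obs, 2)
py = np.polyfit(z, y_obs, 1)

# Curved planar reformation would then resample the volume along this curve;
# here we only check that the model recovers the underlying curve despite noise.
err = float(np.max(np.abs(np.polyval(px, z) - x_true)))
print(err)
```

In the method itself the polynomial coefficients are not fitted to pre-detected points but driven by an image-analysis-based objective, yet the parameterization being optimized is the same.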

  3. Optimization of axial enrichment distribution for BWR fuels using scoping libraries and block coordinate descent method

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Hsiung, E-mail: wstong@iner.gov.tw; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2017-03-15

    Highlights: • An optimization method for axial enrichment distribution in a BWR fuel was developed. • Block coordinate descent method is employed to search for optimal solution. • Scoping libraries are used to reduce computational effort. • Optimization search space consists of enrichment difference parameters. • Capability of the method to find optimal solution is demonstrated. - Abstract: An optimization method has been developed to search for the optimal axial enrichment distribution in a fuel assembly for a boiling water reactor core. The optimization method features: (1) employing the block coordinate descent method to find the optimal solution in the space of enrichment difference parameters, (2) using scoping libraries to reduce the amount of CASMO-4 calculations, and (3) integrating a core critical constraint into the objective function that is used to quantify the quality of an axial enrichment design. The objective function consists of the weighted sum of core parameters such as shutdown margin and critical power ratio. The core parameters are evaluated by using SIMULATE-3, and the cross section data required for the SIMULATE-3 calculation are generated by using CASMO-4 and scoping libraries. The application of the method to a 4-segment fuel design (with the highest allowable segment enrichment relaxed to 5%) demonstrated that the method can obtain an axial enrichment design with improved thermal limit ratios and objective function value while satisfying the core design constraints and core critical requirement through the use of an objective function. The use of scoping libraries effectively reduced the number of CASMO-4 calculations from 85 to 24 in the 4-segment optimization case. An exhaustive search was performed to examine the capability of the method in finding the optimal solution for a 4-segment fuel design. The results show that the method found a solution very close to the optimum obtained by the exhaustive search. The number of
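Block coordinate descent, the search engine named in the highlights, cycles through blocks of variables and minimizes the objective exactly over one block while the others are held fixed. A minimal sketch on a smooth coupled objective (in the paper the blocks are enrichment-difference parameters and the objective is evaluated with SIMULATE-3):

```python
# Minimal block coordinate descent: alternate exact minimization of a smooth
# coupled objective in each coordinate block.
def f(x, y):
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + 0.5 * x * y

x, y = 0.0, 0.0
for _ in range(50):
    x = 1.0 - 0.25 * y    # argmin over x: solve 2(x - 1) + 0.5*y = 0
    y = -2.0 - 0.25 * x   # argmin over y: solve 2(y + 2) + 0.5*x = 0
print(round(x, 4), round(y, 4))  # converges to the stationary point (1.6, -2.4)
```

Each sweep is a contraction here, so the iterates converge quickly; for nonconvex objectives like the enrichment design problem the method only guarantees a coordinatewise optimum, which is why the paper validates it against an exhaustive search.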

  4. Automated Robust Maneuver Design and Optimization

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is seeking improvements to the current technologies related to Position, Navigation and Timing. In particular, it is desired to automate precise maneuver...

  5. Many-Objective Particle Swarm Optimization Using Two-Stage Strategy and Parallel Cell Coordinate System.

    Science.gov (United States)

    Hu, Wang; Yen, Gary G; Luo, Guangchun

    2017-06-01

    It is a daunting challenge to balance the convergence and diversity of an approximate Pareto front in a many-objective optimization evolutionary algorithm. A novel algorithm, named many-objective particle swarm optimization with the two-stage strategy and parallel cell coordinate system (PCCS), is proposed in this paper to improve the comprehensive performance in terms of the convergence and diversity. In the proposed two-stage strategy, the convergence and diversity are separately emphasized at different stages by a single-objective optimizer and a many-objective optimizer, respectively. A PCCS is exploited to manage the diversity, such as maintaining a diverse archive, identifying the dominance resistant solutions, and selecting the diversified solutions. In addition, a leader group is used for selecting the global best solutions to balance the exploitation and exploration of a population. The experimental results illustrate that the proposed algorithm outperforms six chosen state-of-the-art designs in terms of the inverted generational distance and hypervolume over the DTLZ test suite.
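One of the two performance measures used above, the inverted generational distance (IGD), is simple to state: average, over a reference Pareto front, the distance to the nearest point of the approximation set, so it penalizes both poor convergence and poor spread. A small sketch with a toy linear front (the reference points are hypothetical):

```python
import math

def igd(reference, approximation):
    """Inverted generational distance: mean distance from each reference
    point to its nearest approximation point (lower is better)."""
    return sum(
        min(math.dist(r, a) for a in approximation) for r in reference
    ) / len(reference)

# Reference samples of a toy linear Pareto front f1 + f2 = 1.
ref = [(i / 10.0, 1.0 - i / 10.0) for i in range(11)]
good = ref                          # covers the front exactly -> IGD = 0
sparse = [(0.0, 1.0), (1.0, 0.0)]   # converged but poorly spread
print(igd(ref, good), round(igd(ref, sparse), 4))
```

The sparse set scores worse even though every one of its points lies on the true front, which is exactly the diversity failure the two-stage strategy and the parallel cell coordinate system are designed to avoid.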

  6. Optimal Ordering Policy and Coordination Mechanism of a Supply Chain with Controllable Lead-Time-Dependent Demand Forecast

    Directory of Open Access Journals (Sweden)

    Hua-Ming Song

    2011-01-01

    Full Text Available This paper investigates the ordering decisions and coordination mechanism for a distributed short-life-cycle supply chain. The objective is to maximize the whole supply chain's expected profit and meanwhile make the supply chain participants achieve a Pareto improvement. We treat lead time as a controllable variable, so the demand forecast depends on lead time: the shorter the lead time, the better the forecast. Moreover, optimal decision-making models for lead time and order quantity are formulated and compared in the decentralized and centralized cases. Besides, a three-parameter contract is proposed to coordinate the supply chain and alleviate double marginalization in the decentralized scenario. In addition, based on the analysis of the models, we develop an algorithmic procedure to find the optimal ordering decisions. Finally, a numerical example is presented to illustrate the results.
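For normally distributed demand, the single-period (newsvendor) order quantity underlying such short-life-cycle models has the closed form Q* = μ + σ·Φ⁻¹(cu/(cu+co)). A sketch of how a lead-time-dependent forecast error can feed into the order decision; the linear σ(L) model and all prices are hypothetical illustrations, not values from the paper:

```python
from statistics import NormalDist

def optimal_order(mu, sigma, price, cost, salvage):
    """Newsvendor quantity Q* = F^(-1)(cu / (cu + co)) for normal demand."""
    cu = price - cost       # underage cost: margin lost per unmet demand unit
    co = cost - salvage     # overage cost per leftover unit
    fractile = cu / (cu + co)
    return mu + sigma * NormalDist().inv_cdf(fractile)

def sigma_of_lead_time(L, sigma0=40.0, k=0.3):
    # hypothetical linear model of the qualitative claim that a shorter lead
    # time yields a better (lower-variance) demand forecast
    return sigma0 * (1.0 + k * L)

mu, price, cost, salvage = 100.0, 12.0, 6.0, 3.0
for L in (0, 2, 4):
    q = optimal_order(mu, sigma_of_lead_time(L), price, cost, salvage)
    print(L, round(q, 1))
```

With a critical fractile above one half, a longer lead time (larger σ) forces a larger safety stock, which is the tension the paper's joint lead-time/order-quantity optimization trades off against the cost of compressing lead time.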

  7. A Velocity-Level Bi-Criteria Optimization Scheme for Coordinated Path Tracking of Dual Robot Manipulators Using Recurrent Neural Network.

    Science.gov (United States)

    Xiao, Lin; Zhang, Yongsheng; Liao, Bolin; Zhang, Zhijun; Ding, Lei; Jin, Long

    2017-01-01

    A dual-robot system is a robotic device composed of two robot arms. To eliminate the joint-angle drift and prevent the occurrence of high joint velocity, a velocity-level bi-criteria optimization scheme, which includes two criteria (i.e., the minimum velocity norm and the repetitive motion), is proposed and investigated for coordinated path tracking of dual robot manipulators. Specifically, to realize the coordinated path tracking of dual robot manipulators, two subschemes are first presented for the left and right robot manipulators. These two subschemes are then reformulated as two general quadratic programs (QPs), which can be combined into one unified QP. A recurrent neural network (RNN) is thus presented to solve the unified QP problem effectively. At last, computer simulation results based on a dual three-link planar manipulator further validate the feasibility and the efficacy of the velocity-level optimization scheme for coordinated path tracking using the recurrent neural network.

  8. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
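At toy scale, the item-selection core of automated test assembly can be shown by brute-force enumeration in place of a mixed integer programming solver: maximize total information at the target ability subject to a form length and a content constraint. The item bank below is hypothetical:

```python
from itertools import combinations

# Hypothetical item bank: (information at the target ability, content area).
bank = [
    (0.9, "algebra"), (0.7, "algebra"), (0.8, "geometry"),
    (0.6, "geometry"), (0.5, "algebra"), (0.4, "geometry"),
]

def assemble(n_items=3, min_geometry=1):
    """Toy stand-in for the mixed integer program: choose n_items maximizing
    total information subject to a minimum content-area count."""
    best, best_info = None, -1.0
    for subset in combinations(range(len(bank)), n_items):
        if sum(1 for i in subset if bank[i][1] == "geometry") < min_geometry:
            continue  # violates the content constraint
        info = sum(bank[i][0] for i in subset)
        if info > best_info:
            best, best_info = subset, info
    return best, best_info

form, info = assemble()
print(form, round(info, 2))  # -> (0, 1, 2) 2.4
```

Real banks are far too large for enumeration, which is why ATA formulates the same selection as a mixed integer program; the formatting extension described above simply adds ordering and layout variables to that program.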

  9. Coordinating complex decision support activities across distributed applications

    Science.gov (United States)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  10. Damping Improvement of Multiple Damping Controllers by Using Optimal Coordinated Design Based on PSS and FACTS-POD in a Multi-Machine Power System

    Directory of Open Access Journals (Sweden)

    Ali Nasser Hussain

    2016-09-01

    Full Text Available The aim of this study is to present a comprehensive comparison and assessment of the improvement in damping of power system oscillations achieved by multiple damping controllers using simultaneously coordinated design based on Power System Stabilizer (PSS) and Flexible AC Transmission System (FACTS) devices. FACTS devices can help enhance the stability of the power system by adding a supplementary damping controller to the control channel of the FACTS input, implementing a Power Oscillation Damping (FACTS POD) controller. Simultaneous coordination can be performed in different ways. First, dual coordinated designs between PSS and a FACTS POD controller, or between different FACTS POD controllers, are arranged in multiple FACTS devices without PSS. Second, the simultaneous coordination is extended to a triple coordinated design among PSS and different FACTS POD controllers. The parameters of the damping controllers have been tuned in the individual controllers and coordinated designs by using a Chaotic Particle Swarm Optimization (CPSO) algorithm that optimizes a given eigenvalue-based objective function. The simulation results for a multi-machine power system show that the dual coordinated design provides satisfactory damping performance over the individual control responses. Furthermore, the triple coordinated design is shown to be more effective in damping oscillations than the dual damping controllers.

  11. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  12. Modular high power diode lasers with flexible 3D multiplexing arrangement optimized for automated manufacturing

    Science.gov (United States)

    Könning, Tobias; Bayer, Andreas; Plappert, Nora; Faßbender, Wilhelm; Dürsch, Sascha; Küster, Matthias; Hubrich, Ralf; Wolf, Paul; Köhler, Bernd; Biesenbach, Jens

    2018-02-01

    A novel 3-dimensional arrangement of mirrors is used to re-arrange beams from 1-D and 2-D high power diode laser arrays. The approach allows for a variety of stacking geometries, depending on individual requirements. While basic building blocks, including collimating optics, always remain the same, most adaptations can be realized by simple rearrangement of a few optical components. Due to fully automated alignment processes, the required changes can be realized in software by changing coordinates, rather than requiring customized mechanical components. This approach minimizes development costs due to its flexibility, while reducing overall product cost by using similar building blocks for a variety of products and utilizing a high grade of automation. The modules can be operated with industrial grade water, lowering overall system and maintenance cost. Stackable macro coolers are used as the smallest building block of the system. Each cooler can hold up to five diode laser bars. Micro optical components, collimating the beam, are mounted directly to the cooler. All optical assembly steps are fully automated. Initially, the beams from all laser bars propagate in the same direction. Key to the concept is an arrangement of deflectors, which re-arrange the beams into a 2-D array of the desired shape and high fill factor. Standard multiplexing techniques like polarization- or wavelengths-multiplexing have been implemented as well. A variety of fiber coupled modules ranging from a few hundred watts of optical output power to multiple kilowatts of power, as well as customized laser spot geometries like uniform line sources, have been realized.

  13. Design automation, languages, and simulations

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    As the complexity of electronic systems continues to increase, the micro-electronic industry depends upon automation and simulations to adapt quickly to market changes and new technologies. Compiled from chapters contributed to CRC's best-selling VLSI Handbook, this volume covers a broad range of topics relevant to design automation, languages, and simulations. These include a collaborative framework that coordinates distributed design activities through the Internet, an overview of the Verilog hardware description language and its use in a design environment, hardware/software co-design, syst

  14. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  15. Application-Oriented Optimal Shift Schedule Extraction for a Dual-Motor Electric Bus with Automated Manual Transmission

    Directory of Open Access Journals (Sweden)

    Mingjie Zhao

    2018-02-01

    Full Text Available Conventional battery electric buses (BEBs) have limited potential to optimize energy consumption and achieve better dynamic performance. A practical dual-motor propulsion system equipped with a 4-speed Automated Manual Transmission (AMT) is proposed, which can eliminate the traction interruption of a conventional AMT. A discrete model of the dual-motor-AMT electric bus (DMAEB) is built and used to optimize the gear shift schedule. A dynamic programming (DP) algorithm is applied to find the optimal results, where the efficiency and shift time of each gear are considered to handle the application problem of global optimization. A rational penalty factor and a proper shift time delay based on bench test results are set to reduce the shift frequency by 82.5% in the Chinese-World Transient Vehicle Cycle (C-WTVC). Two perspectives on applicable shift-rule extraction, i.e., a classification method based on optimal operating points and a clustering method based on optimal shifting points, are explored and compared. Eventually, the hardware-in-the-loop (HIL) simulation results demonstrate that the proposed structure and extracted shift schedule achieve a significant improvement, reducing energy loss by 20.13% compared to traditional empirical strategies.
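The DP gear-schedule optimization can be sketched as a Viterbi-style pass over gears: at each cycle step, the cumulative cost of ending in each gear is the best predecessor cost plus a shift penalty plus the running loss in that gear. The speed trace, loss model and penalty below are hypothetical, unlike the paper's bench-tested efficiency and shift-time data:

```python
# Toy optimal gear schedule by dynamic programming over a short drive cycle.
speeds = [10, 15, 25, 40, 55, 60, 45, 30]    # vehicle speed at each step
gears = [1, 2, 3, 4]
SHIFT_PENALTY = 0.5                          # discourages frequent shifting

def loss(gear, v):
    preferred = {1: 10, 2: 25, 3: 45, 4: 60}  # hypothetical sweet spots per gear
    return 0.1 * abs(v - preferred[gear])

# cost[g]: best cumulative loss ending the current step in gear g
cost = {g: loss(g, speeds[0]) for g in gears}
back = []
for v in speeds[1:]:
    new_cost, choice = {}, {}
    for g in gears:
        prev = min(gears,
                   key=lambda p: cost[p] + (SHIFT_PENALTY if p != g else 0.0))
        choice[g] = prev
        new_cost[g] = cost[prev] + (SHIFT_PENALTY if prev != g else 0.0) + loss(g, v)
    back.append(choice)
    cost = new_cost

g = min(gears, key=cost.get)       # backtrack the optimal gear sequence
schedule = [g]
for choice in reversed(back):
    g = choice[g]
    schedule.append(g)
schedule.reverse()
print(schedule)
```

The offline DP schedule is not directly drivable online, which is why the paper then extracts causal shift rules (classification or clustering of the optimal points) from trajectories like this one.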

  16. Economic Load Dispatch - A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, KY

    2009-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. Moreover, the population size is increased adaptively, the number of search intervals for the particles is selected adaptively...
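For the convex quadratic-cost case, the ELD benchmark that heuristics like ICA-PSO are compared against has a classical solution: all units operate at equal incremental cost λ, clipped to their limits, and λ is found by a one-dimensional search on the power balance. A sketch with hypothetical unit data:

```python
# Equal-incremental-cost economic load dispatch via bisection on lambda.
units = [  # (a, b, c, Pmin, Pmax) with cost a + b*P + c*P^2
    (100.0, 2.0, 0.010, 50.0, 300.0),
    (120.0, 1.8, 0.015, 40.0, 250.0),
    (80.0,  2.2, 0.020, 30.0, 200.0),
]
demand = 500.0

def output_at(lam):
    # each unit runs where marginal cost b + 2cP = lambda, clipped to its limits
    P = [min(max((lam - b) / (2.0 * c), pmin), pmax)
         for a, b, c, pmin, pmax in units]
    return sum(P), P

lo, hi = 0.0, 50.0
for _ in range(100):               # bisection on the system lambda
    mid = 0.5 * (lo + hi)
    total, _ = output_at(mid)
    if total < demand:
        lo = mid
    else:
        hi = mid
total, P = output_at(0.5 * (lo + hi))
print(round(total, 3), [round(p, 1) for p in P])  # -> 500.0 [230.0, 160.0, 110.0]
```

Heuristics such as ICA-PSO earn their keep only when this convexity breaks down, e.g. with valve-point effects, prohibited operating zones or non-smooth fuel-cost curves.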

  17. TH-AB-BRA-02: Automated Triplet Beam Orientation Optimization for MRI-Guided Co-60 Radiotherapy

    International Nuclear Information System (INIS)

    Nguyen, D; Thomas, D; Cao, M; O’Connor, D; Lamb, J; Sheng, K

    2016-01-01

    Purpose: MRI guided Co-60 provides daily and intrafractional MRI soft tissue imaging for improved target tracking and adaptive radiotherapy. To remedy the low output limitation, the system uses three Co-60 sources at 120° apart, but using all three sources in planning is considerably unintuitive. We automate the beam orientation optimization using column generation, and then solve a novel fluence map optimization (FMO) problem while regularizing the number of MLC segments. Methods: Three patients—1 prostate (PRT), 1 lung (LNG), and 1 head-and-neck boost plan (H&NBoost)—were evaluated. The beamlet dose for 180 equally spaced coplanar beams under 0.35 T magnetic field was calculated using Monte Carlo. The 60 triplets were selected utilizing the column generation algorithm. The FMO problem was formulated using an L2-norm minimization with anisotropic total variation (TV) regularization term, which allows for control over the number of MLC segments. Our Fluence Regularized and Optimized Selection of Triplets (FROST) plans were compared against the clinical treatment plans (CLN) produced by an experienced dosimetrist. Results: The mean PTV D95, D98, and D99 differ by −0.02%, +0.12%, and +0.44% of the prescription dose between planning methods, showing same PTV dose coverage. The mean PTV homogeneity (D95/D5) was at 0.9360 (FROST) and 0.9356 (CLN). R50 decreased by 0.07 with FROST. On average, FROST reduced Dmax and Dmean of OARs by 6.56% and 5.86% of the prescription dose. The manual CLN planning required iterative trial and error runs which is very time consuming, while FROST required minimal human intervention. Conclusions: MRI guided Co-60 therapy needs the output of all sources yet suffers from unintuitive and laborious manual beam selection processes. Automated triplet orientation optimization is shown essential to overcome the difficulty and improves the dosimetry. A novel FMO with regularization provides additional controls over the number of MLC segments
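The fluence map optimization in this record combines an L2 data-fidelity term with an anisotropic total-variation regularizer to limit the number of MLC segment levels. Below is a hedged one-dimensional toy, not the paper's FROST/column-generation solver: it applies projected subgradient descent to ||Ax − d||² + λ·TV(x), where the TV term pushes the fluence toward piecewise-constant profiles.

```python
import numpy as np

# Toy fluence optimization: subgradient descent on
#   f(x) = ||A x - d||^2 + lam * sum_i |x[i+1] - x[i]|
# with x projected onto x >= 0 (fluence is nonnegative).
def fmo_tv(A, d, lam=0.05, step=1e-3, iters=300):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = 2 * A.T @ (A @ x - d)          # gradient of the L2 data term
        diff = np.sign(np.diff(x))            # subgradient of each |x[i+1]-x[i]|
        tv_sub = np.zeros_like(x)
        tv_sub[:-1] -= diff                   # contribution to x[i]
        tv_sub[1:] += diff                    # contribution to x[i+1]
        x = np.maximum(x - step * (grad + lam * tv_sub), 0.0)
    return x
```

Larger `lam` yields flatter, more segment-friendly fluence profiles at some cost in data fidelity, which mirrors the control over MLC segment count described above.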

  18. TH-AB-BRA-02: Automated Triplet Beam Orientation Optimization for MRI-Guided Co-60 Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, D; Thomas, D; Cao, M; O’Connor, D; Lamb, J; Sheng, K [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, CA (United States)

    2016-06-15

    Purpose: MRI guided Co-60 provides daily and intrafractional MRI soft tissue imaging for improved target tracking and adaptive radiotherapy. To remedy the low output limitation, the system uses three Co-60 sources at 120° apart, but using all three sources in planning is considerably unintuitive. We automate the beam orientation optimization using column generation, and then solve a novel fluence map optimization (FMO) problem while regularizing the number of MLC segments. Methods: Three patients—1 prostate (PRT), 1 lung (LNG), and 1 head-and-neck boost plan (H&NBoost)—were evaluated. The beamlet dose for 180 equally spaced coplanar beams under 0.35 T magnetic field was calculated using Monte Carlo. The 60 triplets were selected utilizing the column generation algorithm. The FMO problem was formulated using an L2-norm minimization with anisotropic total variation (TV) regularization term, which allows for control over the number of MLC segments. Our Fluence Regularized and Optimized Selection of Triplets (FROST) plans were compared against the clinical treatment plans (CLN) produced by an experienced dosimetrist. Results: The mean PTV D95, D98, and D99 differ by −0.02%, +0.12%, and +0.44% of the prescription dose between planning methods, showing same PTV dose coverage. The mean PTV homogeneity (D95/D5) was at 0.9360 (FROST) and 0.9356 (CLN). R50 decreased by 0.07 with FROST. On average, FROST reduced Dmax and Dmean of OARs by 6.56% and 5.86% of the prescription dose. The manual CLN planning required iterative trial and error runs which is very time consuming, while FROST required minimal human intervention. Conclusions: MRI guided Co-60 therapy needs the output of all sources yet suffers from unintuitive and laborious manual beam selection processes. Automated triplet orientation optimization is shown essential to overcome the difficulty and improves the dosimetry. A novel FMO with regularization provides additional controls over the number of MLC segments

  19. Optimal Coordinated Control of Power Extraction in LES of a Wind Farm with Entrance Effects

    Directory of Open Access Journals (Sweden)

    Jay P. Goit

    2016-01-01

    Full Text Available We investigate the use of optimal coordinated control techniques in large eddy simulations of wind farm boundary layer interaction with the aim of increasing the total energy extraction in wind farms. The individual wind turbines are considered as flow actuators, and their energy extraction is dynamically regulated in time, so as to optimally influence the flow field. We extend earlier work on wind farm optimal control in the fully-developed regime (Goit and Meyers 2015, J. Fluid Mech. 768, 5–50 to a ‘finite’ wind farm case, in which entrance effects play an important role. For the optimal control, a receding horizon framework is employed in which turbine thrust coefficients are optimized in time and per turbine. Optimization is performed with a conjugate gradient method, where gradients of the cost functional are obtained using adjoint large eddy simulations. Overall, the energy extraction is increased 7% by the optimal control. This increase in energy extraction is related to faster wake recovery throughout the farm. For the first row of turbines, the optimal control increases turbulence levels and Reynolds stresses in the wake, leading to better wake mixing and an inflow velocity for the second row that is significantly higher than in the uncontrolled case. For downstream rows, the optimal control mainly enhances the sideways mean transport of momentum. This is different from earlier observations by Goit and Meyers (2015 in the fully-developed regime, where mainly vertical transport was enhanced.

  20. Optimization of the radiological protection of patients: Image quality and dose in mammography (co-ordinated research in Europe). Results of the coordinated research project on optimization of protection mammography in some eastern European States

    International Nuclear Information System (INIS)

    2005-05-01

    Mammography is an extremely useful non-invasive imaging technique with unparalleled advantages for the detection of breast cancer. It has played an immense role in the screening of women above a certain age or with a family history of breast cancer. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionizing radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection Against Ionizing Radiation and for the Safety of Radiation Sources (BSS), issued by the IAEA and co-sponsored by FAO, ILO, WHO, PAHO and NEA, is the optimization of radiological protection of patients undergoing medical exposure. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients attempts to reduce radiation doses to patients while balancing quality assurance considerations. IAEA-TECDOC-796, Radiation Doses in Diagnostic Radiology and Methods for Dose Reduction (1995), addresses this aspect. The related IAEA-TECDOC-1423, Optimization of the Radiological Protection of Patients Undergoing Radiography, Fluoroscopy and Computed Tomography (2004), constitutes the final report of the coordinated research in Africa, Asia and eastern Europe. The preceding publications do not explicitly consider mammography. Mindful of the importance of this imaging technique, the IAEA launched a Coordinated Research Project on Optimization of Protection in Mammography in some eastern European States. The present publication is the outcome of this project: it is aimed at evaluating the situation in a number of countries, identifying variations in the technique, examining the status of the equipment and comparing performance in the light of the norms established by the European Commission.
A number of important aspects are covered, including: - quality control of mammography equipment; - imaging

  1. Lighting Automation - Flying an Earthlike Habitat Project

    Science.gov (United States)

    Falker, Jay; Howard, Ricky; Culbert, Christopher; Clark, Toni Anne; Kolomenski, Andrei

    2017-01-01

    Our proposal will enable the development of automated spacecraft habitats for long duration missions. The majority of spacecraft lighting systems employ lamps or zone-specific switches and dimmers; automation is not in the picture. If we are to build long duration environments that provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. To transform how spacecraft lighting environments are automated, we will provide performance data on a standard lighting communication protocol. We will investigate the utilization and application of an industry-accepted lighting control protocol, DMX512. We will demonstrate how lighting automation can conserve power, assist with lighting countermeasures, and utilize spatial body tracking. By using DMX512 we will prove the "wheel" does not need to be reinvented in terms of smart lighting, and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
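For context on the protocol this record proposes to evaluate: a DMX512 universe carries a start code byte (0x00 for standard dimmer data) followed by up to 512 one-byte channel levels. The sketch below builds only that data portion; the serial break and mark-after-break timing that precede it on the wire are not modeled, and no particular interface hardware is assumed.

```python
# Build the data portion of a standard DMX512 frame: start code 0x00
# followed by up to 512 channel levels, one byte each.
def dmx_frame(levels):
    if len(levels) > 512:
        raise ValueError("DMX512 carries at most 512 slots per frame")
    if any(not 0 <= v <= 255 for v in levels):
        raise ValueError("channel levels are single bytes (0-255)")
    return bytes([0x00]) + bytes(levels)
```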

  2. An objective method to optimize the MR sequence set for plaque classification in carotid vessel wall images using automated image segmentation.

    Directory of Open Access Journals (Sweden)

    Ronald van 't Klooster

    Full Text Available A typical MR imaging protocol to study the status of atherosclerosis in the carotid artery consists of the application of multiple MR sequences. Since scanner time is limited, a balance has to be reached between the duration of the applied MR protocol and the quantity and quality of the resulting images which are needed to assess the disease. In this study an objective method to optimize the MR sequence set for classification of soft plaque in vessel wall images of the carotid artery using automated image segmentation was developed. The automated method employs statistical pattern recognition techniques and was developed based on an extensive set of MR contrast weightings and corresponding manual segmentations of the vessel wall and soft plaque components, which were validated by histological sections. Evaluation of the results from nine contrast weightings showed the tradeoff between scan duration and automated image segmentation performance. For our dataset the best segmentation performance was achieved by selecting five contrast weightings. Similar performance was achieved with a set of three contrast weightings, which resulted in a reduction of scan time by more than 60%. The presented approach can help others to optimize MR imaging protocols by investigating the tradeoff between scan duration and automated image segmentation performance possibly leading to shorter scanning times and better image interpretation. This approach can potentially also be applied to other research fields focusing on different diseases and anatomical regions.

  3. Self-Defense Distributed Engagement Coordinator

    Science.gov (United States)

    2016-02-01

    Distributed Engagement Coordinator: MIT Lincoln Laboratory helped develop a unique decision support tool that automatically evaluates responses to ... Laboratory researchers collaborated with scientists from the Operations Research Center at MIT's Sloan School of Management to apply modern computational ... epidemic. A Technology Solution: MIT Lincoln Laboratory, in collaboration with the Office of Naval Research (ONR), has developed an automated

  4. Design optimization of single mixed refrigerant LNG process using a hybrid modified coordinate descent algorithm

    Science.gov (United States)

    Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong

    2018-01-01

    Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to irreversibilities that deteriorate the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm is proposed to address the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed optimization algorithm provided improved results compared to existing methodologies for finding the optimal condition of the complex mixed-refrigerant natural gas liquefaction process. By applying the proposed algorithm, the SMR process can be designed with a specific compression power of 0.2555 kW, equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed optimization algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be applied to any mixed-refrigerant-based liquefaction process in the natural gas industry.
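The coordinate-descent idea underlying the record's hybrid algorithm can be sketched in its plain form: optimize one decision variable at a time while holding the others fixed, and cycle until no single-variable move improves the objective. This is a generic sketch (grid line search on a black-box objective), not the paper's HMCD variant or its Aspen Hysys® coupling.

```python
# Plain cyclic coordinate descent: for each variable in turn, line-search
# over its bounds on a fixed grid; stop when a full cycle brings no improvement.
def coordinate_descent(f, x0, bounds, n_grid=50, tol=1e-9, max_cycles=100):
    x = list(x0)
    best = f(x)
    for _ in range(max_cycles):
        improved = False
        for i, (lo, hi) in enumerate(bounds):
            for k in range(n_grid + 1):                 # grid line search on variable i
                cand = lo + (hi - lo) * k / n_grid
                trial = x[:i] + [cand] + x[i + 1:]
                val = f(trial)
                if val < best - tol:
                    x, best, improved = trial, val, True
        if not improved:
            break
    return x, best
```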

  5. Towards automating the discovery of certain innovative design principles through a clustering-based optimization technique

    Science.gov (United States)

    Bandaru, Sunith; Deb, Kalyanmoy

    2011-09-01

    In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.
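One simple 'innovization'-style step the record alludes to is testing whether Pareto-optimal points obey a hidden power-law relationship among chosen basis functions. The sketch below fits φ = c·x₁ᵃ·x₂ᵇ by least squares in log space; it is a hedged stand-in for the paper's clustering-based procedure, with hypothetical variable names.

```python
import numpy as np

# Fit phi = c * x1**a * x2**b on (assumed) Pareto-optimal data by linear
# least squares on log(phi) = a*log(x1) + b*log(x2) + log(c).
def fit_power_law(x1, x2, phi):
    X = np.column_stack([np.log(x1), np.log(x2), np.ones(len(phi))])
    coef, *_ = np.linalg.lstsq(X, np.log(phi), rcond=None)
    a, b, logc = coef
    return a, b, np.exp(logc)
```

A near-zero residual would signal a design principle of the kind the methodology aims to extract; a large residual means no such power law holds for that pair of variables.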

  6. ASTROS: A multidisciplinary automated structural design tool

    Science.gov (United States)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  7. Dynamic Coordinated Shifting Control of Automated Mechanical Transmissions without a Clutch in a Plug-In Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Xinlei Liu

    2012-08-01

    Full Text Available On the basis of the shifting process of automated mechanical transmissions (AMTs) for traditional hybrid electric vehicles (HEVs), and by exploiting the fast response speed of electric machines, the dynamic model of the hybrid electric AMT vehicle powertrain is built, the dynamic characteristics of each phase of the shifting process are analyzed, and a control strategy in which the torque and speed of the engine and electric machine are coordinately controlled to achieve AMT shifting for a plug-in hybrid electric vehicle (PHEV) without a clutch is proposed. In the shifting process, the engine and electric machine are well controlled, and the shift jerk and the power interruption and restoration time are reduced. Simulation and real-car test results show that the proposed control strategy can efficiently improve shift quality for PHEVs equipped with AMTs.

  8. System automation for a bacterial colony detection and identification instrument via forward scattering

    International Nuclear Information System (INIS)

    Bae, Euiwon; Hirleman, E Daniel; Aroonnual, Amornrat; Bhunia, Arun K; Robinson, J Paul

    2009-01-01

    A system design and automation of a microbiological instrument that locates bacterial colonies and captures the forward-scattering signatures are presented. The proposed instrument integrates three major components: a colony locator, a forward scatterometer and a motion controller. The colony locator utilizes an off-axis light source to illuminate a Petri dish and an IEEE1394 camera to capture the diffusively scattered light to provide the number of bacterial colonies and two-dimensional coordinate information of the bacterial colonies with the help of a segmentation algorithm with region-growing. Then the Petri dish is automatically aligned with the respective centroid coordinate with a trajectory optimization method, such as the Traveling Salesman Algorithm. The forward scatterometer automatically computes the scattered laser beam from a monochromatic image sensor via quadrant intensity balancing and quantitatively determines the centeredness of the forward-scattering pattern. The final scattering signatures are stored to be analyzed to provide rapid identification and classification of the bacterial samples
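The trajectory step the record describes (visiting colony centroids efficiently with the motion stage) can be sketched with a nearest-neighbour heuristic, a simple stand-in for the Traveling Salesman optimization mentioned above; the centroid coordinates here are hypothetical.

```python
import math

# Order colony centroids for the motion stage: greedily visit the nearest
# unvisited centroid from the current position (nearest-neighbour TSP heuristic).
def visit_order(centroids, start=(0.0, 0.0)):
    remaining = list(range(len(centroids)))
    pos, order = start, []
    while remaining:
        i = min(remaining, key=lambda j: math.dist(pos, centroids[j]))
        order.append(i)
        pos = centroids[i]
        remaining.remove(i)
    return order
```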

  9. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, Vickie E.; Borreguero, Jose M. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Bhowmik, Debsindhu [Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Ganesh, Panchapakesan; Sumpter, Bobby G. [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Proffen, Thomas E. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Goswami, Monojoy, E-mail: goswamim@ornl.gov [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States)

    2017-07-01

    Highlights: • An automated workflow to optimize force-field parameters. • Used the workflow to optimize force-field parameters for a system containing nanodiamond and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments to establish a connection between the fundamental physics at the nanoscale and the data probed by neutrons. However, to perform simulations at experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D₂O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of the nanodiamond than without it. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.

  10. Design Optimization of Internal Flow Devices

    DEFF Research Database (Denmark)

    Madsen, Jens Ingemann

    The power of computational fluid dynamics is boosted through the use of automated design optimization methodologies. The thesis considers both derivative-based search optimization and the use of response surface methodologies.

  11. A novel optimal coordinated control strategy for the updated robot system for single port surgery.

    Science.gov (United States)

    Bai, Weibang; Cao, Qixin; Leng, Chuntao; Cao, Yang; Fujie, Masakatsu G; Pan, Tiewen

    2017-09-01

    Research into robotic systems for single port surgery (SPS) has become widespread around the world in recent years. A new robot arm system for SPS was developed, but its positioning platform and other hardware components were not efficient. Special features of the developed surgical robot system make safe and efficient teleoperation difficult. A robot arm is integrated and used as the new positioning platform, and remote center motion is realized by a new method using active motion control. A new mapping strategy based on kinematics computation is developed, together with a novel optimal coordinated control strategy based on real-time approach to a defined anthropopathic criterion configuration, modeled on the customary relaxed state of human arms and, in particular, on the habitual guard posture of boxers. The hardware components, control architecture, control system, and mapping strategy of the robotic system have been updated. A novel optimal coordinated control strategy is proposed and tested. The new robot system can be more dexterous, intelligent, convenient and safer for preoperative positioning and intraoperative adjustment. The mapping strategy can achieve good following and representation for the slave manipulator arms, and the proposed control strategy enables them to complete tasks with higher maneuverability, a lower possibility of self-interference, and freedom from singularity while teleoperating. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Launch Control System Software Development System Automation Testing

    Science.gov (United States)

    Hwang, Andrew

    2017-01-01

    ) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Some issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images and different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and their coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and the coordinates of the text were required for our purpose. The team made a script to parse the information we wanted from the OCR file to a different file that would be used by automation functions within the automated framework. Since a majority of development and testing for the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will help make our automated tests more robust because the tool's text recognition is highly scalable to different text fonts and sizes. Soon we will have the whole test system automated, allowing more full-time engineers to work on development projects.
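The parsing script described above can be sketched as follows. The OCR output format here (one JSON record per line, with `text` and `bbox` fields among other metadata) is an assumption for illustration, not the actual tool's format; the script keeps only the text and its coordinates, as the record describes.

```python
import json

# Parse OCR output lines, keeping only each text group's string and the
# x/y origin of its bounding box; all other metadata fields are dropped.
def extract_text_and_coords(ocr_lines):
    results = []
    for line in ocr_lines:
        rec = json.loads(line)
        results.append({"text": rec["text"],
                        "x": rec["bbox"][0],
                        "y": rec["bbox"][1]})
    return results
```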

  13. A Process Algebra for Supervisory Coordination

    Directory of Open Access Journals (Sweden)

    Jos Baeten

    2011-08-01

    Full Text Available A supervisory controller controls and coordinates the behavior of different components of a complex machine by observing their discrete behaviour. Supervisory control theory studies automated synthesis of controller models, known as supervisors, based on formal models of the machine components and a formalization of the requirements. Subsequently, code generation can be used to implement this supervisor in software, on a PLC, or embedded microprocessor. In this article, we take a closer look at the control loop that couples the supervisory controller and the machine. We model both event-based and state-based observations using process algebra and bisimulation-based semantics. The main application area of supervisory control that we consider is coordination, referred to as supervisory coordination, and we give an academic and an industrial example, discussing the process-theoretic concepts employed.

  14. The Model of Coordination of Communication Channels for Small Tourist Communities

    Directory of Open Access Journals (Sweden)

    Jelena VASKOVIĆ

    2013-12-01

    Full Text Available By including e-business, small tourist communities were allowed, apart from their classic offers, to appear on the global market, but that caused the need for automation and coordination of booking capacity tasks. Advertising and booking in these communities are performed by a conventional agency arrangement, the Internet, mobile services or by tourists themselves upon their arrival in the local community where they can reserve the accommodation. The possibility of booking accommodation capacities in many ways creates additional benefits for considerable usage of excess capacity, but as a side effect there is a problem of coordination of communication channels in order to avoid double-booking. On the other hand, the local administration has a problem with the registration and the payment of the tourist tax, particularly if the tourists do not stay long. With the automation and coordination of communication channels, conflicts can be completely avoided, and the reservation system informs all interested parties and reports to the local administration.

  15. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunity and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
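The false-positive problem the record addresses can be seen in a small simulation: with false detections present, the naive estimate "a site is occupied if anything was ever detected" overshoots the true occupancy probability ψ. This toy (hypothetical rates, not the paper's hierarchical model) only illustrates why a false-positive-aware estimator is needed.

```python
import random

# Simulate acoustic surveys at many sites: occupied sites produce true
# detections with probability p_det per survey, and every survey can also
# produce a false positive with probability p_false. Return the naive
# occupancy estimate (fraction of sites with at least one detection).
def simulate_naive_occupancy(n_sites=5000, n_surveys=5, psi=0.4,
                             p_det=0.5, p_false=0.1, seed=42):
    rng = random.Random(seed)
    naive_occupied = 0
    for _ in range(n_sites):
        occupied = rng.random() < psi
        detected = any(
            (occupied and rng.random() < p_det) or rng.random() < p_false
            for _ in range(n_surveys))
        naive_occupied += detected
    return naive_occupied / n_sites
```

With these rates the naive estimate lands around 0.64 even though true ψ is 0.4, which is the bias the record's model corrects by using the abundance of detections and a post-validated subset.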

  16. Automated design and optimization of flexible booster autopilots via linear programming, volume 1

    Science.gov (United States)

    Hauser, F. D.

    1972-01-01

    A nonlinear programming technique was developed for the automated design and optimization of autopilots for large flexible launch vehicles. This technique, which resulted in the COEBRA program, uses the iterative application of linear programming. The method deals directly with the three main requirements of booster autopilot design: to provide (1) good response to guidance commands; (2) response to external disturbances (e.g. wind) that minimizes structural bending moment loads and trajectory dispersions; and (3) stability with specified tolerances on the vehicle and flight control system parameters. The method is applicable to very high order systems (30th order and greater per flight condition). Examples are provided that demonstrate the successful application of the employed algorithm to the design of autopilots for both single and multiple flight conditions.

  17. Aviation safety and automation technology for subsonic transports

    Science.gov (United States)

    Albers, James A.

    1991-01-01

    Discussed here are aviation safety human factors and air traffic control (ATC) automation research conducted at the NASA Ames Research Center. Research results are given in the areas of flight deck and ATC automations, displays and warning systems, crew coordination, and crew fatigue and jet lag. Accident investigation and an incident reporting system that is used to guide the human factors research is discussed. A design philosophy for human-centered automation is given, along with an evaluation of automation on advanced technology transports. Intelligent error tolerant systems such as electronic checklists are discussed along with design guidelines for reducing procedure errors. The data on evaluation of Crew Resource Management (CRM) training indicates highly significant positive changes in appropriate flight deck behavior and more effective use of available resources for crew members receiving the training.

  18. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass-or Time-Optimal Solutions

    Science.gov (United States)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood---allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.
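The iterative seed-sharing idea behind PEATSA can be sketched as follows. The `cost` and `optimize` functions below are toy stand-ins (a real run would invoke EMTG per launch date); only the loop structure mirrors the described method: optimize every date, then re-seed each date with the best solution found in its launch-date neighborhood.

```python
import random

def cost(date, x):
    # Toy stand-in for trajectory cost; the optimum drifts with launch date.
    return (x - 0.1 * date) ** 2

def optimize(seed, date, tries=20):
    # Stand-in for one EMTG run: greedy random perturbation from the seed.
    best = seed
    for _ in range(tries):
        cand = best + random.uniform(-0.5, 0.5)
        if cost(date, cand) < cost(date, best):
            best = cand
    return best

def peatsa_style(dates, iterations=5, radius=2):
    """Optimize every launch date, then re-seed each date with the best
    neighbouring solution (evaluated at that date) and repeat."""
    random.seed(0)
    sols = {d: random.uniform(-5.0, 5.0) for d in dates}  # independent first guesses
    for _ in range(iterations):
        sols = {d: optimize(sols[d], d) for d in dates}
        # Seed each date with the best solution within `radius` launch dates.
        sols = {d: min((sols[e] for e in dates if abs(e - d) <= radius),
                       key=lambda x: cost(d, x))
                for d in dates}
    return sols

sols = peatsa_style(range(10))
```

Because neighborhoods overlap, a good solution found at one launch date propagates across the whole grid within a few iterations, which is the mechanism the abstract describes.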

  19. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds and is tedious and time-consuming. This paper proposes an automation model for planning optimal sewerage rehabilitation strategies by integrating image processing, clustering technology, optimization, and visualization. First, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Second, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction and even collapse. Finally, the results from the automation model can be visualized in a geographic information system in which essential information on the sewer system and the sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
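The genetic-algorithm step can be illustrated with a minimal sketch. The method names, unit costs, and condition-grade "relief" values below are hypothetical, not the paper's data; the fitness function trades off rehabilitation cost against residual failure risk.

```python
import random

random.seed(1)

# Hypothetical per-section rehabilitation costs and condition-grade relief.
COST = {"patch": 1, "liner": 3, "replace": 6}
RELIEF = {"patch": 1, "liner": 2, "replace": 3}

def fitness(plan, grades, risk_weight=10):
    """Total rehabilitation cost plus a penalty for residual failure risk."""
    cost = sum(COST[m] for m in plan)
    residual = sum(max(g - RELIEF[m], 0) for m, g in zip(plan, grades))
    return cost + risk_weight * residual

def ga(grades, pop_size=40, gens=60, mut_rate=0.1):
    methods = list(COST)
    pop = [[random.choice(methods) for _ in grades] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: fitness(p, grades))
        elite = pop[: pop_size // 2]                # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(grades))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:          # point mutation
                child[random.randrange(len(grades))] = random.choice(methods)
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda p: fitness(p, grades))

grades = [random.choice([1, 2, 3]) for _ in range(30)]  # CCTV-derived grades
best = ga(grades)
```

Since the toy fitness is separable per pipe section, the GA quickly approaches assigning each section its cheapest adequate method, which is the kind of method/material assignment the abstract describes.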

  20. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations

  1. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  2. Autonomous Vehicle Coordination with Wireless Sensor and Actuator Networks

    NARCIS (Netherlands)

    Marin Perianu, Mihai; Bosch, S.; Marin Perianu, Raluca; Scholten, Johan; Havinga, Paul J.M.

    2010-01-01

    A coordinated team of mobile wireless sensor and actuator nodes can bring numerous benefits for various applications in the field of cooperative surveillance, mapping unknown areas, disaster management, automated highway and space exploration. This article explores the idea of mobile nodes using

  3. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  4. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    Directory of Open Access Journals (Sweden)

    Zhi-song Chen

    2012-12-01

The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project, considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordination mechanism via a revenue-sharing contract. The results suggest the following: (1) owing to water shortage supplements and the excess water sale policy provided by the FPM, the optimal ordering quantity of water resources is less than that without the FPM, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without the FPM; (2) wholesale pricing and supplementary wholesale pricing with SCB are higher than those without SCB, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without SCB; and (3) considering SCB and introducing the FPM help increase the optimal profits of the whole supply chain, supplier, and external distributor, and improve the efficiency of water resources usage.
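The revenue-sharing coordination result can be illustrated with a textbook newsvendor sketch. All numbers (prices, cost, demand distribution) are hypothetical; the paper's model with the FPM and SCB is considerably richer.

```python
# A supplier sells water to an external distributor facing uncertain
# demand ~ Uniform(50, 150); the distributor orders at the critical ratio.
p, c, w = 10.0, 4.0, 7.0           # retail price, supply cost, wholesale price
Finv = lambda y: 50.0 + 100.0 * y  # inverse CDF of the Uniform(50, 150) demand

q_central = Finv(1 - c / p)   # integrated (centralized) optimal order quantity
q_whole = Finv(1 - w / p)     # plain wholesale contract: distributor under-orders
phi = 0.6                     # distributor's share of revenue
q_shared = Finv(1 - (phi * c) / (phi * p))  # revenue sharing, wholesale = phi*c
```

With the wholesale price scaled to phi*c, the distributor's critical ratio equals the centralized one, so the decentralized order matches `q_central` and phi merely splits the coordinated profit between the parties.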

  5. Influence of the faces relative arrangement on the optimal reloading station location and analytical determination of its coordinates

    Directory of Open Access Journals (Sweden)

V.K. Slobodyanyuk

    2017-04-01

The purpose of this study is to develop a methodology for determining the optimal rock mass run-of-mine (RoM) stock point and to investigate the influence of the faces' spatial arrangement on this point. The research reviews current studies in which Fermat-Torricelli-Steiner point algorithms are used to minimize logistic processes. The methods of mathematical optimization and analytical geometry were applied, and formulae for determining the coordinates of the optimal point for four faces were established using the latter methods. Mining technology using reloading stations is rather common at deep iron ore pits. In most cases, when deciding on the location of the RoM stock, its high-altitude position within the pit is primarily taken into account. However, the layout location of the reloading station also has a significant influence on the technical and economic parameters of open-pit mining operations. The traditional approach, which takes the center of gravity as the optimal point for the RoM stock location, does not guarantee minimum haulage. In mathematics, the Fermat-Torricelli point, which minimizes the total distance to the vertices of a triangle, is known. It is shown that haulage is minimized when the RoM stock location coincides with the Fermat-Torricelli point. For open-pit mining operations, developing a method that determines the optimal RoM stock location for a working area from the known coordinates of distinguished points, on the basis of new weight factors, is of particular practical importance. A two-stage solution to the problem of determining the rational RoM stock location (with minimal transport work) for any number of faces is proposed. Such an optimal RoM stock location reduces transport work by 10–20%.
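For any number of weighted faces, the Fermat-Torricelli point generalizes to the weighted geometric median, which can be computed with Weiszfeld's iteration. The face coordinates and rock masses below are illustrative, not from the study:

```python
import math

def weiszfeld(points, weights, iters=200, eps=1e-9):
    """Weighted geometric median (generalized Fermat-Torricelli point):
    minimizes total haulage sum_i w_i * dist(p, x_i)."""
    # Start from the weighted centroid (the "center of gravity" heuristic).
    x = sum(w * px for w, (px, _) in zip(weights, points)) / sum(weights)
    y = sum(w * py for w, (_, py) in zip(weights, points)) / sum(weights)
    for _ in range(iters):
        nx = ny = denom = 0.0
        for w, (px, py) in zip(weights, points):
            d = math.hypot(x - px, y - py)
            if d < eps:          # simple guard for the singular case
                return px, py
            nx += w * px / d
            ny += w * py / d
            denom += w / d
        x, y = nx / denom, ny / denom
    return x, y

def haulage(p, points, weights):
    return sum(w * math.hypot(p[0] - px, p[1] - py)
               for w, (px, py) in zip(weights, points))

faces = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (8.0, 9.0)]  # hypothetical layout
masses = [3.0, 1.0, 1.0, 2.0]                               # rock mass per face
centroid = (sum(m * x for m, (x, _) in zip(masses, faces)) / sum(masses),
            sum(m * y for m, (_, y) in zip(masses, faces)) / sum(masses))
station = weiszfeld(faces, masses)
```

Each Weiszfeld step is non-increasing in total haulage, so the computed station location is never worse than the center-of-gravity point it starts from, consistent with the abstract's claim that the centroid does not guarantee minimum haulage.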

  6. Optimizing human-system interface automation design based on a skill-rule-knowledge framework

    International Nuclear Information System (INIS)

    Lin, Chiuhsiang Joe; Yenn, T.-C.; Yang, C.-W.

    2010-01-01

This study considers the technological change that has occurred in complex systems within the past 30 years and the role of human operators in controlling and interacting with complex systems following that change. Modernization of instrumentation and control systems and components leads to a new issue of human-automation interaction, in which human operational performance must be considered in automated systems. Human-automation interaction can differ in its types and levels, and a system design issue is usually posed: given these technical capabilities, which system functions should be automated and to what extent? A good automation design can be achieved by making an appropriate human-automation function allocation. To our knowledge, only a few studies have been published on how to achieve appropriate automation design with a systematic procedure. Further, there is a surprising lack of information on examining and validating the influences of levels of automation (LOAs) on instrumentation and control systems in the advanced control room (ACR). The study presented in this paper proposes a systematic framework to help in making an appropriate decision on types of automation (TOA) and LOAs based on a 'Skill-Rule-Knowledge' (SRK) model. The evaluation results show that the use of either automatic mode or semiautomatic mode alone is insufficient to prevent human errors. For preventing the occurrence of human errors and ensuring safety in the ACR, the proposed framework can be valuable for making decisions in human-automation allocation.

  7. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following are key areas the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; and integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  8. Regimes of data output from an automated scanning system into a computer

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Shaislamov, P.T.

    1984-01-01

A method is described for implementing a rather complex algorithm for transmitting various coordinate and service data from different automated scanning system devices to the monitoring computer of the automated system for processing bubble chamber images. The adopted data output algorithm and the equipment developed for it enable data transmission both as separate words and as word arrays.

  9. Automating with SIMATIC S7-400 inside TIA portal configuring, programming and testing with STEP 7 Professional

    CERN Document Server

    Berger, Hans

    2014-01-01

    This book presents a comprehensive description of the configuration of devices and network for the S7-400 components inside the engineering framework TIA Portal. You learn how to formulate and test a control program with the programming languages LAD, FBD, STL, and SCL. The book is rounded off by configuring the distributed I/O with PROFIBUS DP and PROFINET IO using SIMATIC S7-400 and data exchange via Industrial Ethernet. SIMATIC is the globally established automation system for implementing industrial controllers for machines, production plants and processes. SIMATIC S7-400 is the most powerful automation system within SIMATIC. This process controller is ideal for data-intensive tasks that are especially typical for the process industry. With superb communication capability and integrated interfaces it is optimized for larger tasks such as the coordination of entire systems. Open-loop and closed-loop control tasks are formulated with the STEP 7 Professional V11 engineering software in the field-proven progr...

10. Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis

    DEFF Research Database (Denmark)

    Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei

    2018-01-01

For the existing pitch and torque control of the wind turbine generator system (WTGS), further development of coordinated control is necessary to improve effectiveness in practical applications. In this paper, the WTGS is modeled as a coupling combination of two subsystems: the generator torque control subsystem and the blade pitch control subsystem. The pole positions in each control subsystem are adjusted coordinately to evaluate the controller participation and used as the objective of optimization. A two-level parameters-controllers coordinated optimization scheme is proposed and applied to optimize the controller coordination based on Pareto optimization theory. Three solutions are obtained through optimization: the optimal torque solution, the optimal power solution, and a satisfactory solution. Detailed comparisons evaluate the performance of the three selected solutions.

  11. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation of the individual steps of crystallization, with specific attention to the automation of optimization, is given.

  12. A sensor-based automation system for handling nuclear materials

    International Nuclear Information System (INIS)

    Drotning, W.; Kimberly, H.; Wapman, W.; Darras, D.

    1997-01-01

An automated system is being developed for handling large payloads of radioactive nuclear materials in an analytical laboratory. The automation system performs unpacking and repacking of payloads from shipping and storage containers, and delivery of the payloads to the stations in the laboratory. The system uses machine vision and force/torque sensing to provide sensor-based control of the automation system in order to enhance system safety, flexibility, and robustness, and to achieve easy remote operation. The automation system also controls the operation of the laboratory measurement systems and their coordination with the robotic system. Particular attention has been given to system design features and analytical methods that provide an enhanced level of operational safety. Independent mechanical gripper interlock and tool release mechanisms were designed to prevent payload mishandling. An extensive Failure Modes and Effects Analysis of the automation system was developed as a safety design analysis tool.

  13. Optimizing the balance between task automation and human manual control in simulated submarine track management.

    Science.gov (United States)

    Chen, Stephanie I; Visser, Troy A W; Huf, Samuel; Loft, Shayne

    2017-09-01

    Automation can improve operator performance and reduce workload, but can also degrade operator situation awareness (SA) and the ability to regain manual control. In 3 experiments, we examined the extent to which automation could be designed to benefit performance while ensuring that individuals maintained SA and could regain manual control. Participants completed a simulated submarine track management task under varying task load. The automation was designed to facilitate information acquisition and analysis, but did not make task decisions. Relative to a condition with no automation, the continuous use of automation improved performance and reduced subjective workload, but degraded SA. Automation that was engaged and disengaged by participants as required (adaptable automation) moderately improved performance and reduced workload relative to no automation, but degraded SA. Automation engaged and disengaged based on task load (adaptive automation) provided no benefit to performance or workload, and degraded SA relative to no automation. Automation never led to significant return-to-manual deficits. However, all types of automation led to degraded performance on a nonautomated task that shared information processing requirements with automated tasks. Given these outcomes, further research is urgently required to establish how to design automation to maximize performance while keeping operators cognitively engaged. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Automated optimization and construction of chemometric models based on highly variable raw chromatographic data.

    Science.gov (United States)

    Sinkov, Nikolai A; Johnston, Brandon M; Sandercock, P Mark L; Harynuk, James J

    2011-07-04

    Direct chemometric interpretation of raw chromatographic data (as opposed to integrated peak tables) has been shown to be advantageous in many circumstances. However, this approach presents two significant challenges: data alignment and feature selection. In order to interpret the data, the time axes must be precisely aligned so that the signal from each analyte is recorded at the same coordinates in the data matrix for each and every analyzed sample. Several alignment approaches exist in the literature and they work well when the samples being aligned are reasonably similar. In cases where the background matrix for a series of samples to be modeled is highly variable, the performance of these approaches suffers. Considering the challenge of feature selection, when the raw data are used each signal at each time is viewed as an individual, independent variable; with the data rates of modern chromatographic systems, this generates hundreds of thousands of candidate variables, or tens of millions of candidate variables if multivariate detectors such as mass spectrometers are utilized. Consequently, an automated approach to identify and select appropriate variables for inclusion in a model is desirable. In this research we present an alignment approach that relies on a series of deuterated alkanes which act as retention anchors for an alignment signal, and couple this with an automated feature selection routine based on our novel cluster resolution metric for the construction of a chemometric model. The model system that we use to demonstrate these approaches is a series of simulated arson debris samples analyzed by passive headspace extraction, GC-MS, and interpreted using partial least squares discriminant analysis (PLS-DA). Copyright © 2011 Elsevier B.V. All rights reserved.
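The anchor-based alignment can be sketched as a piecewise-linear time warp between matched anchor retention times. The anchor and peak times below are invented for illustration, and the real approach aligns whole signal vectors rather than just peak apexes:

```python
from bisect import bisect_right

REF    = [2.0, 5.0, 9.0, 14.0, 20.0]   # anchor retention times, reference run (min)
SAMPLE = [2.1, 5.3, 9.2, 14.6, 20.4]   # the same anchors in a drifted sample run

def align(t, src=SAMPLE, dst=REF):
    """Map one sample retention time onto the reference axis by piecewise-linear
    interpolation between matched anchor pairs (end segments extended linearly)."""
    i = min(max(bisect_right(src, t) - 1, 0), len(src) - 2)
    frac = (t - src[i]) / (src[i + 1] - src[i])
    return dst[i] + frac * (dst[i + 1] - dst[i])

aligned = [align(t) for t in (3.7, 9.2, 17.5)]   # hypothetical peak apexes
```

Because the deuterated-alkane anchors are chemically distinct from the sample matrix, this warp stays well-defined even when the background varies strongly between samples, which is the failure mode of whole-signal alignment the abstract points out.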

  15. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

Microfluidic applications range from combinatorial synthesis to high-throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves, and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework, that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational efforts in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
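As a sketch of one of the optimization strategies mentioned, here is a minimal simulated-annealing loop. The response surface and all parameters are invented stand-ins for a sensor readout versus one perfusion parameter, not the paper's actual models:

```python
import math
import random

def simulated_annealing(score, x0, step=0.5, t0=1.0, cooling=0.99, iters=2000):
    """Minimize `score`: accept uphill moves with probability exp(-dE/T) while
    the temperature T cools geometrically, and track the best point seen."""
    random.seed(42)
    x, e = x0, score(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        ce = score(cand)
        if ce < e or random.random() < math.exp((e - ce) / t):
            x, e = cand, ce             # accept the move
            if e < best_e:
                best_x, best_e = x, e   # remember the best point visited
        t *= cooling
    return best_x, best_e

# Toy multimodal response surface (purely illustrative): local minima from the
# sine term model the kind of landscape where plain hill climbing gets stuck.
surface = lambda x: 0.1 * (x - 3.0) ** 2 + math.sin(3.0 * x)
x_opt, e_opt = simulated_annealing(surface, x0=4.0)
```

The temperature schedule is the design choice: early on, uphill moves are accepted often enough to escape local basins; by the end, the loop behaves like greedy local search.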

  16. Automated Support for Rapid Coordination of Joint UUV Operation

    Science.gov (United States)

    2015-03-01

[Figure 20: MOOS-IvP simulation test run using the pMarineViewer graphical user interface, from [9].] D. Jiang et al. used the open-source mission-oriented operating suite interval programming (MOOS-IvP) architecture and applied it to UUV coordination.

  17. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-02

Retrieval-based face annotation is a promising paradigm for mining massive web facial images for automated face annotation. This paper addresses a critical problem of such a paradigm, i.e., how to effectively perform annotation by exploiting similar facial images and their weak labels, which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintain comparable results but significantly save time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.

  19. Optimal Coordinated Management of a Plug-In Electric Vehicle Charging Station under a Flexible Penalty Contract for Voltage Security

    Directory of Open Access Journals (Sweden)

    Jip Kim

    2016-07-01

The increasing penetration of plug-in electric vehicles (PEVs) may cause a low-voltage problem in the distribution network. In particular, the introduction of charging stations where multiple PEVs are simultaneously charged at the same bus can aggravate the low-voltage problem. Unlike a distribution network operator (DNO), who has the overall responsibility for stable and reliable network operation, a charging station operator (CSO) may schedule PEV charging without consideration for the resulting severe voltage drop. Therefore, there is a need for the DNO to impose a coordination measure to induce the CSO to adjust its charging schedule to help mitigate the voltage problem. Although the current time-of-use (TOU) tariff is an indirect coordination measure that can motivate the CSO to shift its charging demand to off-peak time by imposing a high rate at the peak time, it is limited by its rigidity in that the network voltage condition cannot be flexibly reflected in the tariff. Therefore, a flexible penalty contract (FPC) for voltage security to be used as a direct coordination measure is proposed. In addition, the optimal coordinated management is formulated. Using the Pacific Gas and Electric Company (PG&E) 69-bus test distribution network, the effectiveness of the coordination was verified by comparison with the current TOU tariff.

  20. Automated procedure for selection of optimal refueling policies for light water reactors

    International Nuclear Information System (INIS)

    Lin, B.I.; Zolotar, B.; Weisman, J.

    1979-01-01

    An automated procedure for determining a minimum-cost refueling policy has been developed for light water reactors. The procedure is an extension of the equilibrium core approach previously devised for pressurized water reactors (PWRs). Use of 1 1/2-group theory has improved the accuracy of the nuclear model and eliminated tedious fitting of albedos. A simple heuristic algorithm for locating a good starting policy has materially reduced PWR computing time. Inclusion of void effects and use of the Haling principle for axial flux calculations extended the nuclear model to boiling water reactors (BWRs). A good initial estimate of the refueling policy is obtained by recognizing that a nearly uniform distribution of reactivity provides low power peaking. The initial estimate is improved upon by interchanging groups of four assemblies and is subsequently refined by interchanging individual assemblies. The method yields very favorable results, is simpler than previously proposed BWR fuel optimization schemes, and retains power cost as the objective function.
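    The assembly-interchange refinement described above can be sketched as a generic pairwise-swap descent. The objective used here (largest reactivity sum over adjacent positions, as a crude flatness/peaking proxy on a 1-D layout) is an invented stand-in, not the paper's nuclear model.

```python
def interchange_descent(layout, objective):
    """Refine a loading pattern by pairwise interchanges, keeping any swap
    that lowers the objective, until no improving swap remains."""
    best = objective(layout)
    improved = True
    while improved:
        improved = False
        for i in range(len(layout)):
            for j in range(i + 1, len(layout)):
                layout[i], layout[j] = layout[j], layout[i]
                score = objective(layout)
                if score < best:
                    best, improved = score, True
                else:
                    layout[i], layout[j] = layout[j], layout[i]  # revert swap
    return layout, best

def peaking_proxy(layout):
    """Crude stand-in for power peaking: the largest reactivity sum over
    adjacent assembly positions (flatter distributions score lower)."""
    return max(layout[i] + layout[i + 1] for i in range(len(layout) - 1))
```

    Starting from a "nearly uniform reactivity" initial guess, as the abstract suggests, keeps such a local search out of the worst local minima.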

  1. Feasibility evaluation of 3 automated cellular drug screening assays on a robotic workstation.

    Science.gov (United States)

    Soikkeli, Anne; Sempio, Cristina; Kaukonen, Ann Marie; Urtti, Arto; Hirvonen, Jouni; Yliperttula, Marjo

    2010-01-01

    This study presents the implementation and optimization of 3 cell-based assays on a TECAN Genesis workstation (the Caspase-Glo 3/7 and sulforhodamine B (SRB) screening assays and the mechanistic Caco-2 permeability protocol) and evaluates their feasibility for automation. During implementation, the dispensing speed used to add drug solutions and the fixative trichloroacetic acid, and the aspiration speed used to remove the supernatant immediately after fixation, were optimized. Decontamination steps for cleaning the tips and pipetting tubing were also added. The automated Caspase-Glo 3/7 screen was successfully optimized with Caco-2 cells (Z' 0.7, signal-to-base ratio [S/B] 1.7) but not with DU-145 cells. In contrast, the automated SRB screen was successfully optimized with the DU-145 cells (Z' 0.8, S/B 2.4) but not with the Caco-2 cells (Z' -0.8, S/B 1.4). The automated bidirectional Caco-2 permeability experiments successfully separated low- and high-permeability compounds (Z' 0.8, S/B 84.2) and passive drug permeation from efflux-mediated transport (Z' 0.5, S/B 8.6). Of the assays, the homogeneous Caspase-Glo 3/7 assay benefits the most from automation, but the heterogeneous SRB assay and the Caco-2 permeability experiments also gain advantages from automation.
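    The Z' values quoted above follow the standard screening-assay quality metric of Zhang et al. (1999): Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|, with Z' > 0.5 conventionally indicating an excellent assay. A quick sketch with made-up control readings:

```python
import statistics

def z_prime(positive, negative):
    """Z' factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    spread = 3.0 * (statistics.stdev(positive) + statistics.stdev(negative))
    return 1.0 - spread / abs(statistics.mean(positive) - statistics.mean(negative))

def signal_to_base(positive, negative):
    """S/B ratio: mean signal of the positive controls over the negative controls."""
    return statistics.mean(positive) / statistics.mean(negative)
```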

  2. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system … that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system …, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints …

  3. Application of advanced technology to space automation

    Science.gov (United States)

    Schappell, R. T.; Polhemus, J. T.; Lowrie, J. W.; Hughes, C. A.; Stephens, J. R.; Chang, C. Y.

    1979-01-01

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for the future. The results of this study strongly support this statement and should provide further incentive for immediate development of specific automation technology as defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

  4. Understanding Innovation Engines: Automated Creativity and Improved Stochastic Optimization via Deep Learning.

    Science.gov (United States)

    Nguyen, A; Yosinski, J; Clune, J

    2016-01-01

    The Achilles heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search mitigates this problem by encouraging exploration in all interesting directions, replacing the performance objective with a reward for novel behaviors. This reward for novel behaviors has traditionally required a human-crafted behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and herons instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, wherein novelty means interesting novelty. For example, a DNN-based novelty search in the image space does not explore in the low-level pixel space, but instead creates a pressure to create new types of images (e.g., churches, mosques, obelisks, etc.). Here, we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm's key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: for example, producing intelligent software, robot controllers, optimized physical components, and art.

  5. Lighting Automation Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Toni A.; Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting and future spacecraft can use a standard lighting protocol to produce an effective, optimized and
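    For reference, the data portion of a DMX512 packet is simply a zero "start code" byte followed by up to 512 one-byte channel levels; building one is trivial, which is part of the protocol's appeal for lightweight lighting automation. This sketch covers only the slot data, omitting the serial transport framing (break and mark-after-break signalling):

```python
def dmx_frame(levels):
    """Assemble the slot data of a DMX512 packet: a 0x00 start code followed
    by up to 512 channel levels, each in the range 0-255."""
    if len(levels) > 512 or any(not 0 <= v <= 255 for v in levels):
        raise ValueError("DMX512 allows at most 512 slots of one byte each")
    return bytes([0x00]) + bytes(levels)
```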

  6. Lighting Automation - Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Tori A. (Principal Investigator); Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting and future spacecraft can use a standard lighting protocol to produce an effective, optimized and

  7. Optimization of Reversed-Phase Peptide Liquid Chromatography Ultraviolet Mass Spectrometry Analyses Using an Automated Blending Methodology

    Science.gov (United States)

    Chakraborty, Asish B.; Berger, Scott J.

    2005-01-01

    The balance between chromatographic performance and mass spectrometric response has been evaluated using an automated series of experiments in which separations are produced by the real-time automated blending of water with organic and acidic modifiers. In this work, the effects of the concentration of two acidic modifiers (formic acid and trifluoroacetic acid) on separation selectivity and on ultraviolet and mass spectrometry detector response were studied using a complex peptide mixture. Peptide retention selectivity differences were apparent between the two modifiers, and under the conditions studied, trifluoroacetic acid produced slightly narrower (more concentrated) peaks, but significantly higher electrospray mass spectrometry suppression. Trifluoroacetic acid suppression of electrospray signal and influence on peptide retention and selectivity was dominant when mixtures of the two modifiers were analyzed. Our experimental results indicate that in analyses where the analyzed components are roughly equimolar (e.g., a peptide map of a recombinant protein), the selectivity of peptide separations can be optimized by the choice and concentration of acidic modifier, without compromising the ability to obtain effective sequence coverage of a protein. In some cases, these selectivity differences were explored further, and a rational basis for differentiating acidic modifier effects from the underlying peptide sequences is described. PMID:16522853

  8. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    carried out. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and Process Level. The state of the art of automated MEMS design in each of these levels is investigated....

  9. Automated beam steering using optimal control

    Energy Technology Data Exchange (ETDEWEB)

    Allen, C. K. (Christopher K.)

    2004-01-01

    We present a steering algorithm which, with the aid of a model, allows the user to specify beam behavior throughout a beamline, rather than just at specified beam position monitor (BPM) locations. The model is used primarily to compute the values of the beam phase vectors from BPM measurements and to define cost functions that describe the steering objectives. The steering problem is formulated as a constrained optimization problem; however, by applying optimal control theory we can reduce it to an unconstrained optimization whose dimension is the number of control signals.
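    The reduction to an unconstrained optimization can be illustrated on a toy two-BPM, two-corrector line: minimizing a quadratic cost over corrector strengths reduces to solving the normal equations. The response matrix and offsets below are invented; this is a generic regularized least-squares sketch, not the paper's algorithm.

```python
def steer(R, d, reg=1e-6):
    """Corrector kicks u minimizing ||d + R u||^2 + reg*||u||^2, where
    R[i][j] is the orbit shift at BPM i per unit kick of corrector j and
    d is the measured orbit offset. Solved via the 2x2 normal equations
    (R^T R + reg*I) u = -R^T d."""
    a = R[0][0] ** 2 + R[1][0] ** 2 + reg
    b = R[0][0] * R[0][1] + R[1][0] * R[1][1]
    c = R[0][1] ** 2 + R[1][1] ** 2 + reg
    r0 = -(R[0][0] * d[0] + R[1][0] * d[1])
    r1 = -(R[0][1] * d[0] + R[1][1] * d[1])
    det = a * c - b * b
    return ((c * r0 - b * r1) / det, (a * r1 - b * r0) / det)
```

    The small `reg` term keeps the solve well-posed when the response matrix is nearly singular, mirroring how the optimal-control formulation penalizes corrector strength rather than imposing hard constraints.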

  10. Are automated molecular dynamics simulations and binding free energy calculations realistic tools in lead optimization? An evaluation of the linear interaction energy (LIE) method

    NARCIS (Netherlands)

    Stjernschantz, E.M.; Marelius, J.; Medina, C.; Jacobsson, M.; Vermeulen, N.P.E.; Oostenbrink, C.

    2006-01-01

    An extensive evaluation of the linear interaction energy (LIE) method for the prediction of binding affinity of docked compounds has been performed, with an emphasis on its applicability in lead optimization. An automated setup is presented, which allows for the use of the method in an industrial

  11. Optimal source coding, removable noise elimination, and natural coordinate system construction for general vector sources using replicator neural networks

    Science.gov (United States)

    Hecht-Nielsen, Robert

    1997-04-01

    A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding the selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.

  12. Optimal Coordination of Distance and Directional Overcurrent Relays Considering Different Network Topologies

    OpenAIRE

    Y. Damchi; J. Sadeh; H. Rajabi Mashhadi

    2015-01-01

    Most studies in relay coordination have focused solely on the coordination of overcurrent relays, while distance relays are used as the main protection of transmission lines. Since simultaneous coordination of these two types of relays can provide better protection, in this paper a new approach is proposed for simultaneous coordination of distance and directional overcurrent relays (D&DOCRs). Also, pursued by most of the previously published studies, the settings of D&DOCRs are usually determi...

  13. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  14. Automated system for calibration and control of the CHSPP-800 multichannel γ detector parameters

    International Nuclear Information System (INIS)

    Avvakumov, N.A.; Belikov, N.I.; Goncharenko, Yu.M.

    1987-01-01

    An automated system for the adjustment, calibration and control of a total-absorption Cherenkov spectrometer is described. The system comprises a mechanical platform capable of moving in two mutually perpendicular directions; movement detectors and limit switches; a power unit; and an automation unit with a remote control board. The system can operate either in a manual control regime, with coordinates monitored on a digital indicator, or under computer control according to special programs. The platform positioning accuracy is ± 0.1 mm. Application of the automated system has sped up counter adjustment work by a factor of 3-5.

  15. Automated bond order assignment as an optimization problem.

    Science.gov (United States)

    Dehof, Anna Katharina; Rurainski, Alexander; Bui, Quang Bao Anh; Böcker, Sebastian; Lenhof, Hans-Peter; Hildebrandt, Andreas

    2011-03-01

    Numerous applications in Computational Biology process molecular structures and hence strongly rely not only on correct atomic coordinates but also on correct bond order information. For proteins and nucleic acids, bond orders can be easily deduced, but this does not hold for other types of molecules such as ligands. For ligands, bond order information is not always provided in molecular databases, and thus a variety of approaches tackling this problem have been developed. In this work, we extend an ansatz proposed by Wang et al. that assigns connectivity-based penalty scores and tries to heuristically approximate its optimum. We present three efficient and exact solvers for the problem, replacing the heuristic approximation scheme of the original approach: an A* approach, an integer linear programming (ILP) approach, and a fixed-parameter tractable (FPT) approach. We implemented and evaluated the original implementation and our A*, ILP and FPT formulations on the MMFF94 validation suite and the KEGG Drug database. We show the benefit of computing exact solutions of the penalty minimization problem and the additional gain from computing all optimal (or even suboptimal) solutions. We close with a detailed comparison of our methods. The A* and ILP solvers are integrated into the open-source C++ LGPL library BALL and the molecular visualization and modelling tool BALLView and can be downloaded from our homepage www.ball-project.org. The FPT implementation can be downloaded from http://bio.informatik.uni-jena.de/software/.
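    The penalty-minimization formulation can be made concrete with a brute-force exact solver for tiny molecules (the A*/ILP/FPT solvers above scale far better; the penalty table here is invented, loosely in the spirit of Wang et al.'s valence penalties):

```python
import math
from itertools import product

def assign_bond_orders(atoms, bonds, penalty, max_order=3):
    """Enumerate all bond-order assignments and return one minimizing the sum
    of per-atom valence penalties (a missing table entry means an infeasible
    valence and scores as infinity)."""
    best_score, best_orders = math.inf, None
    for orders in product(range(1, max_order + 1), repeat=len(bonds)):
        valence = [0] * len(atoms)
        for (i, j), order in zip(bonds, orders):
            valence[i] += order
            valence[j] += order
        score = sum(penalty.get(a, {}).get(v, math.inf)
                    for a, v in zip(atoms, valence))
        if score < best_score:
            best_score, best_orders = score, orders
    return best_orders, best_score
```

    Enumeration is exponential in the number of bonds, which is exactly why the paper replaces it with A*, ILP and FPT formulations while keeping the same objective.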

  16. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements in this direction are briefly reviewed using domestic NPPs as examples. Two models are outlined for the problem of distributing functions between the operator and technical means. The processes subject to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shut-down or emergency situations, is also becoming important [ru

  17. Multi Satellite Cooperative and Non-Cooperative Trajectory Coordination

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a framework to optimize the coordination of multiple spacecraft, each with defined goals. Using this framework, optimal...

  18. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

    This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterprise networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state of the art of managing firewalls systematically in both research and application domains. Chapters explore set theory, managing firewall configurations globally and consistently, access control lists with encryption, and authentication such as IPSec policies. The author

  19. Automated selection of the optimal cardiac phase for single-beat coronary CT angiography reconstruction

    International Nuclear Information System (INIS)

    Stassi, D.; Ma, H.; Schmidt, T. G.; Dutta, S.; Soderman, A.; Pazzani, D.; Gros, E.; Okerlund, D.

    2016-01-01

    Purpose: Reconstructing a low-motion cardiac phase is expected to improve coronary artery visualization in coronary computed tomography angiography (CCTA) exams. This study developed an automated algorithm for selecting the optimal cardiac phase for CCTA reconstruction. The algorithm uses prospectively gated, single-beat, multiphase data made possible by wide cone-beam imaging. The proposed algorithm differs from previous approaches in that the optimal phase is identified from vessel image quality (IQ) directly, whereas previous approaches relied on motion estimation and interphase processing. Because no interphase information is processed, the algorithm can be applied to any sampling of image phases, making it suited for prospectively gated studies where only a subset of phases is available. Methods: An automated algorithm was developed to select the optimal phase based on quantitative IQ metrics. For each reconstructed slice at each reconstructed phase, an image quality metric was calculated based on measures of circularity and edge strength of through-plane vessels. The image quality metric was aggregated across slices, while a metric of vessel-location consistency was used to ignore slices that did not contain through-plane vessels. The algorithm's performance was evaluated in two observer studies. Fourteen single-beat cardiac CT exams (Revolution CT, GE Healthcare, Chalfont St. Giles, UK) reconstructed at 2% intervals were evaluated for best systolic (1), diastolic (6), or systolic and diastolic phases (7) by three readers and by the algorithm. Pairwise inter-reader and reader-algorithm agreement was evaluated using the mean absolute difference (MAD) and concordance correlation coefficient (CCC) between the reader- and algorithm-selected phases. A reader-consensus best phase was determined and compared to the algorithm-selected phase. In cases where the algorithm and consensus best phases differed by more than 2%, IQ was scored by three
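    A stripped-down version of the aggregation step might look like the following; the per-slice scores are invented, and the vessel-consistency check is reduced to "ignore slices without through-plane vessels":

```python
def best_phase(iq_by_phase):
    """Pick the reconstruction phase (in % of R-R interval) with the highest
    mean vessel image quality, ignoring slices flagged as containing no
    through-plane vessel (None)."""
    def aggregate(slice_scores):
        scores = [s for s in slice_scores if s is not None]
        return sum(scores) / len(scores) if scores else float("-inf")
    return max(iq_by_phase, key=lambda phase: aggregate(iq_by_phase[phase]))
```

    Because each phase is scored independently, the selection works on any subset of phases, which matches the abstract's point about prospectively gated acquisitions.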

  20. Data Assimilation by delay-coordinate nudging

    Science.gov (United States)

    Pazo, Diego; Lopez, Juan Manuel; Carrassi, Alberto

    2016-04-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time-step. Numerical experiments with a low order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, also when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, the intuitive functioning and the reduced computational cost of standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit the use of more sophisticated data assimilation procedures.
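    A minimal twin experiment with plain (single-time, not delay-coordinate) nudging on the Lorenz-63 system shows the baseline mechanism: a relaxation term drives the model toward observations of a truth run. The delay-coordinate variant would add analogous forcing terms built from past observations; the gain and step size below are arbitrary choices, not values from the paper.

```python
import math

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 right-hand side."""
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def euler_step(s, dt, forcing=(0.0, 0.0, 0.0)):
    f = lorenz(s)
    return tuple(s[i] + dt * (f[i] + forcing[i]) for i in range(3))

def nudged_twin(truth0, model0, gain=50.0, dt=0.01, steps=2000):
    """Run the truth freely; nudge the model toward perfect, full-state
    observations of the truth with relaxation strength `gain`."""
    truth, model = truth0, model0
    for _ in range(steps):
        forcing = tuple(gain * (truth[i] - model[i]) for i in range(3))
        truth = euler_step(truth, dt)
        model = euler_step(model, dt, forcing)
    return truth, model

def error(a, b):
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))
```

    With a sufficiently large gain the model synchronizes with the truth despite the chaos; the paper's contribution is that spreading the forcing over delayed observations achieves this more robustly, with the optimal delay tied to the dominant Lyapunov exponent.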

  1. Novel Handover Optimization with a Coordinated Contiguous Carrier Aggregation Deployment Scenario in LTE-Advanced Systems

    Directory of Open Access Journals (Sweden)

    Ibraheem Shayea

    2016-01-01

    Full Text Available The carrier aggregation (CA) technique and Handover Parameters Optimization (HPO) function have been introduced in LTE-Advanced systems to enhance system performance in terms of throughput, coverage area, and connection stability and to reduce management complexity. Although LTE-Advanced has benefited from the CA technique, the low spectral efficiency and high ping-pong effect with high outage probabilities in conventional Carrier Aggregation Deployment Scenarios (CADSs) have become major challenges for cell edge User Equipment (UE). Also, the existing HPO algorithms are not optimal for selecting the appropriate handover control parameters (HCPs). This paper proposes two solutions by deploying a Coordinated Contiguous-CADS (CC-CADS) and a Novel Handover Parameters Optimization algorithm that is based on the Weight Performance Function (NHPO-WPF). The CC-CADS uses two contiguous component carriers (CCs) that have two different beam directions. The NHPO-WPF automatically adjusts the HCPs based on the Weight Performance Function (WPF), which is evaluated as a function of the Signal-to-Interference Noise Ratio (SINR), cell load, and UE’s velocity. Simulation results show that the CC-CADS and the NHPO-WPF algorithm provide significant enhancements in system performance over that of conventional CADSs and HPO algorithms from the literature, respectively. The integration of both solutions achieves even better performance than scenarios in which each solution is considered independently.
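    The abstract does not give the WPF's functional form, so purely as an illustration of the idea, here is one plausible shape: a weighted sum of normalized SINR, free capacity, and inverted UE speed, mapped onto hysteresis and time-to-trigger ranges. All weights, input ranges and parameter bounds below are assumptions, not values from the paper.

```python
def wpf(sinr_db, cell_load, speed_kmh, weights=(0.4, 0.3, 0.3)):
    """Illustrative weight performance function in [0, 1]: high SINR, low cell
    load and low UE speed all push the score up (input ranges assumed)."""
    sinr_n = min(max((sinr_db + 10.0) / 40.0, 0.0), 1.0)   # assume -10..30 dB
    speed_n = min(max(speed_kmh / 120.0, 0.0), 1.0)        # assume 0..120 km/h
    w_sinr, w_load, w_speed = weights
    return w_sinr * sinr_n + w_load * (1.0 - cell_load) + w_speed * (1.0 - speed_n)

def adjust_hcp(score, hys_db=(0.0, 10.0), ttt_s=(0.04, 5.12)):
    """Map the WPF score onto handover control parameters: favorable conditions
    tolerate a larger hysteresis and time-to-trigger, while fast-moving or
    poorly served UEs get quicker handovers to avoid outage."""
    hys = hys_db[0] + score * (hys_db[1] - hys_db[0])
    ttt = ttt_s[0] + score * (ttt_s[1] - ttt_s[0])
    return hys, ttt
```

    The hysteresis/time-to-trigger bounds echo the standard LTE value ranges; the essential point is only that the HCPs become a continuous function of the measured conditions rather than fixed settings.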

  2. Optimal Design of Gradient Materials and Bi-Level Optimization of Topology Using Targets (BOTT)

    Science.gov (United States)

    Garland, Anthony

    The objective of this research is to understand the fundamental relationships necessary to develop a method to optimize both the topology and the internal gradient material distribution of a single object while meeting constraints and conflicting objectives. Functionally gradient material (FGM) objects possess continuously varying material properties throughout the object, and they allow an engineer to tailor individual regions of an object to have specific mechanical properties by locally modifying the internal material composition. A variety of techniques exists for topology optimization, and several methods exist for FGM optimization, but combining the two is difficult. Understanding the relationship between topology and material gradient optimization enables the selection of an appropriate model and the development of algorithms, which allow engineers to design high-performance parts that better meet design objectives than optimized homogeneous-material objects. For this research effort, topology optimization means finding the optimal connected structure with an optimal shape. FGM optimization means finding the optimal macroscopic material properties within an object. Tailoring the material constitutive matrix as a function of position results in gradient properties. Once the target macroscopic properties are known, a mesostructure or a particular material nanostructure can be found which gives the target material properties at each macroscopic point. This research demonstrates that topology and gradient materials can both be optimized together for a single part. The algorithms use a discretized model of the domain and gradient-based optimization algorithms. In addition, when considering two conflicting objectives, the algorithms in this research generate clear 'features' within a single part.
This tailoring of material properties within different areas of a single part (automated design of 'features') using computational design tools is a novel benefit

  3. Directional Overcurrent Relays Coordination Problems in Distributed Generation Systems

    Directory of Open Access Journals (Sweden)

    Jakub Ehrenberger

    2017-09-01

    Full Text Available This paper proposes a new approach to the distributed generation system protection coordination based on directional overcurrent protections with inverse-time characteristics. The key question of protection coordination is the determination of correct values of all inverse-time characteristics coefficients. The coefficients must be correctly chosen considering the sufficiently short tripping times and the sufficiently long selectivity times. In the paper a new approach to protection coordination is designed, in which not only some, but all the required types of short-circuit contributions are taken into account. In radial systems, if the pickup currents are correctly chosen, protection coordination for maximum contributions is enough to ensure selectivity times for all the required short-circuit types. In distributed generation systems, due to different contributions flowing through the primary and selective protections, coordination for maximum contributions is not enough, but all the short-circuit types must be taken into account, and the protection coordination becomes a complex problem. A possible solution to the problem, based on an appropriately designed optimization, has been proposed in the paper. By repeating a simple optimization considering only one short-circuit type, the protection coordination considering all the required short-circuit types has been achieved. To show the importance of considering all the types of short-circuit contributions, setting optimizations with one (the highest) and all the types of short-circuit contributions have been performed. Finally, selectivity time values are explored throughout the entire protected section, and both the settings are compared.
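    The core selectivity check behind such optimizations can be sketched with the IEC standard-inverse characteristic: for every considered short-circuit contribution, the backup relay must operate at least a coordination time interval (CTI) after the primary relay. Relay settings, fault currents and the CTI below are invented example values.

```python
import math

def iec_si_time(tms, pickup_a, fault_a):
    """IEC 60255 standard-inverse operating time: t = TMS * 0.14 / (M^0.02 - 1),
    where M is the ratio of fault current to pickup current."""
    m = fault_a / pickup_a
    if m <= 1.0:
        return math.inf  # relay does not pick up
    return tms * 0.14 / (m ** 0.02 - 1.0)

def selective(primary, backup, fault_currents, cti=0.3):
    """True if the backup trips at least `cti` seconds after the primary for
    every listed fault current; primary/backup are (TMS, pickup_A) pairs."""
    return all(iec_si_time(*backup, i) - iec_si_time(*primary, i) >= cti
               for i in fault_currents)
```

    An optimizer then searches the TMS (and possibly pickup) values minimizing tripping times subject to this constraint holding for every required short-circuit type, which is the coordination problem the abstract describes.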

  4. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, the individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature-based registration procedure. Our proposed method does not re...

  5. The Atmospheric Data Acquisition And Interpolation Process For Center-TRACON Automation System

    Science.gov (United States)

    Jardin, M. R.; Erzberger, H.; Denery, Dallas G. (Technical Monitor)

    1995-01-01

    The Center-TRACON Automation System (CTAS), an advanced new air traffic automation program, requires knowledge of spatial and temporal atmospheric conditions such as the wind speed and direction, the temperature and the pressure in order to accurately predict aircraft trajectories. Real-time atmospheric data is available in a grid format so that CTAS must interpolate between the grid points to estimate the atmospheric parameter values. The atmospheric data grid is generally not in the same coordinate system as that used by CTAS so that coordinate conversions are required. Both the interpolation and coordinate conversion processes can introduce errors into the atmospheric data and reduce interpolation accuracy. More accurate algorithms may be computationally expensive or may require a prohibitively large amount of data storage capacity so that trade-offs must be made between accuracy and the available computational and data storage resources. The atmospheric data acquisition and processing employed by CTAS will be outlined in this report. The effects of atmospheric data processing on CTAS trajectory prediction will also be analyzed, and several examples of the trajectory prediction process will be given.

  6. Minimization of Distribution Grid Losses by Consumption Coordination

    DEFF Research Database (Denmark)

    Juelsgaard, Morten; Andersen, Palle; Wisniewski, Rafal

    2013-01-01

    for coordinating consumption of electrical energy within the community, with the purpose of reducing grid loading and active power losses. For this we present a simplified model of the electrical grid, including system losses and capacity constraints. Coordination is performed in a distributed fashion, where each...... are obeyed. These objectives are enforced by coordinating consumers through nonlinear tariffs on power consumption. We present simulation test-cases, illustrating that significant reduction of active losses, can be obtained by such coordination. The distributed optimization algorithm, employs the alternating...
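
    The tariff-based coordination idea in this record can be illustrated with a dual-ascent toy example (a simplification; the desired loads, capacity, cost curves, and step size are assumptions, not the paper's model): consumers respond to a price signal, and a coordinator raises the tariff until total consumption fits the grid capacity.

```python
# Dual-ascent sketch of tariff-based consumption coordination (assumed
# numbers, not the paper's distributed algorithm). Each consumer minimizes
# (l - desired)^2 + price * l, giving the local response l = desired - price/2;
# the coordinator raises the tariff until total load fits the feeder capacity.
desired = [3.0, 2.0, 4.0]    # preferred consumption per consumer (kW)
capacity = 6.0               # feeder capacity (kW)
price, step = 0.0, 0.5

for _ in range(200):
    loads = [max(0.0, d - price / 2) for d in desired]        # local responses
    price = max(0.0, price + step * (sum(loads) - capacity))  # tariff update

print(round(price, 2), [round(l, 2) for l in loads])  # -> 2.0 [2.0, 1.0, 3.0]
```

    At the equilibrium tariff each consumer has shaved the same amount off its preferred load and the capacity constraint is met exactly, which is the flavor of coordination the record describes.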

  7. Automated forensic DNA purification optimized for FTA card punches and identifiler STR-based PCR analysis.

    Science.gov (United States)

    Tack, Lois C; Thomas, Michelle; Reich, Karl

    2007-03-01

    Forensic labs globally face the same problem: a growing need to process a greater number and wider variety of samples for DNA analysis. The same forensic lab can be tasked all at once with processing mixed casework samples from crime scenes, convicted offender samples for database entry, and tissue from tsunami victims for identification. Besides flexibility in the robotic system chosen for forensic automation, there is a need, for each sample type, to develop new methodology that is not only faster but also more reliable than past procedures. FTA is a chemical treatment of paper, unique to Whatman Bioscience, used for the stabilization and storage of biological samples. Here, the authors describe optimization of the Whatman FTA Purification Kit protocol for use with the AmpFlSTR Identifiler PCR Amplification Kit.

  8. Coordinating a Two-Echelon Supply Chain under Carbon Tax

    Directory of Open Access Journals (Sweden)

    Wei Yu

    2017-12-01

    In this paper, we study the impact of a carbon tax on carbon emissions and retail price in a two-echelon supply chain consisting of a manufacturer and a retailer. Specifically, by adopting two types of contracts, i.e., the modified wholesale price contract (MW) and the modified cost-sharing contract (MS), supply chain coordination is achieved, which improves supply chain efficiency. Our study shows that: (1) with the increase of the carbon tax, both the optimal emission reduction level and the optimal retail price increase, and then remain unchanged; (2) neither MW nor MS benefits the manufacturer after supply chain coordination; and (3) to effectively coordinate the supply chain, we propose an innovative supply chain contract that integrates the firms’ optimal decisions under MW or MS with a two-part tariff contract (TPT) and a fixed fee the retailer can pay to ensure a win–win solution.

  9. Stepwise multi-criteria optimization for robotic radiosurgery

    International Nuclear Information System (INIS)

    Schlaefer, A.; Schweikard, A.

    2008-01-01

    Achieving good conformality and a steep dose gradient around the target volume remains a key aspect of radiosurgery. Clearly, this involves a trade-off between target coverage, conformality of the dose distribution, and sparing of critical structures. Yet, image guidance and robotic beam placement have extended highly conformal dose delivery to extracranial and moving targets. Therefore, the multi-criteria nature of the optimization problem becomes even more apparent, as multiple conflicting clinical goals need to be considered in a coordinated manner to obtain an optimal treatment plan. Typically, planning for robotic radiosurgery is based on constrained optimization, namely linear programming. An extension of that approach is presented, such that each of the clinical goals can be addressed separately and in any sequential order. For a set of common clinical goals the mapping to a mathematical objective and a corresponding constraint is defined. The trade-off among the clinical goals is explored by modifying the constraints and optimizing a simple objective, while retaining feasibility of the solution. Moreover, it becomes immediately obvious whether a desired goal can be achieved and where a trade-off is possible. No importance factors or predefined prioritizations of clinical goals are necessary. The presented framework forms the basis for interactive and automated planning procedures. It is demonstrated for a sample case that the linear programming formulation is suitable to search for a clinically optimal treatment, and that the optimization steps can be performed quickly to establish that a Pareto-efficient solution has been found. Furthermore, it is demonstrated how the stepwise approach is preferable to modifying importance factors.
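
    The stepwise trade-off exploration can be mimicked on a toy discrete plan space (hypothetical scoring functions, not the authors' linear program): optimize the first clinical goal, lock it in as a constraint, then optimize the next.

```python
# Stepwise (lexicographic) multi-criteria sketch with hypothetical scoring
# functions, not the authors' linear-programming formulation: candidate plans
# are scored on target coverage (optimized first) and organ-at-risk dose
# (optimized second, with the first goal locked in as a constraint).
plans = [(w1, w2) for w1 in range(5) for w2 in range(5)]   # discretized beam weights
coverage = lambda p: min(100, 20 * (p[0] + p[1]))          # % of target covered
oar_dose = lambda p: 3 * p[0] + 5 * p[1]                   # dose to a critical organ

best_cov = max(coverage(p) for p in plans)                 # goal 1: coverage
feasible = [p for p in plans if coverage(p) == best_cov]   # lock goal 1 in
best = min(feasible, key=oar_dose)                         # goal 2: sparing
print(best_cov, best, oar_dose(best))  # -> 100 (4, 1) 17
```

    No importance weights are needed: each goal is handled in sequence, and the constraint set makes it immediately visible which trade-offs remain, mirroring the framework described above.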

  10. Automation of On-Board Flightpath Management

    Science.gov (United States)

    Erzberger, H.

    1981-01-01

    The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route, the other optimizes terminal area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.

  11. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    Abstract: Electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world use manual methods to measure power consumption for further assessment of voltage violations; such a process has proved to be time consuming, costly, and inaccurate. In addition, demand response is a grid management technique in which retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would occur when a new load is added to the grid. The process of identifying the optimal bus-bar location involves a complicated calculation of the power consumption at each load bus. As a result, the DPL program considers all the IEEE 30-bus internal network data and then executes a load-flow simulation to add the new load to the first bus in the network. The developed model then simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under-voltage and the loading of elements across the grid.

  12. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to a modular-hierarchical principle. The main operating modes of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis, and parameter control. The concept of the quasi-real experiment, which permits planning of the real experiment, is defined. It is pointed out that carrying out the quasi-real experiment by means of the computerized installation model, with subsequent automated processing, permits examination of the quantitative behavior of the system as a whole and provides optimal design of installation parameters for obtaining maximum resolution.

  13. Coordinated Voltage Control Scheme for VSC-HVDC Connected Wind Power Plants

    DEFF Research Database (Denmark)

    Guo, Yifei; Gao, Houlei; Wu, Qiuwei

    2017-01-01

    This paper proposes a coordinated voltage control scheme based on model predictive control (MPC) for voltage source converter‐based high voltage direct current (VSC‐HVDC) connected wind power plants (WPPs). In the proposed scheme, voltage regulation capabilities of VSC and WTGs are fully utilized...... and optimally coordinated. Two control modes, namely operation optimization mode and corrective mode, are designed to coordinate voltage control and economic operation of the system. In the first mode, the control objective includes the bus voltages, power losses and dynamic Var reserves of wind turbine...

  14. NASA Systems Autonomy Demonstration Project - Development of Space Station automation technology

    Science.gov (United States)

    Bull, John S.; Brown, Richard; Friedland, Peter; Wong, Carla M.; Bates, William

    1987-01-01

    A 1984 Congressional expansion of the 1958 National Aeronautics and Space Act mandated that NASA conduct programs, as part of the Space Station program, which will yield the U.S. material benefits, particularly in the areas of advanced automation and robotics systems. Demonstration programs are scheduled for automated systems such as the thermal control, expert system coordination of Station subsystems, and automation of multiple subsystems. The programs focus the R&D efforts and provide a gateway for transfer of technology to industry. The NASA Office of Aeronautics and Space Technology is responsible for directing, funding and evaluating the Systems Autonomy Demonstration Project, which will include simulated interactions between novice personnel and astronauts and several automated, expert subsystems to explore the effectiveness of the man-machine interface being developed. Features and progress on the TEXSYS prototype thermal control system expert system are outlined.

  15. Integrated optimization of location assignment and sequencing in multi-shuttle automated storage and retrieval systems under modified 2n-command cycle pattern

    Science.gov (United States)

    Yang, Peng; Peng, Yongfei; Ye, Bin; Miao, Lixin

    2017-09-01

    This article explores the integrated optimization problem of location assignment and sequencing in multi-shuttle automated storage/retrieval systems under the modified 2n-command cycle pattern. The decision of storage and retrieval (S/R) location assignment and S/R request sequencing are jointly considered. An integer quadratic programming model is formulated to describe this integrated optimization problem. The optimal travel cycles for multi-shuttle S/R machines can be obtained to process S/R requests in the storage and retrieval request order lists by solving the model. The small-sized instances are optimally solved using CPLEX. For large-sized problems, two tabu search algorithms are proposed, in which the first come, first served and nearest neighbour are used to generate initial solutions. Various numerical experiments are conducted to examine the heuristics' performance and the sensitivity of algorithm parameters. Furthermore, the experimental results are analysed from the viewpoint of practical application, and a parameter list for applying the proposed heuristics is recommended under different real-life scenarios.
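
    The heuristic ingredients named above (first come, first served / nearest neighbour construction plus tabu search) can be sketched on a toy instance; the positions, Chebyshev travel-time metric, and tabu tenure below are assumptions, not the article's parameters.

```python
# Toy sketch of the heuristic ingredients (assumed details, not the article's
# implementation): nearest-neighbour construction of an S/R sequence followed
# by a pair-swap tabu search; travel time is the Chebyshev distance, since an
# S/R machine moves horizontally and vertically at the same time.
pos = [(0, 0), (3, 2), (1, 4), (5, 1), (2, 2), (4, 4)]   # depot + 5 S/R locations
dist = lambda a, b: max(abs(a[0] - b[0]), abs(a[1] - b[1]))
cost = lambda s: sum(dist(pos[s[i]], pos[s[i + 1]]) for i in range(len(s) - 1))

seq, rest = [0], set(range(1, len(pos)))                 # nearest-neighbour start
while rest:
    nxt = min(rest, key=lambda j: dist(pos[seq[-1]], pos[j]))
    seq.append(nxt); rest.remove(nxt)

def swap(s, i, j):
    s = list(s); s[i], s[j] = s[j], s[i]; return s

best, cur, tabu = list(seq), list(seq), []               # tabu search on swaps
for _ in range(30):
    cand = [(i, j) for i in range(1, len(cur))
            for j in range(i + 1, len(cur)) if (i, j) not in tabu]
    i, j = min(cand, key=lambda m: cost(swap(cur, *m)))  # best non-tabu move
    cur = swap(cur, i, j)
    tabu = (tabu + [(i, j)])[-3:]                        # short tabu tenure
    if cost(cur) < cost(best):
        best = list(cur)

print(cost(seq), cost(best) <= cost(seq))  # -> 11 True
```

    The tabu list forbids recently reversed swaps, which is what lets the search accept temporarily worse sequences and escape local optima; the article's heuristics apply the same mechanism to the joint location-assignment and sequencing decisions.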

  16. Quantitative analysis of spider locomotion employing computer-automated video tracking

    DEFF Research Database (Denmark)

    Baatrup, E; Bayley, M

    1993-01-01

    The locomotor activity of adult specimens of the wolf spider Pardosa amentata was measured in an open-field setup, using computer-automated colour object video tracking. The x,y coordinates of the animal in the digitized image of the test arena were recorded three times per second during four...

  17. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.

  18. Crew/Automation Interaction in Space Transportation Systems: Lessons Learned from the Glass Cockpit

    Science.gov (United States)

    Rudisill, Marianne

    2000-01-01

    The progressive integration of automation technologies in commercial transport aircraft flight decks - the 'glass cockpit' - has had a major, and generally positive, impact on flight crew operations. Flight deck automation has provided significant benefits, such as economic efficiency, increased precision and safety, and enhanced functionality within the crew interface. These enhancements, however, may have been accrued at a price, such as complexity added to crew/automation interaction that has been implicated in a number of aircraft incidents and accidents. This report briefly describes 'glass cockpit' evolution. Some relevant aircraft accidents and incidents are described, followed by a more detailed description of human/automation issues and problems (e.g., crew error, monitoring, modes, command authority, crew coordination, workload, and training). This paper concludes with example principles and guidelines for considering 'glass cockpit' human/automation integration within space transportation systems.

  19. Optimal Coordination Strategy of Regional Vertical Emission Abatement Collaboration in a Low-Carbon Environment

    Directory of Open Access Journals (Sweden)

    Daming You

    2018-02-01

    This study introduces a time factor into a low-carbon context and models the contamination control state of the local government and the emission abatement ability of the polluting enterprise as linearly increasing functions in a regional low-carbon emission abatement cooperation chain. The local government effectuates and upholds low-carbon development within its jurisdiction, primarily seeking to transform regional economic development modes, while the polluting enterprise abates the amount of emitted carbon over the entire product period by simplifying production, facilitating decontamination, and adopting cleaner production technology, thus leading to less contamination. On that basis, a coordinated joint carbon reduction model and two decentralization contracts are derived to expound the dynamic coordination strategy for a regional cooperation chain in terms of vertical carbon abatement. Furthermore, feedback equilibrium strategies concerned with several diverse conditions are compared and analyzed. The main results show that a collaborative centralized contract is able to promote the regional low-carbon cooperation chain to achieve a win–win situation in both economic and environmental performance. Additionally, the optimal profit of the entire regional low-carbon cooperation channel under the integration scenario evidently outstrips that of the two non-collaborative decentralization schemes. Eventually, the validity of the conclusions is verified with a case description and numerical simulation, and the sensitivity of the relevant parameters is analyzed in order to lay a theoretical foundation and facilitate the sustainable development of a regional low-carbon environment.

  20. Control of coordination abilities in striking combat sports

    Directory of Open Access Journals (Sweden)

    Natalya Boychenko

    2014-12-01

    Purpose: to optimize the monitoring of the level of coordination abilities in martial arts. Material and Methods: analysis and compilation of the scientific and methodological literature, interviews with coaches of striking martial arts, video analysis of techniques, and teacher observations. Results: specific types of coordination abilities in striking combat sports were identified, and specific and nonspecific tests were selected and proposed to monitor the level of these coordination abilities in athletes. Conclusion: to achieve victory in a bout, martial artists must orient themselves in space and be able to assess and manage the dynamic and spatio-temporal parameters of their movements, maintain balance, and display a high level of coordination. The proposed tests for monitoring specific coordination abilities allow an objective assessment not only of the overall level of coordination but also of the level of its specific manifestations.

  1. EPOS for Coordination of Asynchronous Sensor Webs

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop, integrate, and deploy software-based tools to coordinate asynchronous, distributed missions and optimize observation planning spanning simultaneous...

  2. Capacity Impacts and Optimal Geometry of Automated Cars’ Surface Parking Facilities

    Directory of Open Access Journals (Sweden)

    You Kong

    2018-01-01

    The impact of Automated Vehicles (AVs) on urban geography has been widely speculated, though there is little quantitative evidence in the literature to establish the magnitude of such effects. To quantify the impact of the greater precision of automated driving on the spatial efficiency of off-street parking facilities, we develop a mixed integer nonlinear model (solved via a branch-and-cut approach) and present comparisons against industry-standard requirements for human-driving operation. We demonstrate that gains on the order of 40–50% in spatial efficiency (parking spaces per unit area) are in principle achievable while ensuring that each parked vehicle is independently accessible. We further show that the large majority of these efficiency gains can be obtained under current automotive engineering practice in which only the front two wheels pivot. There is a need for standardized methods that take the parking supply of a city as an input and calculate both the aggregate (citywide) efficiency impacts of automated driving and the spatial distribution of the effects. This study is intended as an initial step towards this objective.
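
    A back-of-envelope check makes the reported 40–50% gain plausible; every dimension below is an assumption for illustration, not a value from the paper's mixed-integer model.

```python
# Back-of-envelope check of the reported spatial-efficiency gain; every
# dimension here is an assumption, not a value from the paper's model.
car_l = 4.8                                   # vehicle length (m)
human_stall_w, human_aisle = 2.6, 6.0         # door clearance + wide aisle
av_stall_w, av_aisle = 2.1, 3.5               # tight stalls, narrow aisle

# area charged to one stall; the aisle is shared by two facing rows
area = lambda stall_w, aisle: stall_w * (car_l + aisle / 2)
gain = area(human_stall_w, human_aisle) / area(av_stall_w, av_aisle) - 1
print(round(gain * 100))  # -> 47 (% more spaces per unit area)
```

    The gain comes from two places at once, narrower stalls (no door clearance once vehicles self-park) and a narrower shared aisle (precise maneuvering), which is why it compounds into the 40–50% range.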

  3. Routing Optimization of Intelligent Vehicle in Automated Warehouse

    Directory of Open Access Journals (Sweden)

    Yan-cong Zhou

    2014-01-01

    Routing optimization is a key technology in intelligent warehouse logistics. In order to obtain an optimal route for a warehouse intelligent vehicle, routing optimization in a complex, globally dynamic environment is studied. A new evolutionary ant colony algorithm based on RFID and knowledge refinement is proposed. The new algorithm obtains environmental information in a timely manner through RFID technology and updates the environment map at the same time. It adopts elite-ant retention, fallback, and pheromone-limit adjustment strategies. The current optimal route in the population space is optimized based on experiential knowledge. The experimental results show that the new algorithm has a higher convergence speed and can easily escape U-type or V-type obstacle traps. It can also find the global optimal route, or an approximately optimal one, with higher probability in a complex dynamic environment. The new algorithm is proved feasible and effective by simulation results.
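
    The pheromone mechanics underlying such an ant colony algorithm (evaporation plus deposit proportional to route quality) can be shown deterministically on a two-route toy map; the map and all parameters are assumptions, not the paper's settings.

```python
# Deterministic illustration of ant-colony pheromone mechanics (evaporation
# plus deposit proportional to route quality); the two-route map and all
# parameters are assumptions, not the paper's settings.
routes = {"A": 4.0, "B": 6.0}     # lengths of two candidate routes
tau = {r: 1.0 for r in routes}    # initial pheromone
rho, q = 0.5, 10.0                # evaporation rate, deposit constant

for _ in range(20):               # each sweep: evaporate, then deposit q/length
    for r, length in routes.items():
        tau[r] = (1 - rho) * tau[r] + q / length

p_a = tau["A"] / (tau["A"] + tau["B"])   # choice probability (alpha = 1)
print(round(tau["A"], 2), round(tau["B"], 2), round(p_a, 2))  # -> 5.0 3.33 0.6
```

    Pheromone settles at q/(rho × length), so the shorter route accumulates more trail and is chosen more often; the paper's pheromone-limit adjustment caps exactly this quantity to keep the colony from locking in prematurely.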

  4. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    Science.gov (United States)

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for other species. Objectives of this work included: a) Creating an automated, standardized pipeline for SV prediction. b) Identifying the best tool(s) for SV prediction through benchmarking. c) Providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically-sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and the benchmarking of tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge

  5. Multi-net optimization of VLSI interconnect

    CERN Document Server

    Moiseev, Konstantin; Wimer, Shmuel

    2015-01-01

    This book covers layout design and layout migration methodologies for optimizing multi-net wire structures in advanced VLSI interconnects. Scaling-dependent models for interconnect power, interconnect delay and crosstalk noise are covered in depth, and several design optimization problems are addressed, such as minimization of interconnect power under delay constraints, or design for minimal delay in wire bundles within a given routing area. A handy reference or a guide for design methodologies and layout automation techniques, this book provides a foundation for physical design challenges of interconnect in advanced integrated circuits.  • Describes the evolution of interconnect scaling and provides new techniques for layout migration and optimization, focusing on multi-net optimization; • Presents research results that provide a level of design optimization which does not exist in commercially-available design automation software tools; • Includes mathematical properties and conditions for optimal...

  6. Automated drug dispensing system reduces medication errors in an intensive care setting.

    Science.gov (United States)

    Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick

    2010-12-01

    We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study unit compared to the control unit (13.5% and 18.6%, respectively; p < 0.05), and compared to the preintervention rate in the study unit (20.4% and 13.5%; p < 0.05). Analysis by stage of the medication process showed a significant impact of the automated dispensing system in reducing preparation errors (p < 0.05). Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm. Finally, the mean score for working conditions improved from 1.0±0.8 to 2.5±0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensation organization.

  7. Optimal truss and frame design from projected homogenization-based topology optimization

    DEFF Research Database (Denmark)

    Larsen, S. D.; Sigmund, O.; Groen, J. P.

    2018-01-01

    In this article, we propose a novel method to obtain a near-optimal frame structure, based on the solution of a homogenization-based topology optimization model. The presented approach exploits the equivalence between Michell’s problem of least-weight trusses and a compliance minimization problem...... using optimal rank-2 laminates in the low volume fraction limit. In a fully automated procedure, a discrete structure is extracted from the homogenization-based continuum model. This near-optimal structure is post-optimized as a frame, where the bending stiffness is continuously decreased, to allow...

  8. Optimized and Automated Radiosynthesis of [18F]DHMT for Translational Imaging of Reactive Oxygen Species with Positron Emission Tomography

    Directory of Open Access Journals (Sweden)

    Wenjie Zhang

    2016-12-01

    Reactive oxygen species (ROS) play important roles in cell signaling and homeostasis. However, an abnormally high level of ROS is toxic and is implicated in a number of diseases. Positron emission tomography (PET) imaging of ROS can assist in the detection of these diseases. For the purpose of clinical translation of [18F]6-(4-((1-(2-fluoroethyl)-1H-1,2,3-triazol-4-yl)methoxy)phenyl)-5-methyl-5,6-dihydrophenanthridine-3,8-diamine ([18F]DHMT), a promising ROS PET radiotracer, we first manually optimized the large-scale radiosynthesis conditions and then implemented them in an automated synthesis module. Our manual synthesis procedure afforded [18F]DHMT in 120 min with an overall radiochemical yield (RCY) of 31.6% ± 9.3% (n = 2, decay-uncorrected) and specific activity of 426 ± 272 GBq/µmol (n = 2). Fully automated radiosynthesis of [18F]DHMT was achieved within 77 min with an overall isolated RCY of 6.9% ± 2.8% (n = 7, decay-uncorrected) and specific activity of 155 ± 153 GBq/µmol (n = 7) at the end of synthesis. This study is the first demonstration of producing 2-[18F]fluoroethyl azide with an automated module, which can be used for a variety of PET tracers through click chemistry. It is also the first time that [18F]DHMT was successfully tested for PET imaging in a healthy beagle dog.

  9. Numerical construction of the p(fold) (committor) reaction coordinate for a Markov process.

    Science.gov (United States)

    Krivov, Sergei V

    2011-10-06

    To simplify the description of a complex multidimensional dynamical process, one often projects it onto a single reaction coordinate. In protein folding studies, the folding probability p(fold) is an optimal reaction coordinate which preserves many important properties of the dynamics. The construction of the coordinate is difficult. Here, an efficient numerical approach to construct the p(fold) reaction coordinate for a Markov process (satisfying the detailed balance) is described. The coordinate is obtained by optimizing parameters of a chosen functional form to make a generalized cut-based free energy profile the highest. The approach is illustrated by constructing the p(fold) reaction coordinate for the equilibrium folding simulation of FIP35 protein reported by Shaw et al. (Science 2010, 330, 341-346). © 2011 American Chemical Society
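
    The committor's defining equations can be solved directly for a small Markov chain; the sketch below uses fixed-point iteration on an unbiased five-state random walk (a toy, not the paper's cut-based optimization procedure).

```python
# Toy committor computation (not the paper's cut-based optimization): for a
# Markov chain the committor solves q(i) = sum_j P_ij q(j) with q(A) = 0 and
# q(B) = 1; here, fixed-point sweeps on an unbiased 5-state random walk.
n = 5
q = [0.0] * n
q[n - 1] = 1.0                        # boundary states: q(A) = 0, q(B) = 1
for _ in range(2000):                 # Gauss-Seidel-style sweeps
    for i in range(1, n - 1):         # interior: move left/right with prob 1/2
        q[i] = 0.5 * (q[i - 1] + q[i + 1])

print([round(v, 3) for v in q])       # -> [0.0, 0.25, 0.5, 0.75, 1.0]
```

    For the unbiased walk the committor is linear in the state index, matching the iterate; the paper's contribution is obtaining such a q for high-dimensional simulation data by optimizing a parametrized functional form instead of solving the equations state by state.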

  10. An integrated system for buildings’ energy-efficient automation: Application in the tertiary sector

    International Nuclear Information System (INIS)

    Marinakis, Vangelis; Doukas, Haris; Karakosta, Charikleia; Psarras, John

    2013-01-01

    Highlights: ► We developed an interactive software for building automation systems. ► Monitoring of energy consumption in real time. ► Optimization of energy consumption implementing appropriate control scenarios. ► Pilot appraisal on remote control of active systems in the tertiary sector building. ► Significant decrease in energy and operating cost of A/C system. -- Abstract: Although integrated building automation systems have become increasingly popular, an integrated system which includes remote control technology to enable real-time monitoring of the energy consumption by energy end-users, as well as optimization functions is required. To respond to this common interest, the main aim of the paper is to present an integrated system for buildings’ energy-efficient automation. The proposed system is based on a prototype software tool for the simulation and optimization of energy consumption in the building sector, enhancing the interactivity of building automation systems. The system can incorporate energy-efficient automation functions for heating, cooling and/or lighting based on recent guidance and decisions of the National Law, energy efficiency requirements of EN 15232 and ISO 50001 Energy Management Standard among others. The presented system was applied to a supermarket building in Greece and focused on the remote control of active systems.

  11. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the necessary processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  12. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation (CAA) effort, with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, analysis, and data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method from standardized SLMs easily, without worrying about hardware compatibility or needing to generate complicated control programs.

  13. SciBox, an end-to-end automated science planning and commanding system

    Science.gov (United States)

    Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott

    2014-01-01

    SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.

  14. Automated Prescription of Oblique Brain 3D MRSI

    OpenAIRE

    Ozhinsky, Eugene; Vigneron, Daniel B.; Chang, Susan M.; Nelson, Sarah J.

    2012-01-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty in prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the ob...

  15. Automation of Personnel Certification for Roadbuilding Departments and Enterprises

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article suggests science-based solutions for improving the efficiency of personnel certification in road-construction departments and enterprises by developing an integrated learning environment based on an automated distance-learning system with open-source software and methodological support tailored to professional standards. The personnel certification procedure of road-construction departments and enterprises is analyzed with a view to its optimization and automation. Optimization solutions are proposed and implemented for all stages of preparing and conducting conformance testing: creating test tasks, batch import of test tasks into a test-task database, and an adaptive algorithm for presenting the tests. The developed adaptive presentation technique can perform various functions, depending on the settings of the algorithm.

  16. Automated Prescription of Oblique Brain 3D MRSI

    Science.gov (United States)

    Ozhinsky, Eugene; Vigneron, Daniel B.; Chang, Susan M.; Nelson, Sarah J.

    2012-01-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty in prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the oblique selection box parameters, optimization of the placement of OVS saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from 6 exams from 3 healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, the data were collected from 16 exams from 8 subjects with gliomas. This technique demonstrated robust coverage of the tumor, high consistency of prescription and very good data quality within the T2 lesion. PMID:22692829

  17. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  18. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    Science.gov (United States)

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionary robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific evolutionary robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.

  19. Making progress with the automation of systematic reviews: principles of the International Collaboration for the Automation of Systematic Reviews (ICASR).

    Science.gov (United States)

    Beller, Elaine; Clark, Justin; Tsafnat, Guy; Adams, Clive; Diehl, Heinz; Lund, Hans; Ouzzani, Mourad; Thayer, Kristina; Thomas, James; Turner, Tari; Xia, Jun; Robinson, Karen; Glasziou, Paul

    2018-05-19

    Systematic reviews (SR) are vital to health care, but have become complicated and time-consuming due to the rapid expansion of evidence to be synthesised. Fortunately, many tasks of systematic reviews have the potential to be automated or may be assisted by automation. Recent advances in natural language processing, text mining and machine learning have produced new algorithms that can accurately mimic human endeavour in systematic review activity, faster and more cheaply. Automation tools need to be able to work together, to exchange data and results. Therefore, we initiated the International Collaboration for the Automation of Systematic Reviews (ICASR), to successfully put all the parts of automation of systematic review production together. The first meeting was held in Vienna in October 2015, where we established a set of principles to enable tools to be developed and integrated into toolkits. This paper sets out the principles devised at that meeting, which cover the need for improvement in the efficiency of SR tasks, automation across the spectrum of SR tasks, continuous improvement, adherence to high quality standards, flexibility of use and combining of components, the need for collaboration and varied skills, the desire for open source, shared code and evaluation, and a requirement for replicability through rigorous and open evaluation. Automation has great potential to improve the speed of systematic reviews. Considerable work is already being done on many of the steps involved in a review. The 'Vienna Principles' set out in this paper aim to guide a more coordinated effort, which will allow the integration of work by separate teams and build on the experience, code and evaluations done by the many teams working across the globe.

  20. Human-centered automation of testing, surveillance and maintenance

    International Nuclear Information System (INIS)

    Bhatt, S.C.; Sun, B.K.H.

    1991-01-01

    Manual surveillance and testing of instrumentation, control and protection systems at nuclear power plants involve system and human errors which can lead to substantial plant down time. Frequent manual testing can also contribute significantly to operation and maintenance cost. Automation technology offers the potential for prudent applications at the power plant to reduce testing errors and cost. To help address these testing problems and to harness the benefits of automation, input from utilities was obtained on suitable automation approaches. This paper includes lessons from successful past experience at a few plants where some islands of automation exist. The results are summarized as a set of specifications for semi-automatic testing. A human-centered automation methodology is proposed, with guidelines given for the optimal human/computer division of tasks. Implementation obstacles to significant changes in testing practices are identified, and methods acceptable to nuclear power plants for addressing these obstacles are suggested.

  1. Experimental optimization of a direct injection homogeneous charge compression ignition gasoline engine using split injections with fully automated microgenetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Canakci, M. [Kocaeli Univ., Izmit (Turkey); Reitz, R.D. [Wisconsin Univ., Dept. of Mechanical Engineering, Madison, WI (United States)

    2003-03-01

    Homogeneous charge compression ignition (HCCI) is receiving attention as a new low-emission engine concept. Little is known about the optimal operating conditions for this engine operation mode. Combustion under homogeneous, low equivalence ratio conditions results in modest temperature combustion products, containing very low concentrations of NO{sub x} and particulate matter (PM) as well as providing high thermal efficiency. However, this combustion mode can produce higher HC and CO emissions than those of conventional engines. An electronically controlled Caterpillar single-cylinder oil test engine (SCOTE), originally designed for heavy-duty diesel applications, was converted to an HCCI direct injection (DI) gasoline engine. The engine features an electronically controlled low-pressure direct injection gasoline (DI-G) injector with a 60 deg spray angle that is capable of multiple injections. The use of double injection was explored for emission control and the engine was optimized using fully automated experiments and a microgenetic algorithm optimization code. The variables changed during the optimization include the intake air temperature, start of injection timing and the split injection parameters (per cent mass of fuel in each injection, dwell between the pulses). The engine performance and emissions were determined at 700 r/min with a constant fuel flowrate at 10 MPa fuel injection pressure. The results show that significant emissions reductions are possible with the use of optimal injection strategies. (Author)
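The micro-genetic algorithm used in this record works with a very small population that is repeatedly restarted around the best individual once it converges. The following is a generic, hedged sketch of that scheme, not the authors' engine-optimization code; the population size, restart rule, operators, and the toy fitness in the usage note are illustrative assumptions.

```python
import random

def micro_ga(fitness, bounds, pop_size=5, generations=300, tol=0.05):
    """Micro-GA sketch: tiny population, tournament selection and uniform
    crossover, no mutation; when the population collapses onto the elite,
    restart it with fresh random individuals (elitism keeps the best)."""
    def rand_ind():
        return [random.uniform(lo, hi) for lo, hi in bounds]

    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[0]
        # restart criterion: has the population converged onto the elite?
        spread = max(sum(abs(a - b) for a, b in zip(ind, elite))
                     for ind in scored)
        if spread < tol:
            pop = [elite] + [rand_ind() for _ in range(pop_size - 1)]
            continue

        def tournament():
            a, b = random.sample(scored, 2)
            return a if fitness(a) >= fitness(b) else b

        # uniform crossover between two tournament winners, elite carried over
        pop = [elite] + [[random.choice(genes)
                          for genes in zip(tournament(), tournament())]
                         for _ in range(pop_size - 1)]
    return max(pop, key=fitness)
```

In the real study each fitness evaluation was a fully automated engine experiment; here `fitness` could be any callable, e.g. `micro_ga(lambda x: -(x[0] - 1.0) ** 2, [(-5.0, 5.0)])`.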

  2. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics, we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via the Internet. The system was used for 14C, 10Be, 26Al and 129I measurements.
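A noise-robust, derivative-free optimizer of the general kind described in this record can be sketched as a compass (coordinate) search that averages repeated readings of a noisy objective before accepting a move. This is an illustrative sketch, not VERA's actual tuning algorithm; all names and parameters are assumptions.

```python
def noisy_average(f, x, n):
    """Average n repeated readings of a possibly noisy measurement f(x)."""
    return sum(f(x) for _ in range(n)) / n

def compass_search(f, x0, step=1.0, shrink=0.5, min_step=1e-3, samples=5):
    """Derivative-free maximization robust to measurement noise: probe
    +/- step along each axis, averaging repeated readings; shrink the
    step once no probe improves the averaged objective."""
    x = list(x0)
    best = noisy_average(f, x, samples)
    while step > min_step:
        improved = False
        for i in range(len(x)):
            for d in (+step, -step):
                cand = x[:]
                cand[i] += d
                val = noisy_average(f, cand, samples)
                if val > best:
                    x, best, improved = cand, val, True
        if not improved:
            step *= shrink          # refine the search scale
    return x
```

In an accelerator-tuning setting, `f` would be a beam-transmission reading as a function of ion-optics settings; averaging trades measurement time for robustness to shot-to-shot noise.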

  3. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human errors or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  4. Synthetic Teammates as Team Players: Coordination of Human and Synthetic Teammates

    Science.gov (United States)

    2016-05-31

    teammate interactions with human teammates reveal about human-automation coordination needs? Subject terms: synthetic teammate, human-autonomy teaming. ...interacting with autonomy - not autonomous vehicles, but autonomous teammates. These experiments have led to a number of discoveries, including: ...given the preponderance of text-based communications in our society and its adoption in time-critical military and civilian contexts, the

  5. Physics of automated driving in framework of three-phase traffic theory.

    Science.gov (United States)

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
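The "classical" automated-driving model that this record contrasts against can be sketched as a constant-time-headway car-following law: the vehicle accelerates in proportion to the error between the actual gap and the desired gap s0 + T·v, plus the speed difference to the leader. The gains and parameter values below are illustrative assumptions, not taken from the paper.

```python
def simulate_following(v_lead=20.0, T=1.5, s0=2.0, k1=0.23, k2=0.74,
                       gap0=50.0, v0=15.0, dt=0.1, steps=2000):
    """Classical fixed-time-headway controller: accelerate on the spacing
    error (gap - s0 - T*v) and on the speed difference to the leader,
    integrated with a simple semi-implicit Euler step."""
    gap, v = gap0, v0
    for _ in range(steps):
        a = k1 * (gap - s0 - T * v) + k2 * (v_lead - v)
        v += a * dt                      # update follower speed
        gap += (v_lead - v) * dt         # update inter-vehicle gap
    return gap, v
```

At equilibrium the follower matches the leader's speed and holds exactly the desired gap s0 + T*v_lead (with these values, 2 + 1.5*20 = 32 m); the three-phase approach described above dispenses with this fixed desired headway.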

  6. Physics of automated driving in framework of three-phase traffic theory

    Science.gov (United States)

    Kerner, Boris S.

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  7. Secure Automated Microgrid Energy System

    Science.gov (United States)

    2016-12-01

    Abbreviations: O&M, Operations and Maintenance; PSO, Power System Optimization; PV, Photovoltaic; RAID, Redundant Array of Independent Disks; RBAC, Role... elements of the initial study and operational power system model (feeder size, protective devices, generation sources, controllable loads, transformers... EW-201340) Secure Automated Microgrid Energy System, December 2016. This document has been cleared for public release; Distribution Statement A

  8. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  9. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  10. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  11. Improving operating room coordination: communication pattern assessment.

    Science.gov (United States)

    Moss, Jacqueline; Xiao, Yan

    2004-02-01

    To capture communication patterns in operating room (OR) management to characterize the information needs of OR coordination. Technological applications can be used to change system processes to improve communication and information access, thereby decreasing errors and adverse events. The successful design of such applications relies on an understanding of communication patterns among healthcare professionals. Charge nurse communication was observed and documented at four OR suites at three tertiary hospitals. The data collection tool allowed rapid coding of communication patterns in terms of duration, mode, target person, and the purpose of each communication episode. Most (69.24%) of the 2074 communication episodes observed occurred face to face. Coordinating equipment was the most frequently occurring purpose of communication (38.7%) in all suites. The frequencies of other purposes, in decreasing order, were coordinating patient preparedness (25.7%), staffing (18.8%), room assignment (10.7%), and scheduling and rescheduling surgery (6.2%). The results of this study suggest that automating aspects of preparing patients for surgery and surgical equipment management has the potential to reduce information exchange, decreasing interruptions to clinicians and diminishing the possibility of adverse events in the clinical setting.

  12. Coordinate measurement machines as an alignment tool

    International Nuclear Information System (INIS)

    Wand, B.T.

    1991-03-01

    In February of 1990 the Stanford Linear Accelerator Center (SLAC) purchased a LEITZ PM 12-10-6 CMM (coordinate measurement machine). The machine is shared by the Quality Control Team and the Alignment Team. One of the alignment tasks in positioning beamline components in a particle accelerator is to define the component's magnetic centerline relative to external fiducials. This procedure, called fiducialization, is critical to the overall positioning tolerance of a magnet. It involves the definition of the magnetic centerline with respect to the mechanical centerline and the transfer of the mechanical centerline to the external fiducials. To perform the latter, a magnet coordinate system has to be established. This means defining an origin and the three rotation angles of the magnet. The datum definition can be done by either optical tooling techniques or with a CMM. As optical tooling measurements are very time-consuming, not automated, and prone to errors, it is desirable to use the CMM fiducialization method instead. The establishment of a magnet coordinate system based on the mechanical center and the transfer to external fiducials will be discussed and presented with 2 examples from the Stanford Linear Collider (SLC). 7 figs
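Establishing a magnet coordinate system from measured fiducials amounts to fitting a rigid transform between two point sets. Below is a minimal 2D sketch of the closed-form least-squares rotation and translation (2D Procrustes/Kabsch); an actual CMM fiducialization is three-dimensional and uses the machine's own software, so this is only an illustration of the underlying math.

```python
from math import atan2, cos, sin

def fit_rigid_2d(src, dst):
    """Least-squares rotation theta and translation t mapping the src
    points onto the dst points: minimize sum |R p_i + t - q_i|^2."""
    n = len(src)
    sx = sum(p[0] for p in src) / n          # source centroid
    sy = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n          # destination centroid
    dy = sum(q[1] for q in dst) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        ax, ay = px - sx, py - sy            # centered source point
        bx, by = qx - dx, qy - dy            # centered destination point
        num += ax * by - ay * bx             # cross terms -> sin(theta)
        den += ax * bx + ay * by             # dot terms  -> cos(theta)
    theta = atan2(num, den)
    c, s = cos(theta), sin(theta)
    tx = dx - (c * sx - s * sy)              # t = q_bar - R p_bar
    ty = dy - (s * sx + c * sy)
    return theta, (tx, ty)
```

With at least two non-coincident fiducials the fit is unique; extra fiducials over-determine the transform and average out measurement noise.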

  13. Distribution Loss Reduction by Household Consumption Coordination in Smart Grids

    DEFF Research Database (Denmark)

    Juelsgaard, Morten; Andersen, Palle; Wisniewski, Rafal

    2014-01-01

    for coordinating consumption of electrical energy within the community, with the purpose of reducing grid loading and active power losses. For this we present a simplified model of the electrical grid, including system losses and capacity constraints. Coordination is performed in a distributed fashion, where each...... are obeyed. These objectives are enforced by coordinating consumers through a nonlinear penalty on power consumption. We present simulation test-cases, illustrating that significant reduction of active losses, can be obtained by such coordination. The distributed optimization algorithm employs...
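The kind of distributed coordination described in this record can be sketched as a Gauss-Seidel scheme in which each household re-plans its flexible consumption against the current aggregate, flattening the total profile (a proxy for reducing resistive I²R losses). All names and the water-filling best response below are illustrative assumptions, not the paper's algorithm.

```python
def fill_one(other, energy):
    """One household's best response: spread `energy` over the slots to
    minimize sum_t (other_t + x_t)^2 with x_t >= 0 (water-filling)."""
    lo, hi = min(other), max(other) + energy
    for _ in range(60):                       # bisection on the water level
        lam = 0.5 * (lo + hi)
        if sum(max(0.0, lam - o) for o in other) > energy:
            hi = lam
        else:
            lo = lam
    return [max(0.0, lam - o) for o in other]

def coordinate(energies, slots, base, rounds=20):
    """Gauss-Seidel coordination: households take turns re-planning their
    flexible energy against the aggregate of the base load and everyone
    else's plan; returns the final aggregate profile per slot."""
    plans = [[e / slots] * slots for e in energies]
    for _ in range(rounds):
        for h, e in enumerate(energies):
            other = [base[t] + sum(p[t] for j, p in enumerate(plans) if j != h)
                     for t in range(slots)]
            plans[h] = fill_one(other, e)
    return [base[t] + sum(p[t] for p in plans) for t in range(slots)]
```

With enough flexible energy relative to the base load, the aggregate converges to a flat profile, which minimizes the sum of squared slot loads.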

  14. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  15. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  16. Optimizing nitrogen fertilizer application to irrigated wheat. Results of a co-ordinated research project. 1994-1998

    International Nuclear Information System (INIS)

    2000-07-01

    This TECDOC summarizes the results of a Co-ordinated Research Project (CRP) on the Use of Nuclear Techniques for Optimizing Fertilizer Application under Irrigated Wheat to Increase the Efficient Use of Nitrogen Fertilizer and Consequently Reduce Environmental Pollution. The project was carried out between 1994 and 1998 through the technical co-ordination of the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. Fourteen Member States of the IAEA and FAO carried out a series of field experiments aimed at improving irrigation water and fertilizer-N uptake efficiencies through integrated management of the complex interactions involving inputs, soils, climate, and wheat cultivars. Its goals were: to investigate various aspects of fertilizer N uptake efficiency of wheat crops under irrigation through an interregional research network involving countries growing large areas of irrigated wheat; to use 15N and the soil-moisture neutron probe to determine the fate of applied N, to follow water and nitrate movement in the soil, and to determine water balance and water-use efficiency in irrigated wheat cropping systems; to use the data generated to further develop and refine various relationships in the Ceres-Wheat computer simulation model; to use the knowledge generated to produce a N-rate-recommendation package to refine specific management strategies with respect to fertilizer applications and expected yields.

  17. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
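
The first step of such an analysis, separating spatially clustered defects from random scatter, can be illustrated with a simple density-based grouping of defect coordinates. This is a DBSCAN-like sketch, not the patented method; the `eps` radius and `min_pts` threshold are hypothetical values.

```python
import math

def cluster_defects(points, eps=5.0, min_pts=3):
    """Greedily group defect coordinates whose neighbors lie within eps.

    Returns clusters of at least min_pts defects; defects left in smaller
    groups can be treated as random scatter rather than a spatial signature.
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster = [seed]
        frontier = [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= eps]
            for j in near:
                unvisited.remove(j)
                cluster.append(j)
                frontier.append(j)
        if len(cluster) >= min_pts:
            clusters.append(cluster)
    return clusters

# a tight scratch-like line of defects plus two isolated hits
pts = [(0, 0), (1, 0), (2, 0), (3, 0), (50, 50), (90, 10)]
clusters = cluster_defects(pts)
```

The high-level categorization and the correlation to process conditions described in the abstract would then operate on these clusters rather than on raw coordinates.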

  18. Exploring the Lived Experiences of Program Managers Regarding an Automated Logistics Environment

    Science.gov (United States)

    Allen, Ronald Timothy

    2014-01-01

    Automated Logistics Environment (ALE) is a new term used by Navy and aerospace industry executives to describe the aggregate of logistics-related information systems that support modern aircraft weapon systems. The development of logistics information systems is not always well coordinated among programs, often resulting in solutions that cannot…

  19. Protection coordination of the Kennedy Space Center electric distribution network

    Science.gov (United States)

    1976-01-01

    A computer technique is described for visualizing the coordination and protection of any existing system of devices and settings by plotting the tripping characteristics of the involved devices on a common basis. The program determines the optimum settings of a given set of protective devices and configuration in the sense of the best expected coordinated operation of these devices. Subroutines are given for simulating time versus current characteristics of the different relays, circuit breakers, and fuses in the system; coordination index computation; protection checks; plotting; and coordination optimization.
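
A coordination check of the kind described can be illustrated with the IEC 60255 standard-inverse relay characteristic, t = TMS × 0.14 / ((I/Is)^0.02 − 1). The pickup currents, time multipliers, and 0.3 s grading margin below are illustrative values, not settings from the Kennedy Space Center study.

```python
def relay_time(current, pickup, tms, k=0.14, alpha=0.02):
    """IEC 60255 standard-inverse operating time (seconds)."""
    ratio = current / pickup
    if ratio <= 1.0:
        return float('inf')  # below pickup: relay never operates
    return tms * k / (ratio ** alpha - 1.0)

def coordinated(fault_current, downstream, upstream, margin=0.3):
    """True if the upstream relay waits at least `margin` seconds longer
    than the downstream relay for the same fault current."""
    t_down = relay_time(fault_current, *downstream)
    t_up = relay_time(fault_current, *upstream)
    return t_up - t_down >= margin

# (pickup amps, time-multiplier setting) pairs are illustrative
print(coordinated(2000.0, downstream=(400.0, 0.1), upstream=(600.0, 0.3)))
```

A coordination program of the kind in the abstract would sweep such curves over the full fault-current range rather than checking a single point.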

  20. Optimization of the Automated Spray Layer-by-Layer Technique for Thin Film Deposition

    Science.gov (United States)

    2010-06-01

    air-pumped spray-paint cans to fully automated systems using high-pressure gas. This work uses the automated spray system previously... Spray solutions were delivered by ultra-high-purity nitrogen gas (AirGas) regulated to 25 psi, except when examining air pressure effects. The PAH solution... polyelectrolyte solution feed tube; the resulting Venturi effect causes the liquid solution to be drawn up into the airbrush nozzle, where it is

  1. Effects of an Automated Maintenance Management System on organizational communication

    International Nuclear Information System (INIS)

    Bauman, M.B.; VanCott, H.P.

    1988-01-01

    The primary purpose of the project was to evaluate the effectiveness of two techniques for improving organizational communication: (1) an Automated Maintenance Management System (AMMS) and (2) Interdepartmental Coordination Meetings. Additional objectives concerned the preparation of functional requirements for an AMMS, and training modules to improve group communication skills. Four nuclear power plants participated in the evaluation. Two plants installed AMMSs, one plant instituted interdepartmental job coordination meetings, and the fourth plant served as a control for the evaluation. Questionnaires and interviews were used to collect evaluative data. The evaluation focused on five communication or information criteria: timeliness, redundancy, withholding or gatekeeping, feedback, and accuracy/amount

  2. Hooked on a New Technology: The Automation Pioneers in Post-War Norway

    Directory of Open Access Journals (Sweden)

    Stig Kvaal

    2009-07-01

    Full Text Available This paper presents the initial activities in servo engineering in Norway originating in the early 1950s based on contacts at the Massachusetts Institute of Technology. The activities were initiated by a small group of servo enthusiasts who, through the Feedback Control Committee in the research council, managed to coordinate national activities and establish strong research groups in Trondheim, Bergen and Oslo. After the initial phase of establishing the research groups, there was a continuous strong focus on connections with industry and industrial applications. In the mid-1960s the committee was strengthened and became the Automation and Data Processing Committee. The initial group of automation pioneers has left a lasting impact on the academic and industrial fields of servo engineering and automation in Norway.

  3. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
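
The Passing-Bablok regression used here for method comparison can be sketched as follows: compute the slopes of all point pairs, discard slopes of exactly −1, and take a median shifted by the count of slopes below −1; the intercept is the median residual. This minimal sketch uses the arithmetic mean for the even-length median and omits the confidence intervals and bias percentages reported in the study.

```python
import statistics

def passing_bablok(x, y):
    """Passing-Bablok regression (sketch): slope from a shifted median
    of all pairwise slopes, robust to outliers in both variables."""
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dx, dy = x[j] - x[i], y[j] - y[i]
            if dx == 0:
                continue          # vertical pair: slope undefined, skipped
            s = dy / dx
            if s != -1.0:         # slopes of exactly -1 are discarded by convention
                slopes.append(s)
    slopes.sort()
    k = sum(1 for s in slopes if s < -1.0)   # offset correcting for negative slopes
    m = len(slopes)
    if m % 2 == 1:
        b = slopes[(m - 1) // 2 + k]
    else:
        b = 0.5 * (slopes[m // 2 - 1 + k] + slopes[m // 2 + k])
    a = statistics.median(yi - b * xi for xi, yi in zip(x, y))
    return b, a

# two centrifugation methods measuring the same analyte (illustrative numbers)
slope, intercept = passing_bablok([1, 2, 3, 4, 5], [3, 5, 7, 9, 11])
```

For perfectly agreeing methods the slope approaches 1 and the intercept 0, which is the agreement criterion the study examines.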

  4. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    Science.gov (United States)

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.

  5. Coordination Analysis Using Global Structural Constraints and Alignment-based Local Features

    Science.gov (United States)

    Hara, Kazuo; Shimbo, Masashi; Matsumoto, Yuji

    We propose a hybrid approach to coordinate structure analysis that combines a simple grammar to ensure consistent global structure of coordinations in a sentence, and features based on sequence alignment to capture local symmetry of conjuncts. The weight of the alignment-based features, which in turn determines the score of coordinate structures, is optimized by perceptron training on a given corpus. A bottom-up chart parsing algorithm efficiently finds the best scoring structure, taking both nested and non-overlapping flat coordinations into account. We demonstrate that our approach outperforms existing parsers in coordination scope detection on the Genia corpus.
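
The alignment-based features rest on scoring the symmetry of two candidate conjunct token sequences. A minimal global-alignment (Needleman-Wunsch) scorer, with illustrative match/mismatch/gap weights rather than the learned feature weights of the paper, might look like:

```python
def alignment_score(a, b, match=1, mismatch=-1, gap=-1):
    """Global alignment score between two token sequences:
    high scores indicate symmetric (likely coordinated) conjuncts."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap            # align prefix of a against nothing
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # (mis)match
                           dp[i - 1][j] + gap,       # gap in b
                           dp[i][j - 1] + gap)       # gap in a
    return dp[n][m]

print(alignment_score("the red car".split(), "the blue car".split()))  # prints 1
```

In the paper's setting the substitution costs would be features (word, part-of-speech) weighted by perceptron training rather than these fixed constants.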

  6. Automated high-dose rate brachytherapy treatment planning for a single-channel vaginal cylinder applicator

    Science.gov (United States)

    Zhou, Yuhong; Klages, Peter; Tan, Jun; Chi, Yujie; Stojadinovic, Strahinja; Yang, Ming; Hrycushko, Brian; Medin, Paul; Pompos, Arnold; Jiang, Steve; Albuquerque, Kevin; Jia, Xun

    2017-06-01

    High dose rate (HDR) brachytherapy treatment planning is conventionally performed manually and/or with aids of preplanned templates. In general, the standard of care would be elevated by conducting an automated process to improve treatment planning efficiency, eliminate human error, and reduce plan quality variations. Thus, our group is developing AutoBrachy, an automated HDR brachytherapy planning suite of modules used to augment a clinical treatment planning system. This paper describes our proof-of-concept module for vaginal cylinder HDR planning that has been fully developed. After a patient CT scan is acquired, the cylinder applicator is automatically segmented using image-processing techniques. The target CTV is generated based on physician-specified treatment depth and length. Locations of the dose calculation point, apex point and vaginal surface point, as well as the central applicator channel coordinates, and the corresponding dwell positions are determined according to their geometric relationship with the applicator and written to a structure file. Dwell times are computed through iterative quadratic optimization techniques. The planning information is then transferred to the treatment planning system through a DICOM-RT interface. The entire process was tested for nine patients. The AutoBrachy cylindrical applicator module was able to generate treatment plans for these cases with clinical grade quality. Computation times varied between 1 and 3 min on an Intel Xeon CPU E3-1226 v3 processor. All geometric components in the automated treatment plans were generated accurately. The applicator channel tip positions agreed with the manually identified positions with submillimeter deviations and the channel orientations between the plans agreed within less than 1 degree. The automatically generated plans obtained clinically acceptable quality.
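
The dwell-time step can be viewed as a constrained quadratic program: minimize ||At − d||² subject to t ≥ 0, where A maps unit dwell times to dose at the calculation points and d is the prescribed dose. A projected-gradient sketch (not the AutoBrachy implementation; the kernel values and step size are illustrative):

```python
def optimize_dwell_times(A, target, iters=5000, step=0.01):
    """Solve min ||A t - target||^2 with t >= 0 by projected gradient descent.

    A[i][j]: dose at calculation point i per unit dwell time at position j.
    """
    n_pts, n_dwell = len(A), len(A[0])
    t = [0.0] * n_dwell
    for _ in range(iters):
        # residual r = A t - target
        r = [sum(A[i][j] * t[j] for j in range(n_dwell)) - target[i]
             for i in range(n_pts)]
        # gradient g = 2 A^T r, followed by projection onto t >= 0
        for j in range(n_dwell):
            g = 2.0 * sum(A[i][j] * r[i] for i in range(n_pts))
            t[j] = max(0.0, t[j] - step * g)
    return t

# toy kernel: two dwell positions, two dose calculation points
A = [[1.0, 0.2],
     [0.2, 1.0]]
t = optimize_dwell_times(A, [1.0, 1.0])
```

A clinical kernel would be built from the source's dose-rate falloff around each dwell position along the applicator channel.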

  7. Automated Core Design

    International Nuclear Information System (INIS)

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2005-01-01

    Multistate searching methods are a subfield of distributed artificial intelligence that aims to provide both principles for construction of complex systems involving multiple states and mechanisms for coordination of independent agents' actions. This paper proposes a multistate searching algorithm with reinforcement learning for the automatic core design of a boiling water reactor. The characteristics of this algorithm are that the coupling structure and the coupling operation suitable for the assigned problem are assumed and an optimal solution is obtained by mutual interference in multistate transitions using multiagents. Calculations in an actual plant confirmed that the proposed algorithm increased the convergence ability of the optimization process

  8. Cell-Detection Technique for Automated Patch Clamping

    Science.gov (United States)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

    A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: A standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed.
A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image
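
Several of the listed cell-feature metrics follow directly from a cell's contour; for instance, roundness is commonly defined as 4πA/P², which equals 1 for a perfect circle and falls toward 0 for elongated or irregular shapes. A sketch using the shoelace formula for area (an illustrative definition, not the NASA algorithm itself):

```python
import math

def cell_metrics(contour):
    """Area (shoelace formula), perimeter, and roundness = 4*pi*A / P^2
    for a closed polygonal cell contour given as (x, y) vertices."""
    n = len(contour)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x0, y0 = contour[i]
        x1, y1 = contour[(i + 1) % n]   # wrap around to close the polygon
        area += x0 * y1 - x1 * y0
        perim += math.hypot(x1 - x0, y1 - y0)
    area = abs(area) / 2.0
    roundness = 4.0 * math.pi * area / (perim ** 2) if perim else 0.0
    return area, perim, roundness

# unit square: area 1, perimeter 4, roundness pi/4 (a circle would score 1.0)
print(cell_metrics([(0, 0), (1, 0), (1, 1), (0, 1)]))
```

A decision rule for "good" patch-clamping candidates would then threshold such metrics, e.g. requiring high roundness and full inclusion in the field of view.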

  9. Particle swarm optimization: an alternative in marine propeller optimization?

    Science.gov (United States)

    Vesting, F.; Bensow, R. E.

    2018-01-01

    This article deals with improving and evaluating the performance of two evolutionary algorithm approaches for automated engineering design optimization. Here a marine propeller design with constraints on cavitation nuisance is the intended application. For this purpose, the particle swarm optimization (PSO) algorithm is adapted for multi-objective optimization and constraint handling for use in propeller design. Three PSO algorithms are developed and tested for the optimization of four commercial propeller designs for different ship types. The results are evaluated by interrogating the generation medians and the Pareto front development. The same propellers are also optimized utilizing the well established NSGA-II genetic algorithm to provide benchmark results. The authors' PSO algorithms deliver comparable results to NSGA-II, but converge earlier and enhance the solution in terms of constraints violation.
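
For reference, a minimal single-objective PSO (the canonical velocity update with inertia weight w and cognitive/social weights c1, c2) is sketched below on a sphere test function; the article's multi-objective, constraint-handling variants extend this basic loop. All parameter values are illustrative.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal global-best PSO (sketch); returns the best position found."""
    rng = random.Random(0)  # fixed seed for reproducibility
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    gbest_f = min(pbest_f)
    gbest = pbest[pbest_f.index(gbest_f)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest

best = pso(sphere, dim=3)
```

A propeller-design version would replace `sphere` with the hydrodynamic objective and add the cavitation constraint handling and Pareto archiving discussed in the article.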

  10. The Value of Information in Automated Negotiation: A Decision Model for Eliciting User Preferences

    NARCIS (Netherlands)

    T. Baarslag (Tim); M. Kaisers (Michael)

    2017-01-01

    Consider an agent that can autonomously negotiate and coordinate with others in our stead, to reach outcomes and agreements in our interest. Such automated negotiation agents are already common practice in areas such as high frequency trading, and are now finding applications in domains

  11. Automation facilities for agricultural machinery control

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2017-01-01

    Full Text Available The possibility of using automation equipment for agricultural machinery control is investigated. The authors propose solutions for creating a centralized, unified automated information system for managing mobile units. In accordance with modern requirements, this system should be open and integrated into the overall control schema of the agricultural enterprise. Standard hardware, software, and communication features should be realized in monitoring and control tasks, so the schema should be built from unified modules and conform to Russian standards. A complex, multivariate, unified automated control system for different agricultural objects, based on block-and-modular construction, should satisfy the following principles: high reliability, simplicity of service, low operating expenses, a short payback period connected to increased productivity, reduced losses during harvesting, post-harvest processing and storage, and improved energy indices. Control of technological processes in agricultural production is generally exercised with feedback; an example without feedback is program control of the temperature in a store operating in cooling mode. Feedback in the control of technological processes in agricultural production allows the rational distribution of functions in man-distributed systems to be solved optimally and intelligent ergonomic interfaces, consistent with the professional perceptions of decision-makers, to be formed. The negative feedback created by the control unit automatically holds a quality index of the technological process at the set level. The quantitative analysis of a production situation rests on the deeply formalized basis of computer facilities, which promotes making optimal decisions. Introduction of an automated information control system increases labor productivity by 40 percent and reduces energy costs by 25 percent.
Improvement of quality of the executed technological

  12. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  13. Coupled Low-thrust Trajectory and System Optimization via Multi-Objective Hybrid Optimal Control

    Science.gov (United States)

    Vavrina, Matthew A.; Englander, Jacob Aldo; Ghosh, Alexander R.

    2015-01-01

    The optimization of low-thrust trajectories is tightly coupled with the spacecraft hardware. Trading trajectory characteristics with system parameters to identify viable solutions and determine mission sensitivities across discrete hardware configurations is labor intensive. Local independent optimization runs can sample the design space, but a global exploration that resolves the relationships between the system variables across multiple objectives enables a full mapping of the optimal solution space. A multi-objective, hybrid optimal control algorithm is formulated using a multi-objective genetic algorithm as an outer loop systems optimizer around a global trajectory optimizer. The coupled problem is solved simultaneously to generate Pareto-optimal solutions in a single execution. The automated approach is demonstrated on two boulder return missions.

  14. Development of methodologies for optimization of surveillance testing and maintenance of safety related equipment at NPPs. Report of a research coordination meeting. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    This report summarizes the results of the first meeting of the Coordinated Research Programme (CRP) on Development of Methodologies for Optimization of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs, held at the Agency Headquarters in Vienna, from 16 to 20 December 1996. The purpose of this Research Coordination Meeting (RCM) was that all Chief Scientific Investigators of the groups participating in the CRP presented an outline of their proposed research projects. Additionally, the participants discussed the objective, scope, work plan and information channels of the CRP in detail. Based on these presentations and discussions, the entire project plan was updated, completed and included in this report. This report represents a common agreed project work plan for the CRP. Refs, figs, tabs.

  15. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the focal points of the publishing houses of the Odessa National Academy of Food Technologies (ONAFT) and the automation of their business processes, and constructs a complex of business-process models for publishing scientific journals. The organizational structure of the Coordinating Centre of Scientific Journals' Publishing of ONAFT is analyzed and a model of it created. Process-model simulation was conducted in the eEPC and BPMN business-process notations. Database design, creation of the file structure, and creation of the AIS interface were also carried out, and interaction with a webcam was implemented. Based on the justification of the feasibility of the software development and the performance indicators summarized in a radar chart, it is safe to say that the automated mode is far more efficient than the manual one. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the Academy's ratings at the global level and enhance its image and credibility.

  16. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    Science.gov (United States)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential resources offering water and other mineral resources for longterm human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  17. Optimal Airport Surface Traffic Planning Using Mixed-Integer Linear Programming

    Directory of Open Access Journals (Sweden)

    P. C. Roling

    2008-01-01

    Full Text Available We describe an ongoing research effort pertaining to the development of a surface traffic automation system that will help controllers to better coordinate surface traffic movements related to arrival and departure traffic. More specifically, we describe the concept for a taxi-planning support tool that aims to optimize the routing and scheduling of airport surface traffic in such a way as to deconflict the taxi plans while optimizing delay, total taxi-time, or some other airport efficiency metric. Certain input parameters related to resource demand, such as the expected landing times and the expected pushback times, are rather difficult to predict accurately. Due to uncertainty in the input data driving the taxi-planning process, the taxi-planning tool is designed such that it produces solutions that are robust to uncertainty. The taxi-planning concept presented herein, which is based on mixed-integer linear programming, is designed such that it is able to adapt to perturbations in these input conditions, as well as to account for failure in the actual execution of surface trajectories. The capabilities of the tool are illustrated in a simple hypothetical airport.

  18. Automated Tracking of Cell Migration with Rapid Data Analysis.

    Science.gov (United States)

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc.
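
The statistical outputs named above follow directly from the tracked coordinates: speed is path length over elapsed time, persistence (the directionality ratio) is net displacement over path length, and the x-direction FMI is net x-displacement over path length. A sketch for a single track, using these common definitions rather than the GUI's exact formulas:

```python
import math

def migration_metrics(track, dt=1.0):
    """Speed, path length, net displacement, persistence, and x-FMI
    for one cell track given as a list of (x, y) positions dt apart."""
    path = 0.0
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        path += math.hypot(x1 - x0, y1 - y0)
    (xs, ys), (xe, ye) = track[0], track[-1]
    disp = math.hypot(xe - xs, ye - ys)
    duration = dt * (len(track) - 1)
    return {
        'speed': path / duration,                     # mean speed along the path
        'path_length': path,
        'displacement': disp,
        'persistence': disp / path if path else 0.0,  # 1.0 = perfectly straight
        'fmi_x': (xe - xs) / path if path else 0.0,   # forward migration index
    }

m = migration_metrics([(0, 0), (1, 0), (2, 0), (3, 0)])
```

Batch analysis would apply this per track and aggregate the distributions across conditions.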

  19. Method and system for assigning a confidence metric for automated determination of optic disc location

    Science.gov (United States)

    Karnowski, Thomas P [Knoxville, TN; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya [Knoxville, TN; Chaum, Edward [Memphis, TN

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
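
One plausible reading of the accuracy parameter is the disagreement between the two independently obtained coordinate sets. A minimal sketch, assuming Euclidean pixel distance and an illustrative cutoff (the patent's calibrated risk cut-off value is not given here):

```python
import math

def optic_disc_confidence(loc_a, loc_b, cutoff=25.0):
    """Assign 'high'/'low' confidence from the disagreement of two detectors.

    loc_a, loc_b: (x, y) optic-disc estimates from two independent image
    analysis techniques; the accuracy parameter here is simply their
    Euclidean distance in pixels, compared against a risk cut-off value.
    """
    accuracy = math.dist(loc_a, loc_b)
    return 'high' if accuracy < cutoff else 'low'

print(optic_disc_confidence((120, 88), (126, 92)))  # prints 'high'
```

Images flagged 'low' would be routed to manual review, keeping the automated pipeline's misdiagnosis risk below the acceptable threshold.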

  20. Applications of automatic differentiation in topology optimization

    DEFF Research Database (Denmark)

    Nørgaard, Sebastian A.; Sagebaum, Max; Gauger, Nicolas R.

    2017-01-01

    The goal of this article is to demonstrate the applicability and to discuss the advantages and disadvantages of automatic differentiation in topology optimization. The technique makes it possible to wholly or partially automate the evaluation of derivatives for optimization problems and is demons...

  1. Automated dual-wavelength spectrophotometer optimized for phytochrome assay

    International Nuclear Information System (INIS)

    Pratt, L.H.; Wampler, J.E.; Rich, E.S. Jr.

    1985-01-01

    A microcomputer-controlled dual-wavelength spectrophotometer suitable for automated phytochrome assay is described. The optomechanical unit provides for sequential irradiation of the sample by the two measuring wavelengths with intervening dark intervals and for actinic irradiation to interconvert phytochrome between its two forms. Photomultiplier current is amplified, converted to a digital value and transferred into the computer using a custom-designed IEEE-488 bus interface. The microcomputer calculates mathematically both absorbance and absorbance difference values with dynamic correction for photomultiplier dark current. In addition, the computer controls the operating parameters of the spectrophotometer via a separate interface. These parameters include control of the durations of measuring and actinic irradiation intervals and their sequence. 14 references, 4 figures

  2. Coordination under the Shadow of Career Concerns

    DEFF Research Database (Denmark)

    Koch, Alexander; Morgenstern, Albrecht

    To innovate, firms require their employees to develop novel ideas and to coordinate with each other to turn these ideas into products, services or business strategies. Because the quality of implemented designs that employees are associated with affects their labor market opportunities, career concerns arise that can both be ‘good’ (enhancing incentives for effort in developing ideas) and ‘bad’ (preventing voluntary coordination). Depending on the strength of career concerns, either group-based incentives or team production are optimal. This finding provides a possible link between the increased...

  3. The role of automation and artificial intelligence

    Science.gov (United States)

    Schappell, R. T.

    1983-07-01

    Consideration is given to emerging technologies that are not currently in common use, yet will be mature enough for implementation in a space station. Artificial intelligence (AI) will permit more autonomous operation and improve the man-machine interfaces. Technology goals include the development of expert systems, a natural language query system, automated planning systems, and AI image understanding systems. Intelligent robots and teleoperators will be needed, together with improved sensory systems for the robotics, housekeeping, vehicle control, and spacecraft housekeeping systems. Finally, NASA is developing the ROBSIM computer program to evaluate level of automation, perform parametric studies and error analyses, optimize trajectories and control systems, and assess AI technology.

  4. Optimal routing in an automated storage/retrieval system with dedicated storage

    NARCIS (Netherlands)

    Berg, van den J.P.; Gademann, A.J.R.M.

    1999-01-01

    We address the sequencing of requests in an automated storage/retrieval system with dedicated storage. We consider the block sequencing approach, where a set of storage and retrieval requests is given beforehand and no new requests come in during operation. The objective for this static problem is

  5. Coordinated Active Power Dispatch for a Microgrid via Distributed Lambda Iteration

    DEFF Research Database (Denmark)

    Hu, Jianqiang; Z. Q. Chen, Michael; Cao, Jinde

    2017-01-01

    A novel distributed optimal dispatch algorithm is proposed for coordinating the operation of multiple micro units in a microgrid, which has incorporated the distributed consensus algorithm in multi-agent systems and the λ-iteration optimization algorithm in economic dispatch of power systems. Spec...
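
The classical λ-iteration that this builds on dispatches each unit at equal incremental cost: with quadratic costs C_i(P) = a_i + b_i·P + c_i·P², each unit runs at P_i = (λ − b_i)/(2c_i) clipped to its limits, and λ is adjusted until total output meets demand. A centralized bisection sketch follows (the paper's contribution is the distributed, consensus-based version, which is not shown here); the unit parameters are illustrative.

```python
def lambda_dispatch(units, demand, tol=1e-6):
    """Equal-incremental-cost (lambda-iteration) dispatch for quadratic costs.

    units: list of (b, c, pmin, pmax) with cost C(P) = a + b*P + c*P^2.
    Bisects lambda until total clipped output meets demand.
    """
    def output(lam):
        total = 0.0
        for b, c, pmin, pmax in units:
            p = (lam - b) / (2.0 * c)         # unit output where marginal cost = lambda
            total += min(max(p, pmin), pmax)  # respect unit limits
        return total

    lo, hi = 0.0, 1000.0                      # bracket assumed wide enough
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if output(mid) < demand:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    dispatch = [min(max((lam - b) / (2.0 * c), pmin), pmax)
                for b, c, pmin, pmax in units]
    return lam, dispatch

lam, p = lambda_dispatch([(2.0, 0.01, 0.0, 300.0),
                          (3.0, 0.02, 0.0, 300.0)], demand=400.0)
```

In the distributed variant each micro unit would update its own λ estimate by exchanging information only with neighbors, converging to the same equal-incremental-cost solution.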

  6. Efficient Messaging through Cluster Coordinators in Decentralized Controlled Material Flow Systems

    Directory of Open Access Journals (Sweden)

    Lieberoth-Leden Christian

    2016-01-01

    Full Text Available The modularization of hardware and software is one approach to handling the demand for increasing flexibility and changeability of automated material flow systems. A control that is distributed across several different hardware controllers creates a great demand for coordination between the modules when planning transports, for example, especially if the modules' executing tasks are mutually dependent. Short-term changes in planning often trigger a rescheduling chain reaction, which causes a high communication load in the system. In the presented approach, module clusters with a centralized coordinator are automatically formed out of multiple modules and substitutionally take over the surrounding communication for the modules. As a result, they minimize the number of exchanged messages by focusing on the essential information.

  7. Facilitating the BIM coordinator and empowering the suppliers with automated data compliance checking

    NARCIS (Netherlands)

    van Berlo, Léon A. H. M.; Papadonikolaki, E.; Christodoulou, S.E.; Scherer, R.

    2016-01-01

    In projects with Building Information Modelling (BIM), the collaboration among the various actors is a very intricate and intensive process. The various suppliers and engineers provide their input in Industry Foundation Classes (IFC), which in turn is used for design coordination. However, the IFCs

  8. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    Science.gov (United States)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  9. Optimal number of stimulation contacts for coordinated reset neuromodulation

    Directory of Open Access Journals (Sweden)

    Borys eLysyansky

    2013-07-01

    Full Text Available In this computational study we investigate coordinated reset (CR) neuromodulation designed for an effective control of synchronization by multi-site stimulation of neuronal target populations. This method was suggested to effectively counteract pathological neuronal synchrony characteristic of several neurological disorders. We study how many stimulation sites are required for optimal CR-induced desynchronization. We found that a moderate increase of the number of stimulation sites may significantly prolong the post-stimulation desynchronized transient after the stimulation is completely switched off. This can, in turn, reduce the amount of the administered stimulation current for the intermittent ON-OFF CR stimulation protocol, where time intervals with stimulation ON are recurrently followed by time intervals with stimulation OFF. In addition, we found that the optimal number of stimulation sites essentially depends on how strongly the administered current decays within the neuronal tissue with increasing distance from the stimulation site. In particular, for a broad spatial stimulation profile, i.e., for a weak spatial decay rate of the stimulation current, CR stimulation can optimally be delivered via a small number of stimulation sites. Our findings may contribute to an optimization of therapeutic applications of CR neuromodulation.
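As a toy illustration of the CR principle (not the authors' neuron model), a Kuramoto-type oscillator population can be driven toward a number of phase-shifted clusters by site-specific inputs, which lowers the synchronization order parameter R. All dynamics and parameter values below are invented for illustration.

```python
import cmath
import math
import random

def simulate(n=200, m_sites=4, coupling=0.8, stim=3.0, dt=0.01, steps=2000):
    """Kuramoto oscillators; each receives input pulling it to its site's phase."""
    rng = random.Random(0)
    phases = [rng.uniform(-0.3, 0.3) for _ in range(n)]  # start near-synchronized
    freqs = [rng.gauss(1.0, 0.05) for _ in range(n)]
    for _ in range(steps):
        z = sum(cmath.exp(1j * p) for p in phases) / n   # mean field R * e^(i*psi)
        psi, big_r = cmath.phase(z), abs(z)
        phases = [
            p + dt * (freqs[i]
                      + coupling * big_r * math.sin(psi - p)
                      # CR term: pull oscillator i toward its site's target phase
                      + stim * math.sin(2 * math.pi * (i * m_sites // n) / m_sites - p))
            for i, p in enumerate(phases)
        ]
    return abs(sum(cmath.exp(1j * p) for p in phases) / n)

print(f"R without CR: {simulate(stim=0.0):.2f}  R with CR: {simulate():.2f}")
```

With the stimulation off, the coupled population stays synchronized (R near 1); with the four-site input on, the population splits into four roughly equal clusters and R drops well below 1.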

  10. Combined Optimal Control System for excavator electric drive

    Science.gov (United States)

    Kurochkin, N. S.; Kochetkov, V. P.; Platonova, E. V.; Glushkin, E. Y.; Dulesov, A. S.

    2018-03-01

    The article presents a synthesis of combined optimal control algorithms for the AC drive of the rotation mechanism of an excavator. The synthesis consists in regulating the external coordinates based on the theory of optimal systems, and correcting the internal coordinates of the electric drive using the "technical optimum" method. The research shows the advantage of combined optimal control systems for the electric rotary drive over classical systems of subordinate regulation. The paper presents a method for selecting the optimality-criterion coefficients that finds the intersection of the ranges of permissible values of the coordinates of the control object. The system can be tuned by choosing the optimality-criterion coefficients, which allows one to select the required characteristics of the drive: the dynamic moment (M) and the duration of the transient process (tpp). Due to the use of combined optimal control systems, it was possible to significantly reduce the maximum value of the dynamic moment (M) and, at the same time, to reduce the transient time (tpp).

  11. Automated batch emulsion copolymerization of styrene and butyl acrylate

    NARCIS (Netherlands)

    Mballa Mballa, M.A.; Schubert, U.S.; Heuts, J.P.A.; Herk, van A.M.

    2011-01-01

    This article describes a method for carrying out emulsion copolymerization using an automated synthesizer. For this purpose, batch emulsion copolymerizations of styrene and butyl acrylate were investigated. The optimization of the polymerization system required tuning the liquid transfer method,

  12. Hybrid Architecture for Coordination of AGVs in FMS

    Directory of Open Access Journals (Sweden)

    Eduardo G. Hernandez-Martinez

    2014-03-01

    Full Text Available This paper presents a hybrid control architecture that coordinates the motion of groups of automated guided vehicles in flexible manufacturing systems. The high-level control is based on a Petri net model, using the industrial standard ISA-95, obtaining a task-based coordination of equipment and storage considering process restrictions, logical precedences, shared resources and the assignment of robots to move workpieces individually or in subgroups. On the other hand, in the low-level control, three basic control laws are designed for unicycle-type robots in order to achieve desired formation patterns and marching behaviours, avoiding inter-robot collisions. The control scheme combines the task assignment for the robots obtained from the discrete-event model and the implementation of formation and marching continuous control laws applied to the motion of the mobile robots. The hybrid architecture is implemented and validated for the case of a flexible manufacturing system and four mobile robots using a virtual reality platform.

  13. Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH

    International Nuclear Information System (INIS)

    Volk, Jochen; Herrmann, Torsten; Wuethrich, Kurt

    2008-01-01

    MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness
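The memetic recipe that MATCH uses, global population-based search with local optimization applied to each candidate, can be sketched generically. The toy OneMax objective below (count the 1-bits) stands in for the NMR assignment score; everything here is an illustrative assumption, not the MATCH implementation.

```python
import random

def local_search(bits, score):
    # Greedy single-bit improvement: the "local optimization" ingredient.
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            flipped = bits[:i] + [1 - bits[i]] + bits[i + 1:]
            if score(flipped) > score(bits):
                bits, improved = flipped, True
    return bits

def memetic(score, n_bits=20, pop_size=10, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        a, b = rng.sample(pop, 2)            # pick two parents
        cut = rng.randrange(1, n_bits)
        child = a[:cut] + b[cut:]            # crossover: global search
        if rng.random() < 0.3:               # occasional mutation
            j = rng.randrange(n_bits)
            child[j] = 1 - child[j]
        child = local_search(child, score)   # memetic local refinement
        pop.sort(key=score)
        pop[0] = child                       # replace the worst individual
    return max(pop, key=score)

best = memetic(score=sum)   # OneMax: the score is just the number of 1-bits
print(sum(best))
```

The interplay shown here is the point of the abstract: crossover and mutation explore globally, while the embedded hill climber polishes each offspring.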

  14. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted for SPE on-line. The only manual steps in the entire process were de-capping of the tubes, and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and those obtained with automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples) the typical overall total turnaround time was less than 6h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.

  15. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    Science.gov (United States)

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.

  16. Automation: the competitive edge for HMOs and other alternative delivery systems.

    Science.gov (United States)

    Prussin, J A

    1987-12-01

    Until recently, many, if not most, Health Maintenance Organizations (HMOs) were not automated. Moreover, HMOs that were automated tended to be automated only on a limited basis. Recently, however, the highly competitive marketplace within which HMOs and other Alternative Delivery Systems (ADSs) exist has required that they operate at maximum effectiveness and efficiency. Given the complex nature of ADSs, the volume of transactions in ADSs, the large number of members served by ADSs, and the numerous providers who are paid at different rates and on different bases by ADSs, it is impossible for an ADS to operate effectively or efficiently, let alone show optimal performance, without a sophisticated, comprehensive automated system. Reliable automated systems designed specifically to address ADS functions such as enrollment and premium billing, finance and accounting, medical information and patient management, and marketing have recently become available at a reasonable cost.

  17. Coordination of a supply chain with consumer return under vendor-managed consignment inventory and stochastic demand

    Science.gov (United States)

    Wu, Zhihui; Chen, Dongyan; Yu, Hui

    2016-07-01

    In this paper, the coordination policy is investigated for a vendor-managed consignment inventory supply chain subject to consumer returns. The market demand is assumed to be affected by promotional effort and by the consumer return policy. The optimal consignment inventory and the optimal promotional effort level are derived under decentralized and centralized decisions. Based on the optimality conditions, a markdown allowance and promotional cost-sharing contract is investigated to coordinate the supply chain. A comparison between the two extreme policies shows that the full-refund policy dominates the no-return policy when the return cost and the positive effect of the return policy satisfy certain conditions. Finally, a numerical example illustrates the impact of the consumer return policy on the coordination contract and the optimal profit, as well as the effectiveness of the proposed supply chain decisions.
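The gap between decentralized and centralized decisions that such cost-sharing contracts are designed to close can be seen in a deterministic toy calculation (not the paper's stochastic model): a party that captures only a share of the chain's unit margin chooses a lower promotional effort. The demand function, prices, and margin split below are invented for illustration.

```python
# Toy double-marginalization example. Demand D(e) = d0 + k*e depends linearly
# on promotional effort e, and effort costs e**2 / 2, so maximizing
# margin * (d0 + k*e) - e**2 / 2 gives the first-order optimum e* = margin * k.

def optimal_effort(margin, k):
    return margin * k

p, c, k = 10.0, 4.0, 2.0                       # retail price, unit cost, demand slope
central = optimal_effort(p - c, k)             # planner sees the whole-chain margin
decentral = optimal_effort(0.5 * (p - c), k)   # one party keeps half the margin
print(central, decentral)  # 12.0 6.0
```

The decentralized party under-invests (6.0 vs 12.0 here); a contract that shares the promotional cost or adjusts the margin split can restore the centralized effort level.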

  18. Market-Based and System-Wide Fuel Cycle Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Paul Philip Hood [Univ. of Wisconsin, Madison, WI (United States); Scopatz, Anthony [Univ. of South Carolina, Columbia, SC (United States); Gidden, Matthew [Univ. of Wisconsin, Madison, WI (United States); Carlsen, Robert [Univ. of Wisconsin, Madison, WI (United States); Mouginot, Baptiste [Univ. of Wisconsin, Madison, WI (United States); Flanagan, Robert [Univ. of South Carolina, Columbia, SC (United States)

    2017-06-13

    This work introduces automated optimization into fuel cycle simulations in the Cyclus platform. This includes system-level optimizations, seeking a deployment plan that optimizes the performance over the entire transition, and market-level optimization, seeking an optimal set of material trades at each time step. These concepts were introduced in a way that preserves the flexibility of the Cyclus fuel cycle framework, one of its most important design principles.

  19. Market-Based and System-Wide Fuel Cycle Optimization

    International Nuclear Information System (INIS)

    Wilson, Paul Philip Hood; Scopatz, Anthony; Gidden, Matthew; Carlsen, Robert; Mouginot, Baptiste; Flanagan, Robert

    2017-01-01

    This work introduces automated optimization into fuel cycle simulations in the Cyclus platform. This includes system-level optimizations, seeking a deployment plan that optimizes the performance over the entire transition, and market-level optimization, seeking an optimal set of material trades at each time step. These concepts were introduced in a way that preserves the flexibility of the Cyclus fuel cycle framework, one of its most important design principles.

  20. Application of microcomputer in automating microscope measurements in nuclear emulsion viewing

    International Nuclear Information System (INIS)

    Blaho, D.

    1985-01-01

    Microcomputer system MPS 8010 is described as applied to the automation of data collection and control of a microscope. The data on the measured coordinates of the microscope are recorded on paper tape and listed on a typewriter. The microcomputer system also makes possible automatic control of the microscope position by means of stepping motors according to the value read-out of the paper tape. (author)

  1. Team play with a powerful and independent agent: operational experiences and automation surprises on the Airbus A-320

    Science.gov (United States)

    Sarter, N. B.; Woods, D. D.

    1997-01-01

    Research and operational experience have shown that one of the major problems with pilot-automation interaction is a lack of mode awareness (i.e., awareness of the current and future status and behavior of the automation). As a result, pilots sometimes experience so-called automation surprises when the automation takes an unexpected action or fails to behave as anticipated. A lack of mode awareness and automation surprises can be viewed as symptoms of a mismatch between human and machine properties and capabilities. Changes in automation design can therefore be expected to affect the likelihood and nature of problems encountered by pilots. Previous studies have focused exclusively on early generation "glass cockpit" aircraft that were designed based on a similar automation philosophy. To find out whether similar difficulties with maintaining mode awareness are encountered on more advanced aircraft, a corpus of automation surprises was gathered from pilots of the Airbus A-320, an aircraft characterized by high levels of autonomy, authority, and complexity. To understand the underlying reasons for reported breakdowns in human-automation coordination, we also asked pilots about their monitoring strategies and their experiences with and attitudes toward the unique design of flight controls on this aircraft.

  2. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development

  3. Radius of Care in Secondary Schools in the Midwest: Are Automated External Defibrillators Sufficiently Accessible to Enable Optimal Patient Care?

    Science.gov (United States)

    Osterman, Michael; Claiborne, Tina; Liberi, Victor

    2018-04-25

    Sudden cardiac arrest is the leading cause of death among young athletes. According to the American Heart Association, an automated external defibrillator (AED) should be available within a 1- to 1.5-minute brisk walk from the patient for the highest chance of survival. Secondary school personnel have reported a lack of understanding about the proper number and placement of AEDs for optimal patient care. The objective was to determine whether fixed AEDs were located within a 1- to 1.5-minute timeframe from any location on secondary school property (ie, the radius of care), in a cross-sectional study of public and private secondary schools in northwest Ohio and southeast Michigan. Thirty schools (24 public, 6 private) volunteered. Global positioning system coordinates were used to survey the entire school properties and determine AED locations. From each AED location, the radius of care was calculated for 3 retrieval speeds: walking, jogging, and driving a utility vehicle. Data were analyzed to expose any property area that fell outside the radius of care. Public schools (37.1% ± 11.0%) possessed more property outside the radius of care than did private schools (23.8% ± 8.0%; F1,28 = 8.35, P = .01). After accounting for retrieval speed, we still observed differences between school types when personnel would need to walk or jog to retrieve an AED (F1.48,41.35 = 4.99, P = .02). The percentages of school property outside the radius of care for public and private schools were 72.6% and 56.3%, respectively, when walking and 34.4% and 12.2%, respectively, when jogging. Only 4.2% of the public and none of the private schools had property outside the radius of care when driving a utility vehicle. Schools should strategically place AEDs to decrease the percentage of property area outside the radius of care. In some cases, placement in a centralized location that is publicly accessible may be more important than the overall number of AEDs on site.
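The radius-of-care idea reduces to a simple geometric calculation: the reachable radius is retrieval speed times the allowed time, and coverage is the fraction of the property inside that circle. The sketch below grids a rectangular property and checks each point against one AED location; the retrieval speeds and property dimensions are illustrative assumptions, not the study's values.

```python
import math

def coverage(width, height, aed_xy, speed_mps, seconds=90.0, step=5.0):
    """Fraction of a width x height property within speed*seconds of the AED."""
    radius = speed_mps * seconds
    inside = total = 0
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            total += 1
            if math.hypot(x - aed_xy[0], y - aed_xy[1]) <= radius:
                inside += 1
            x += step
        y += step
    return inside / total

speeds = {"walk": 1.4, "jog": 2.7, "vehicle": 6.0}   # m/s, assumed values
for mode, v in speeds.items():
    pct = coverage(400.0, 300.0, aed_xy=(200.0, 150.0), speed_mps=v)
    print(f"{mode}: {pct:.0%} of the property within the radius of care")
```

On this toy 400 m x 300 m property with a centrally placed AED, a utility vehicle covers everything while a brisk walk leaves much of the grounds outside the radius, mirroring the study's qualitative finding.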

  4. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    Science.gov (United States)

    Grane, Camilla

    2018-01-01

    Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, intended to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Optimization of Control Points Number at Coordinate Measurements based on the Monte-Carlo Method

    Science.gov (United States)

    Korolev, A. A.; Kochetkov, A. V.; Zakharov, O. V.

    2018-01-01

    Improving the quality of products causes an increase in the requirements for the accuracy of the dimensions and shape of the surfaces of the workpieces. This, in turn, raises the requirements for accuracy and productivity of measuring of the workpieces. The use of coordinate measuring machines is currently the most effective measuring tool for solving similar problems. The article proposes a method for optimizing the number of control points using Monte Carlo simulation. Based on the measurement of a small sample from batches of workpieces, statistical modeling is performed, which allows one to obtain interval estimates of the measurement error. This approach is demonstrated by examples of applications for flatness, cylindricity and sphericity. Four options of uniform and uneven arrangement of control points are considered and their comparison is given. It is revealed that when the number of control points decreases, the arithmetic mean decreases, the standard deviation of the measurement error increases and the probability of the measurement α-error increases. In general, it has been established that it is possible to repeatedly reduce the number of control points while maintaining the required measurement accuracy.
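The Monte Carlo approach described above can be sketched directly: simulate repeated measurements of a nominally flat surface with Gaussian probe noise, estimate the flatness deviation from each simulated sample, and compare the interval estimates across different numbers of control points. The noise level and point counts below are illustrative assumptions, not the paper's values.

```python
import random
import statistics

def simulate_flatness(n_points, noise_sd=0.002, trials=2000, seed=42):
    """Mean and spread of the estimated flatness deviation (mm) over trials."""
    rng = random.Random(seed)
    errors = []
    for _ in range(trials):
        # Deviations of n_points probed heights from a perfectly flat surface.
        z = [rng.gauss(0.0, noise_sd) for _ in range(n_points)]
        errors.append(max(z) - min(z))   # peak-to-valley flatness estimate
    return statistics.mean(errors), statistics.stdev(errors)

for n in (5, 10, 50, 200):
    mean, sd = simulate_flatness(n)
    print(f"{n:4d} points: mean deviation {mean:.4f} mm, sd {sd:.4f} mm")
```

Consistent with the abstract, fewer control points give a smaller arithmetic mean of the estimated deviation (the extremes are sampled less often) while the relative uncertainty of the estimate grows.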

  6. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers, which can handle noisy multi-modal functions, or local optimizers, which cannot. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin plate splines, kriging and artificial neural networks (ANNs) have been used as substitutes for reservoir simulators. This paper described the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational efforts. 15 refs., 11 figs.
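The local-optimization ingredient named in the abstract, Hooke-Jeeves pattern search, is a derivative-free method that explores coordinate moves and then takes a "pattern" step in any successful direction. The minimal sketch below uses a cheap quadratic misfit as a stand-in for the (expensive) reservoir simulator or its surrogate model; the objective and starting point are illustrative assumptions.

```python
# Minimal Hooke-Jeeves pattern search (derivative-free local optimizer).
def hooke_jeeves(f, x, step=0.5, shrink=0.5, tol=1e-6):
    def explore(base):
        # Try +/- step along each coordinate, keeping any improvement.
        pt = list(base)
        for i in range(len(pt)):
            for d in (step, -step):
                trial = pt[:]
                trial[i] += d
                if f(trial) < f(pt):
                    pt = trial
                    break
        return pt

    while step > tol:
        new = explore(x)
        if f(new) < f(x):
            candidate = [2 * n - o for n, o in zip(new, x)]  # pattern move
            x = new
            if f(candidate) < f(x):
                x = candidate
        else:
            step *= shrink        # no improvement: refine the mesh
    return x

misfit = lambda v: (v[0] - 1.2) ** 2 + (v[1] + 0.7) ** 2   # stand-in objective
opt = hooke_jeeves(misfit, [0.0, 0.0])
print([round(c, 3) for c in opt])  # close to [1.2, -0.7]
```

In the surrogate-assisted workflow the paper describes, `misfit` would be the trained ANN's prediction of the history-matching error rather than a direct simulator call.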

  7. Recycling production designs: the value of coordination and flexibility in aluminum recycling operations

    Science.gov (United States)

    Brommer, Tracey H.

    The growing motivation for aluminum recycling has prompted interest in recycling alternative and more challenging secondary materials. The nature of these alternative secondary materials necessitates the development of an intermediate recycling facility that can reprocess the secondary materials into a liquid product. Two downstream aluminum remelters will incorporate the liquid products into their aluminum alloy production schedules. Energy and environmental benefits result from delivering the products as liquid, but coordination challenges persist because of the energy cost to maintain the liquid. Further coordination challenges result from the necessity to establish a long term recycling production plan in the presence of long term downstream aluminum remelter production uncertainty and inherent variation in the daily order schedule of the downstream aluminum remelters. In this context a fundamental question arises: considering the metallurgical complexities of dross reprocessing, what is the value of operating a coordinated set of by-product reprocessing plants and remelting cast houses? A methodology is presented to calculate the optimal recycling center production parameters including 1) the number of recycled products, 2) the volume of recycled products, 3) the allocation of recycled materials across recycled products, 4) the allocation of recycled products across finished alloys, and 5) the level of flexibility for the recycling center to operate. The methods implemented include 1) an optimization model to describe the long term operations of the recycling center, 2) an uncertainty simulation tool, 3) a simulation optimization method, and 4) a dynamic simulation tool with four embedded daily production optimization models of varying degrees of flexibility. This methodology is used to quantify the performance of several recycling center production designs of varying levels of coordination and flexibility. This analysis allowed the identification of the optimal recycling

  8. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    Energy Technology Data Exchange (ETDEWEB)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Meinel, Felix G.; Geyer, Lucas L. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Apfaltrer, Paul [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Center Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Canstein, Christian [Siemens Medical Solutions USA, Inc., Malvern, PA (United States); De Cecco, Carlo Nicola [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome ' ' Sapienza' ' - Polo Pontino, Department of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2014-02-15

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and the time required for measuring EFV (including software processing time and manual optimization time) were recorded for each method. Intraobserver and interobserver reliability was assessed on the prototype software measurements. The t test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  9. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods of different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
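The sampling-duration analysis described above can be illustrated with a small simulation. The visit rates, number of nests, and observation window below are invented, not the study's RFID data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_nests, hours = 40, 14                       # hypothetical daylight hours
base = rng.uniform(2, 12, n_nests)            # nest-specific visit rate per hour
visits = rng.poisson(base[:, None], (n_nests, hours))
daily_total = visits.sum(axis=1)

def r2_for_window(w):
    """Variance in daily totals explained by the first w hours of observation."""
    est = visits[:, :w].sum(axis=1) / w       # visits per hour from the window
    r = np.corrcoef(est, daily_total)[0, 1]
    return r * r

scores = {w: r2_for_window(w) for w in range(1, hours + 1)}
# shortest window with the most explained variation per unit observation time
best = max(scores, key=lambda w: scores[w] / w)
print(best, round(scores[best], 3))
```

Observing the full day trivially explains all variation; the interesting question, as in the paper, is how quickly the shorter windows approach it relative to their cost in observation time.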

  10. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-01-01

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way
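As a hedged illustration of the "if-then rule" machinery the abstract refers to, the sketch below implements a toy Sugeno-style fuzzy inference step that maps a normalized dose-constraint violation to a multiplicative weight update. The rules, membership functions, and constants are invented for illustration and are not taken from the paper:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fis_weight_update(violation):
    """Sugeno-style FIS: map a normalized dose violation (0..1) to a
    multiplicative update of an objective weight. Rule outputs are constants."""
    # Rule firing strengths: violation is small / medium / large
    small = tri(violation, -0.5, 0.0, 0.5)
    medium = tri(violation, 0.0, 0.5, 1.0)
    large = tri(violation, 0.5, 1.0, 1.5)
    # If small -> keep weight; if medium -> raise a little; if large -> raise a lot
    strengths = np.array([small, medium, large])
    outputs = np.array([1.0, 1.5, 3.0])
    return float(strengths @ outputs / strengths.sum())

print(fis_weight_update(0.1))  # mild violation -> modest weight increase
```

An ANFIS, as described above, would learn the membership-function parameters and rule outputs from recorded planner decisions instead of hand-coding them as here.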

  11. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    Directory of Open Access Journals (Sweden)

    Wenz Frederik

    2009-09-01

    Full Text Available Abstract Background Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. Methods The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Results Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. Conclusion The

  12. Automated prescription of oblique brain 3D magnetic resonance spectroscopic imaging.

    Science.gov (United States)

    Ozhinsky, Eugene; Vigneron, Daniel B; Chang, Susan M; Nelson, Sarah J

    2013-04-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty of prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the oblique selection box parameters, optimization of the placement of outer-volume suppression saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from six exams of three healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, data were collected from 16 exams of 8 subjects with gliomas. The technique demonstrated robust coverage of the tumor, high consistency of prescription and very good data quality within the T2 lesion. Copyright © 2012 Wiley Periodicals, Inc.

  13. Euler's fluid equations: Optimal control vs optimization

    International Nuclear Information System (INIS)

    Holm, Darryl D.

    2009-01-01

    An optimization method used in image-processing (metamorphosis) is found to imply Euler's equations for incompressible flow of an inviscid fluid, without requiring that the Lagrangian particle labels exactly follow the flow lines of the Eulerian velocity vector field. Thus, an optimal control problem and an optimization problem for incompressible ideal fluid flow both yield the same Euler fluid equations, although their Lagrangian parcel dynamics are different. This is a result of the gauge freedom in the definition of the fluid pressure for an incompressible flow, in combination with the symmetry of fluid dynamics under relabeling of their Lagrangian coordinates. Similar ideas are also illustrated for SO(N) rigid body motion.

  14. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  15. Automation technology saves 30% energy; Automatisierungstechnik spart 30% Energie ein

    Energy Technology Data Exchange (ETDEWEB)

    Klinkow, Torsten; Meyer, Michael [Wago Kontakttechnik GmbH und Co. KG, Minden (Germany)

    2013-04-01

    Systematic energy management is in greater demand than ever in order to reduce rising energy costs. What used to be a difficult puzzle of disparate technology components is today easier to solve by means of standardized and cost-effective automation technology. With its I/O system, Wago Kontakttechnik GmbH and Co. KG (Minden, Federal Republic of Germany) supplies a complete and coordinated portfolio for energy efficiency.

  16. Distributed coordination of energy storage with distributed generators

    NARCIS (Netherlands)

    Yang, Tao; Wu, Di; Stoorvogel, Antonie Arij; Stoustrup, Jakob

    2016-01-01

    With a growing emphasis on energy efficiency and system flexibility, a great effort has been made recently in developing distributed energy resources (DER), including distributed generators and energy storage systems. This paper first formulates an optimal DER coordination problem considering

  17. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool segments not only the brain but also provides accurate results for CSF, skull and other soft tissues, with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible.

  18. OPTIMIZING THE DESIGN OF THE SYSTEMS OF INFORMATION PROTECTION IN AUTOMATED INFORMATIONAL SYSTEMS OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    I. E. L'vovich

    2014-01-01

    Full Text Available Summary. Automation is now widely applied to improve the efficiency and operability of complex systems. Information has become a universal commodity in the relationships between organizations, and the question of its protection is therefore increasingly pressing. Particular importance attaches to optimal design in the creation of protection systems, which allows the best decisions to be selected from a set of alternatives with the greatest probability. This has become relevant for most industrial enterprises, since a correctly designed and deployed protection system underpins the successful functioning and competitiveness of the whole organization. The stages of work in creating an information security system for an industrial enterprise are presented. Attention is focused on one of the most important approaches to optimal design: multialternative optimization. The article considers the structure of a protection system from the point of view of several models, each of which gives insight into particular design features of the system as a whole. Special attention is paid to the problem of building the information security system itself, as it has the most complex structure. Tasks are formulated for automating each design stage of the information security system of an industrial enterprise, and an outline of each stage of the design work is given, which clarifies the internal structure of the system being created and makes the requirements for a reliable information security complex explicit. This makes it possible to mitigate risks at the early design stages and to determine the types of hardware and software complexes needed for the future system.

  19. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
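The idea of preprogrammed, fully automated responses to a demand response signal can be sketched schematically. The signal fields, level names, and shed actions below are illustrative placeholders, not the actual OpenADR 1.0 data model:

```python
from dataclasses import dataclass

# Schematic only: these field names and levels are invented for illustration,
# not the OpenADR specification's schema.
@dataclass
class DRSignal:
    event_status: str      # "none" | "far" | "near" | "active"
    operating_mode: str    # "normal" | "moderate" | "high"

# Preprogrammed strategy: fully automated response, no manual intervention
SHED_ACTIONS = {
    "normal": [],
    "moderate": ["raise_cooling_setpoint_2F", "dim_lights_20pct"],
    "high": ["raise_cooling_setpoint_4F", "dim_lights_30pct", "shed_noncritical_loads"],
}

def respond(signal: DRSignal):
    """Translate a received DR signal into control actions for the building system."""
    if signal.event_status != "active":
        return []
    return SHED_ACTIONS[signal.operating_mode]

print(respond(DRSignal("active", "moderate")))
```

The point of an interoperable signal format, as the abstract describes, is that any client implementing the data model can map incoming events to its own preprogrammed actions like this.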

  20. Optimal Airport Surface Traffic Planning Using Mixed-Integer Linear Programming

    NARCIS (Netherlands)

    Roling, P.C.; Visser, H.G.

    2008-01-01

    We describe an ongoing research effort pertaining to the development of a surface traffic automation system that will help controllers to better coordinate surface traffic movements related to arrival and departure traffic. More specifically, we describe the concept for a taxi-planning support tool
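The flavor of such a surface-traffic optimization can be conveyed with a toy version. This brute-force search over discrete pushback delays stands in for the paper's mixed-integer linear program, and all times and separations are invented:

```python
from itertools import product

# Toy surface-traffic plan: choose pushback delays (minutes) for three
# aircraft sharing one runway crossing; all numbers are illustrative.
crossing_time = {"AC1": 4, "AC2": 5, "AC3": 6}   # taxi time to the crossing
SEP = 2                                           # required separation (min)
DELAYS = range(0, 8)                              # candidate pushback delays

def feasible(plan):
    """Check that all crossing times are separated by at least SEP minutes."""
    t = sorted(plan[a] + crossing_time[a] for a in plan)
    return all(t[i + 1] - t[i] >= SEP for i in range(len(t) - 1))

# Minimize total pushback delay over all feasible plans
best = min(
    (dict(zip(crossing_time, d)) for d in product(DELAYS, repeat=3)
     if feasible(dict(zip(crossing_time, d)))),
    key=lambda p: sum(p.values()),
)
print(best, "total delay:", sum(best.values()))
```

A real MILP formulation replaces the enumeration with binary sequencing variables and linear separation constraints, which scales to realistic traffic volumes.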

  1. Strategy of arm movement control is determined by minimization of neural effort for joint coordination.

    Science.gov (United States)

    Dounskaia, Natalia; Shimansky, Yury

    2016-06-01

    Optimality criteria underlying organization of arm movements are often validated by testing their ability to adequately predict hand trajectories. However, kinematic redundancy of the arm allows production of the same hand trajectory through different joint coordination patterns. We therefore consider movement optimality at the level of joint coordination patterns. A review of studies of multi-joint movement control suggests that a 'trailing' pattern of joint control is consistently observed during which a single ('leading') joint is rotated actively and interaction torque produced by this joint is the primary contributor to the motion of the other ('trailing') joints. A tendency to use the trailing pattern whenever the kinematic redundancy is sufficient and increased utilization of this pattern during skillful movements suggests optimality of the trailing pattern. The goal of this study is to determine the cost function minimization of which predicts the trailing pattern. We show that extensive experimental testing of many known cost functions cannot successfully explain optimality of the trailing pattern. We therefore propose a novel cost function that represents neural effort for joint coordination. That effort is quantified as the cost of neural information processing required for joint coordination. We show that a tendency to reduce this 'neurocomputational' cost predicts the trailing pattern and that the theoretically developed predictions fully agree with the experimental findings on control of multi-joint movements. Implications for future research of the suggested interpretation of the trailing joint control pattern and the theory of joint coordination underlying it are discussed.

  2. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    Barradas, N.P.; Vieira, A.; Patricio, R.

    2002-01-01

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate, and it is foreseeable that the method developed in this work can be applied to many other systems. The algorithm presented is a push-button black box and does not require any human intervention; it is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained and determines the desired parameters, so the method is also suited for automated analysis of the data. The algorithm presented can be easily extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could themselves be automatically generated, which would be suited for automated generation of the required computer code. RBS could thus be performed without experimentalists, data analysts, or programmers, with only technicians to keep the machines running.

  3. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases ¹⁵N–¹H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  4. Civil Engineering and Building Service Topographic Permanent Landmarks Network. Spatial Coordinate Optimization

    Directory of Open Access Journals (Sweden)

    Lepadatu Daniel

    2016-06-01

    Full Text Available Sustainable development is a modern concept of adapting conditions to achieve objectives that respond simultaneously to at least three major requirements: economic, social and environmental. Sustainable development cannot be achieved without a change in people's mentality and without communities able to use resources rationally and efficiently. To support practical surveying applications in the topography course, the students designed and established a network of permanent local topographic landmarks to serve as reference points for computing the rectangular coordinates required in their exercises. In order to obtain more accurate values of these coordinates, we performed several types of measurements, which are presented in detail in this work.

  5. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  6. Adaptive protection coordination scheme for distribution network with distributed generation using ABC

    Directory of Open Access Journals (Sweden)

    A.M. Ibrahim

    2016-09-01

    Full Text Available This paper presents an adaptive protection coordination scheme for optimal coordination of directional overcurrent relays (DOCRs) in interconnected power networks with the impact of DG; the coordination technique used is the Artificial Bee Colony (ABC) algorithm. The scheme adapts to system changes: new relay settings are obtained as the generation level or system topology changes. The developed adaptive scheme is applied to the IEEE 30-bus test system for both single- and multi-DG existence, where results are shown and discussed.
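A minimal sketch of ABC-based relay coordination, assuming a two-relay primary/backup pair, an IEC standard-inverse time curve, and invented fault-current multiples; none of these settings are taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-relay toy: primary/backup DOCR pair, IEC standard-inverse curve.
# t = TDS * 0.14 / (M**0.02 - 1); M = fault-current multiple (fixed here).
K = 0.14 / (np.array([5.0, 3.0]) ** 0.02 - 1)   # time factors for M=5, M=3
CTI = 0.3                                        # coordination time interval (s)
LB, UB = 0.05, 1.1                               # TDS bounds

def cost(tds):
    """Sum of operating times plus a penalty for violating the CTI margin."""
    t = tds * K                                  # operating times [primary, backup]
    penalty = 1e3 * max(0.0, CTI - (t[1] - t[0]))
    return t.sum() + penalty

def abc_minimize(n_food=20, iters=200, limit=20):
    foods = rng.uniform(LB, UB, (n_food, 2))
    costs = np.array([cost(f) for f in foods])
    trials = np.zeros(n_food)
    for _ in range(iters):
        for phase in ("employed", "onlooker"):
            if phase == "employed":
                order = range(n_food)
            else:  # onlookers revisit food sources in proportion to fitness
                fit = 1.0 / (1.0 + costs)
                order = rng.choice(n_food, n_food, p=fit / fit.sum())
            for i in order:
                k = rng.integers(n_food)         # random partner source
                j = rng.integers(2)              # random dimension to perturb
                cand = foods[i].copy()
                cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
                cand = np.clip(cand, LB, UB)
                c = cost(cand)
                if c < costs[i]:
                    foods[i], costs[i], trials[i] = cand, c, 0
                else:
                    trials[i] += 1
        # scouts: abandon exhausted sources and re-explore randomly
        for i in np.where(trials > limit)[0]:
            foods[i] = rng.uniform(LB, UB, 2)
            costs[i] = cost(foods[i])
            trials[i] = 0
    best = costs.argmin()
    return foods[best], costs[best]

tds, c = abc_minimize()
print("TDS:", np.round(tds, 3), "cost:", round(c, 3))
```

An adaptive scheme, as described above, would rerun an optimization like this whenever the generation level or topology changes and push the new TDS settings to the relays.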

  7. Residue-based Coordinated Selection and Parameter Design of Multiple Power System Stabilizers (PSSs)

    DEFF Research Database (Denmark)

    Su, Chi; Hu, Weihao; Fang, Jiakun

    2013-01-01

    data from time domain simulations. Then a coordinated approach for multiple PSS selection and parameter design based on residue method is proposed and realized in MATLAB m-files. Particle swarm optimization (PSO) is adopted in the coordination process. The IEEE 39-bus New England system model...

  8. Automating the production of high-purity zirconium from waste products in industrial furnaces SKB-5025 and apparatuses TS-40M

    International Nuclear Information System (INIS)

    Lavrikov, S.A.; Kotsar', M.L.; Lapidus, A.O.; Akhtonov, S.G.; Aleksandrov, A.V.; Ogorodnikov, L.V.; Chernyshev, A.A.; Kopysov, N.V.

    2014-01-01

    A disadvantage of iodide refining of zirconium in industrial furnaces during the processing of production waste at JSC CMP is the low direct yield of metal into iodide rods and the high energy consumption of the process. The aim of this work is to optimize the process by means of automated control. The paper deals with the creation of a test unit to automate the iodide refining of zirconium at JSC CMP. The main features of the unit, its hardware and software, and the results of its operation are described. A scheme for automating 10 SKB-5025 furnaces, optimizing the total cost of computing equipment and improving the software, was proposed and implemented in 2012.

  9. Operation optimization of distributed generation using artificial intelligent techniques

    Directory of Open Access Journals (Sweden)

    Mahmoud H. Elkazaz

    2016-06-01

    Full Text Available Future smart grids will require an observable, controllable and flexible network architecture for reliable and efficient energy delivery. The use of artificial intelligence and advanced communication technologies is essential in building a fully automated system. This paper introduces a new technique for online optimal operation of distributed generation (DG) resources, i.e. a hybrid fuel cell (FC) and photovoltaic (PV) system for residential applications. The proposed technique aims to minimize the total daily operating cost of a group of residential homes by managing the operation of the embedded DG units remotely from a control centre. The goal is formulated as an objective function that is solved using a genetic algorithm (GA) optimization technique. The optimal settings of the DG units obtained from the optimization process are sent to each DG unit through a fully automated system. The results show that the proposed technique succeeded in defining the optimal operating points of the DGs, which directly affect the total operating cost of the entire system.
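A toy version of the GA-based cost minimization described above. The prices, demand, and PV profiles are invented, and the GA is a generic elitist scheme rather than the paper's implementation:

```python
import random

random.seed(7)

# Toy setting: one home, 4 hours. PV output and demand (kW) are fixed;
# the GA picks the fuel-cell output per hour. All prices are illustrative.
demand = [3.0, 4.0, 5.0, 4.0]
pv     = [0.0, 1.5, 2.5, 1.0]
FC_MAX, FC_COST, GRID_COST = 3.0, 0.08, 0.20   # kW limit, $/kWh, $/kWh

def cost(fc):
    """Fuel-cell cost plus cost of grid imports covering the residual demand."""
    grid = [max(0.0, d - p - f) for d, p, f in zip(demand, pv, fc)]
    return sum(f * FC_COST for f in fc) + sum(g * GRID_COST for g in grid)

def ga(pop_size=30, gens=60, mut=0.3):
    pop = [[random.uniform(0, FC_MAX) for _ in demand] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]           # keep the cheaper half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            if random.random() < mut:                       # point mutation
                i = random.randrange(len(child))
                child[i] = min(FC_MAX, max(0.0, child[i] + random.gauss(0, 0.5)))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
print([round(f, 2) for f in best], round(cost(best), 3))
```

Since the FC is cheaper per kWh than the grid here, the optimum is to cover the residual (demand minus PV) with the fuel cell, and the GA converges toward that schedule.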

  10. Adaptive algorithm of selecting optimal variant of errors detection system for digital means of automation facility of oil and gas complex

    Science.gov (United States)

    Poluyan, A. Y.; Fugarov, D. D.; Purchina, O. A.; Nesterchuk, V. V.; Smirnova, O. V.; Petrenkova, S. B.

    2018-05-01

    The problems associated with detecting errors in the digital equipment (DE) systems that automate explosive facilities of the oil and gas complex are highly relevant today. The problem is especially acute for facilities where a loss of DE accuracy will inevitably lead to man-made disasters and substantial material damage; at such facilities, diagnostics of DE operating accuracy is one of the main elements of the industrial safety management system. This work addresses the problem of selecting the optimal variant of an error detection system according to a validation criterion. Known methods for solving such problems have exponential complexity estimates. Therefore, to reduce solution time, the validation criterion is implemented as an adaptive bionic algorithm. Bionic algorithms (BA) have proven effective in solving optimization problems; their advantages include adaptability, learning ability, parallelism, and the ability to build hybrid systems based on their combination [1].

  11. Automated Axis Alignment for a Nanomanipulator inside SEM and Its Error Optimization

    Directory of Open Access Journals (Sweden)

    Chao Zhou

    2017-01-01

    Full Text Available In probing nanostructures, positions and movements are repeated frequently and the tolerance for position error is stringent. Consistency between the axes of the manipulator and of the image is very significant, since visual servoing is the most important tool in automated manipulation. This paper proposes an automated axis alignment method for a nanomanipulator inside the SEM that recognizes the position of the closed-loop-controlled end-effector, characterizes the relationship between the two axes, and then calculates the rotation matrix accordingly. The error of this method and its transfer function are also calculated to compare the iteration method and the averaging method. The method accelerates axis alignment and thereby avoids electron-beam-induced deposition on the end tips. Experiments demonstrate that it achieves 0.1-degree precision in 90 seconds.
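The core step of recovering a rotation matrix from matched end-effector positions can be sketched as a least-squares (Kabsch-type) fit. The misalignment angle and point set below are invented, and this is a generic estimator, not the paper's specific method:

```python
import numpy as np

def estimate_rotation_2d(stage_pts, image_pts):
    """Least-squares rotation (Kabsch in 2D) mapping manipulator-stage
    coordinates onto image coordinates from matched end-effector positions."""
    P = stage_pts - stage_pts.mean(axis=0)       # center both point sets
    Q = image_pts - image_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))           # guard against reflections
    return (U @ np.diag([1.0, d]) @ Vt).T        # R such that q ~ R p

theta = np.deg2rad(3.2)                          # hypothetical axis misalignment
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
stage = np.array([[0., 0.], [10., 0.], [10., 10.], [0., 10.]])
image = stage @ R_true.T + np.array([2.0, -1.0])  # rotated + translated observations
R = estimate_rotation_2d(stage, image)
angle = np.rad2deg(np.arctan2(R[1, 0], R[0, 0]))
print(round(angle, 2))  # recovers the 3.2-degree misalignment
```

With noisy position measurements, averaging over more matched points reduces the angle estimate's variance, which is the trade-off the abstract's error analysis is concerned with.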

  12. Nonlinear Modeling and Coordinate Optimization of a Semi-Active Energy Regenerative Suspension with an Electro-Hydraulic Actuator

    Directory of Open Access Journals (Sweden)

    Farong Kou

    2018-01-01

    Full Text Available In order to coordinate the damping performance and energy regenerative performance of energy regenerative suspension, this paper proposes a structure of a vehicle semi-active energy regenerative suspension with an electro-hydraulic actuator (EHA. In light of the proposed concept, a specific energy regenerative scheme is designed and a mechanical properties test is carried out. Based on the test results, the parameter identification for the system model is conducted using a recursive least squares algorithm. On the basis of the system principle, the nonlinear model of the semi-active energy regenerative suspension with an EHA is built. Meanwhile, linear-quadratic-Gaussian control strategy of the system is designed. Then, the influence of the main parameters of the EHA on the damping performance and energy regenerative performance of the suspension is analyzed. Finally, the main parameters of the EHA are optimized via the genetic algorithm. The test results show that when a sinusoidal is input at the frequency of 2 Hz and the amplitude of 30 mm, the spring mass acceleration root meam square value of the optimized EHA semi-active energy regenerative suspension is reduced by 22.23% and the energy regenerative power RMS value is increased by 40.51%, which means that while meeting the requirements of vehicle ride comfort and driving safety, the energy regenerative performance is improved significantly.

  13. 23rd International Conference on Flexible Automation & Intelligent Manufacturing

    CERN Document Server

    2013-01-01

    The proceedings include the set of revised papers from the 23rd International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2013). This conference aims to provide an international forum for the exchange of leading-edge scientific knowledge and industrial experience regarding the development and integration of the various aspects of Flexible Automation and Intelligent Manufacturing Systems, covering the complete life-cycle of a company’s products and processes. Contents include topics such as: Product, Process and Factory Integrated Design, Manufacturing Technology and Intelligent Systems, Manufacturing Operations Management and Optimization, and Manufacturing Networks and MicroFactories.

  14. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    . That has left the industry in constant pursuit of possibilities for integration of the tool within the Building Information Modelling environment, so that the potential provided by the latter can be harvested and the processes can be optimized. This paper presents a solution for automated data extraction...

  15. Expected Improvements in Work Truck Efficiency Through Connectivity and Automation

    Energy Technology Data Exchange (ETDEWEB)

    Walkowicz, Kevin A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-03-12

    This presentation focuses on the potential impact of connected and automated technologies on commercial vehicle operations. It includes topics such as the U.S. Department of Energy's Energy Efficient Mobility Systems (EEMS) program and the Systems and Modeling for Accelerated Research in Transportation (SMART) Mobility Initiative. It also describes National Renewable Energy Laboratory (NREL) research findings pertaining to the potential energy impacts of connectivity and automation and stresses the need for integration and optimization to take advantage of the benefits offered by these transformative technologies while mitigating the potential negative consequences.

  16. Coordination control of distributed systems

    CERN Document Server

    Villa, Tiziano

    2015-01-01

    This book describes how control of distributed systems can be advanced by an integration of control, communication, and computation. The global control objectives are met by judicious combinations of local and nonlocal observations taking advantage of various forms of communication exchanges between distributed controllers. Control architectures are considered according to  increasing degrees of cooperation of local controllers:  fully distributed or decentralized control,  control with communication between controllers,  coordination control, and multilevel control.  The book covers also topics bridging computer science, communication, and control, like communication for control of networks, average consensus for distributed systems, and modeling and verification of discrete and of hybrid systems. Examples and case studies are introduced in the first part of the text and developed throughout the book. They include: control of underwater vehicles, automated-guided vehicles on a container terminal, contro...

  17. Health Care Reform, Care Coordination, and Transformational Leadership.

    Science.gov (United States)

    Steaban, Robin Lea

    2016-01-01

    This article is meant to spur debate on the role of the professional nurse in care coordination as well as the role of nursing leaders for defining and leading to a future state. This work highlights the opportunity and benefits associated with transformation of professional nursing practice in response to the mandates of the Affordable Care Act of 2010. An understanding of core concepts and the work of care coordination are used to propose a model of care coordination based on the population health pyramid. This maximizes the roles of nurses across the continuum as transformational leaders in the patient/family and nursing relationship. The author explores the role of the nurse in a transactional versus transformational relationship with patients, leading to actualization of the nurse in care coordination. Focusing on the role of the nurse leader, the challenges and necessary actions for optimization of the professional nurse role are explored, using principles of transformational leadership.

  18. Automated measurement of stent strut coverage in intravascular optical coherence tomography

    Science.gov (United States)

    Ahn, Chi Young; Kim, Byeong-Keuk; Hong, Myeong-Ki; Jang, Yangsoo; Heo, Jung; Joo, Chulmin; Seo, Jin Keun

    2015-02-01

    Optical coherence tomography (OCT) is a non-invasive, cross-sectional imaging modality that has become a prominent imaging method in percutaneous intracoronary intervention. We present an automated detection algorithm for stent strut coordinates and coverage in OCT images. The algorithm for stent strut detection is composed of a coordinate transformation from the polar to the Cartesian domains and application of second derivative operators in the radial and the circumferential directions. Local region-based active contouring was employed to detect lumen boundaries. We applied the method to OCT pullback images acquired from human patients in vivo to quantitatively measure stent strut coverage. The validation studies against manual expert assessments demonstrated high Pearson's coefficients (R = 0.99) in terms of the stent strut coordinates, with no significant bias. An averaged Hausdorff distance of < 120 μm was obtained for vessel border detection. Quantitative comparison in stent strut to vessel wall distance found a bias of < 12.3 μm and a 95% confidence of < 110 μm.
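    The strut-detection step described above rests on simple image operators: after mapping the polar OCT frame to Cartesian coordinates, a bright strut bloom produces a strongly negative discrete second derivative along the radial intensity profile. A toy one-dimensional illustration of that detector (the profile data and function name are hypothetical; the paper applies the operator in both radial and circumferential directions):

```python
def strut_index(profile):
    """Locate the brightest peak in a radial intensity profile as the sample
    with the most negative discrete second derivative (a strut bloom)."""
    d2 = [profile[i - 1] - 2 * profile[i] + profile[i + 1]
          for i in range(1, len(profile) - 1)]
    # d2[k] corresponds to profile[k + 1], hence the offset below
    return min(range(len(d2)), key=d2.__getitem__) + 1
```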

  19. A cost-effective intelligent robotic system with dual-arm dexterous coordination and real-time vision

    Science.gov (United States)

    Marzwell, Neville I.; Chen, Alexander Y. K.

    1991-01-01

    Dexterous coordination of manipulators based on the use of redundant degrees of freedom, multiple sensors, and built-in robot intelligence represents a critical breakthrough in development of advanced manufacturing technology. A cost-effective approach for achieving this new generation of robotics has been made possible by the unprecedented growth of the latest microcomputer and network systems. The resulting flexible automation offers the opportunity to improve the product quality, increase the reliability of the manufacturing process, and augment the production procedures for optimizing the utilization of the robotic system. Moreover, the Advanced Robotic System (ARS) is modular in design and can be upgraded by closely following technological advancements as they occur in various fields. This approach to manufacturing automation enhances the financial justification and ensures the long-term profitability and most efficient implementation of robotic technology. The new system also addresses a broad spectrum of manufacturing demand and has the potential to address both complex jobs as well as highly labor-intensive tasks. The ARS prototype employs the decomposed optimization technique in spatial planning. This technique is implemented to the framework of the sensor-actuator network to establish the general-purpose geometric reasoning system. The development computer system is a multiple microcomputer network system, which provides the architecture for executing the modular network computing algorithms. The knowledge-based approach used in both the robot vision subsystem and the manipulation control subsystems results in the real-time image processing vision-based capability. The vision-based task environment analysis capability and the responsive motion capability are under the command of the local intelligence centers. An array of ultrasonic, proximity, and optoelectronic sensors is used for path planning. 
The ARS currently has 18 degrees of freedom made up by two

  20. Automated IMRT planning with regional optimization using planning scripts.

    Science.gov (United States)

    Xhaferllari, Ilma; Wong, Eugene; Bzdusek, Karl; Lock, Michael; Chen, Jeff

    2013-01-07

    Intensity-modulated radiation therapy (IMRT) has become a standard technique in radiation therapy for treating different types of cancers. Various class solutions have been developed to generate IMRT plans efficiently for simple cases (e.g., localized prostate, whole breast). However, for more complex cases (e.g., head and neck, pelvic nodes), generating optimized IMRT plans can be time-consuming for a planner. For these cases, which generally have multiple target volumes and organs at risk, producing an optimal plan often requires adding auxiliary IMRT optimization structures such as dose-limiting rings, adjusting the beam geometry, selecting inverse-planning objectives and their associated weights, and adding further IMRT objectives to reduce cold and hot spots in the dose distribution. These parameters are generally adjusted manually, by repeated trial and error, during the optimization process. To improve IMRT planning efficiency in these more complex cases, an iterative method that performs some of these adjustments automatically in a planning script is designed, implemented, and validated. In particular, regional optimization is implemented iteratively to reduce hot and cold spots during optimization: hot and cold spots are defined and automatically segmented, new objectives and their relative weights are introduced into the inverse planning, and the process is repeated until termination criteria are met. The method has been applied to three clinical sites (prostate with pelvic nodes, head and neck, and anal canal cancers) and has been shown to reduce IMRT planning time significantly while improving plan quality. The IMRT planning scripts have been used for more than 500 clinical cases.

  1. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed till date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.

  2. Automated reasoning in man-machine control systems

    International Nuclear Information System (INIS)

    Stratton, R.C.; Lusk, E.L.

    1983-01-01

    This paper describes a project being undertaken at Argonne National Laboratory to demonstrate the usefulness of automated reasoning techniques in the implementation of a man-machine control system being designed at the EBR-II nuclear power plant. It is shown how automated reasoning influences the choice of optimal roles for both man and machine in the system control process, both for normal and off-normal operation. In addition, the requirements imposed by such a system for a rigorously formal specification of operating states, subsystem states, and transition procedures have a useful impact on the analysis phase. The definitions and rules are discussed for a prototype system which is physically simple yet illustrates some of the complexities inherent in real systems

  3. Automated solid-phase peptide synthesis to obtain therapeutic peptides

    Directory of Open Access Journals (Sweden)

    Veronika Mäde

    2014-05-01

    Full Text Available The great versatility and the inherent high affinities of peptides for their respective targets have led to tremendous progress in therapeutic applications in recent years. In order to increase the druggability of these frequently unstable and rapidly cleared molecules, chemical modifications are of great interest. Automated solid-phase peptide synthesis (SPPS) offers a suitable technology to produce chemically engineered peptides. This review concentrates on the application of SPPS by the Fmoc/t-Bu protecting-group strategy, which is most commonly used. Critical issues and suggestions for the synthesis are covered. The development of automated methods from conventional to essentially improved microwave-assisted instruments is discussed. In order to improve pharmacokinetic properties of peptides, lipidation and PEGylation are described as covalent conjugation methods, which can be applied by a combination of automated and manual synthesis approaches. The synthesis and application of SPPS is described for neuropeptide Y receptor analogs as an example for bioactive hormones. The applied strategies represent innovative and potent methods for the development of novel peptide drug candidates that can be manufactured with optimized automated synthesis technologies.

  4. Designs for thermal harvesting with nonlinear coordinate transformation

    Science.gov (United States)

    Ji, Qingxiang; Fang, Guodong; Liang, Jun

    2018-04-01

    In this paper a thermal concentrating design method was proposed based on the concept of generating function without knowing the needed coordinate transformation beforehand. The thermal harvesting performance was quantitatively characterized by heat concentrating efficiency and external temperature perturbation. Nonlinear transformations of different forms were employed to design high order thermal concentrators, and corresponding harvesting performances were investigated by numerical simulations. The numerical results show that the form of coordinate transformation directly influences the distributions of heat flows inside the concentrator and, consequently, influences the thermal harvesting behaviors significantly. The concentrating performance can be actively controlled and optimized by changing the form of coordinate transformations. The analysis in this paper offers a beneficial method to flexibly tune the harvesting performance of the thermal concentrator according to the requirements of practical applications.

  5. Client application for automated management training system of NPP personnel

    International Nuclear Information System (INIS)

    Pribysh, P.I.; Poplavskij, I.A.; Karpej, A.L.

    2016-01-01

    This paper describes the client side of an automated management training system. The system will speed up the organization of training, improve the quality of the training plan, reduce the time needed to collect the necessary documentation, and facilitate the analysis of the results. (authors)

  6. Distributed EMPC of multiple microgrids for coordinated stochastic energy management

    International Nuclear Information System (INIS)

    Kou, Peng; Liang, Deliang; Gao, Lin

    2017-01-01

    Highlights:
    • Reducing the system-wide operating cost compared to the no-cooperation energy management strategy.
    • Maintaining the supply and demand balance within each microgrid.
    • Handling the uncertainties in both supply and demand.
    • Converting the stochastic optimization problems to standard quadratic and linear programming problems.
    • Achieving a good balance between control performance and computational feasibility.
    Abstract: The concept of multi-microgrids has the potential to improve the reliability and economic performance of a distribution system. To realize this potential, coordination among multiple microgrids is needed. In this context, this paper presents a new distributed economic model predictive control scheme for the coordinated stochastic energy management of multi-microgrids. By optimally coordinating the operation of individual microgrids, this scheme maintains the system-wide supply and demand balance in an economical manner. Based on probabilistic forecasts of renewable power generation and microgrid load, the scheme effectively handles the uncertainties in both supply and demand. Using the Chebyshev inequality and the Delta method, the corresponding stochastic optimization problems are converted to quadratic and linear programs. The proposed scheme is evaluated on a large-scale case that includes ten interconnected microgrids. The results indicate that the proposed scheme successfully reduces the system-wide operating cost, achieves the supply-demand balance in each microgrid, and brings the energy exchange between the DNO and the main grid to a predefined trajectory.
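    The conversion from stochastic to deterministic constraints in such schemes is typically done by tightening limits with a distribution-free bound. A sketch of that step using the one-sided Chebyshev (Cantelli) inequality (illustrative only; the function name is hypothetical, and the paper combines the bound with the Delta method):

```python
import math

def tightened_limit(limit, sigma, eps):
    """Return the deterministic bound on the mean so that P(X > limit) <= eps
    holds for ANY distribution with standard deviation sigma, by the one-sided
    Chebyshev inequality: require E[X] <= limit - sigma * sqrt((1 - eps) / eps)."""
    return limit - sigma * math.sqrt((1.0 - eps) / eps)
```

    With eps = 0.5 the safety margin is exactly one standard deviation; the smaller the allowed violation probability, the further the mean must stay from the limit.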

  7. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  8. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  9. An integration weighting method to evaluate extremum coordinates

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.

    1990-01-01

    The numerical version of the Laplace asymptotics has been used to evaluate the coordinates of extrema of multivariate continuous and discontinuous test functions. Computer experiments demonstrate the high efficiency of the proposed integration method. The saturating dependence of the extremum coordinates on parameters such as the number of integration subregions and the exponent K, which theoretically goes to infinity, has been studied in detail; the limitand is a ratio of two Laplace integrals with the integrand exponentiated by K. The given method is an integral equivalent of the method of weighted means. In contrast to standard optimization methods of zeroth, first, and second order, the proposed method can also successfully optimize discontinuous objective functions. It is applicable in cases where conventional techniques fail due to poor analytical properties of the objective function near extremal points, and it is efficient in searching for both local and global extrema of multimodal objective functions. 12 refs.; 4 tabs
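    The underlying estimate can be written as a ratio of two Laplace-type integrals: for large K, x* ≈ ∫ x e^{K f(x)} dx / ∫ e^{K f(x)} dx concentrates at the maximizer of f. A one-dimensional sketch evaluated by a simple Riemann sum (hypothetical parameter choices, not the paper's implementation), which notably also works for discontinuous f:

```python
import math

def extremum_coordinate(f, a, b, K=200.0, n=20000):
    """Estimate the argmax of f on [a, b] as the ratio of two Laplace-type
    integrals: x* ~ Int x e^{K f(x)} dx / Int e^{K f(x)} dx."""
    h = (b - a) / n
    fmax = max(f(a + i * h) for i in range(n + 1))  # rescale to avoid overflow
    num = den = 0.0
    for i in range(n + 1):
        x = a + i * h
        w = math.exp(K * (f(x) - fmax))  # weight concentrates near the maximizer
        num += x * w
        den += w
    return num / den
```

    The ratio behaves like a weighted mean whose weights collapse onto the maximizer as K grows, which is the "integral equivalent of the method of weighted means" mentioned in the abstract.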

  10. Let the experts decide? Asymmetric information, abstention, and coordination in standing committees

    DEFF Research Database (Denmark)

    Morton, Rebecca B.; Tyran, Jean-Robert

    2011-01-01

    the asymmetry in information quality is large, we find that voting groups largely coordinate on the SVC equilibrium which is also Pareto optimal. However, we find that when the asymmetry in information quality is not large and the Pareto optimal equilibrium is for all to participate, significant numbers...

  11. Chemical Reactor Automation as a way to Optimize a Laboratory Scale Polymerization Process

    Science.gov (United States)

    Cruz-Campa, Jose L.; Saenz de Buruaga, Isabel; Lopez, Raymundo

    2004-10-01

    The automation of the registration and control of the variables involved in a chemical reactor improves the reaction process by making it faster, optimized, and free of human error. The objective of this work is to register and control the variables involved in an emulsion polymerization reaction (temperatures, reagent fluxes, weights, etc.). The programs and control algorithms were developed in the G language in LabVIEW®. The designed software sends and receives RS-232 encoded data between a personal computer and the devices (pumps, temperature sensors, mixer, balances, and so on). The transduction from digital information to movement or measurement actions of the devices is done by electronic components included in the devices. Once the programs were completed and tested, emulsion polymerization reactions were run to validate the system. Moreover, advanced heat-estimation algorithms were implemented in order to determine the heat released by the reaction and to estimate and control chemical variables in-line. All the information obtained from the reaction is stored in the PC and is then available for any commercial data-processing software. The system is now being used in a research center to run emulsion polymerizations under efficient and controlled conditions with reproducible results. The experience gained in this project may be used in the implementation of chemical estimation algorithms at pilot-plant or industrial scale.

  12. 11th International Conference on Informatics in Control, Automation and Robotics

    CERN Document Server

    Gusikhin, Oleg; Madani, Kurosh; Sasiadek, Jurek

    2016-01-01

    The present book includes a set of selected extended papers from the 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2014), held in Vienna, Austria, from 1 to 3 September 2014. The conference brought together researchers, engineers and practitioners interested in the application of informatics to Control, Automation and Robotics. Four simultaneous tracks were held, covering Intelligent Control Systems, Optimization, Robotics, Automation, Signal Processing, Sensors, Systems Modelling and Control, and Industrial Engineering, Production and Management. Informatics applications are pervasive in many areas of Control, Automation and Robotics. ICINCO 2014 received 301 submissions, from 49 countries, in all continents. After a double blind paper review performed by the Program Committee, 20% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, ba...

  13. 12th International Conference on Informatics in Control, Automation and Robotics

    CERN Document Server

    Madani, Kurosh; Gusikhin, Oleg; Sasiadek, Jurek

    2016-01-01

    The present book includes a set of selected extended papers from the 12th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in the application of informatics to Control, Automation and Robotics. Four simultaneous tracks were held, covering Intelligent Control Systems, Optimization, Robotics, Automation, Signal Processing, Sensors, Systems Modelling and Control, and Industrial Engineering, Production and Management. Informatics applications are pervasive in many areas of Control, Automation and Robotics. ICINCO 2015 received 214 submissions, from 42 countries, in all continents. After a double blind paper review performed by the Program Committee, 14% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based ...

  14. Automated Quantum Mechanical Predictions of Enantioselectivity in a Rhodium-Catalyzed Asymmetric Hydrogenation.

    Science.gov (United States)

    Guan, Yanfei; Wheeler, Steven E

    2017-07-24

    A computational toolkit (AARON: An automated reaction optimizer for new catalysts) is described that automates the density functional theory (DFT) based screening of chiral ligands for transition-metal-catalyzed reactions with well-defined reaction mechanisms but multiple stereocontrolling transition states. This is demonstrated for the Rh-catalyzed asymmetric hydrogenation of (E)-β-aryl-N-acetyl enamides, for which a new C2-symmetric phosphorus ligand is designed. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Optimal Attitude Estimation and Filtering Without Using Local Coordinates Part I: Uncontrolled and Deterministic Attitude Dynamics

    OpenAIRE

    Sanyal, Amit K.

    2005-01-01

    There are several attitude estimation algorithms in existence, all of which use local coordinate representations for the group of rigid body orientations. All local coordinate representations of the group of orientations have associated problems. While minimal coordinate representations exhibit kinematic singularities for large rotations, the quaternion representation requires satisfaction of an extra constraint. This paper treats the attitude estimation and filtering problem as an optimizati...

  16. SUPPLY CHAIN COORDINATION WITH UNCERTAINTY IN TWO-ECHELON YIELDS

    OpenAIRE

    HONGJUN PENG; MEIHUA ZHOU; LING QIAN

    2013-01-01

    This paper studies coordination models for a supply chain with uncertain two-echelon yields and random demand. We analyzed three contracts: revenue sharing (RS), overproduction risk sharing (OS), and the combination of RS and OS (RO), and contrasted them with the uncoordinated model. We studied the optimal order decision for the downstream manufacturer and the optimal production decision for the upstream manufacturer. Numerical examples were presented to illustrate the results. The study...

  17. Dictionary descent in optimization

    OpenAIRE

    Temlyakov, Vladimir

    2015-01-01

    The problem of convex optimization is studied. Usually in convex optimization the minimization is over a d-dimensional domain. Very often the convergence rate of an optimization algorithm depends on the dimension d. The algorithms studied in this paper utilize dictionaries instead of a canonical basis used in the coordinate descent algorithms. We show how this approach allows us to reduce dimensionality of the problem. Also, we investigate which properties of a dictionary are beneficial for t...

  18. TWO-LEVEL HIERARCHICAL COORDINATION QUEUING METHOD FOR TELECOMMUNICATION NETWORK NODES

    Directory of Open Access Journals (Sweden)

    M. V. Semenyaka

    2014-07-01

    Full Text Available The paper presents a hierarchical coordination queuing method. Within the proposed method, the queuing problem is reduced to solving an optimization problem represented as a two-level hierarchical structure. At the first level, the required distribution of flows and bandwidth allocation is calculated independently for each macro-queue; at the second level, the solutions obtained at the lower level for each queue are coordinated in order to prevent probable overload of network links. A goal-coordination method is defined for managing the multilevel structure, which makes it possible to set the order in which queue-cooperation constraints are considered and to distribute the calculation tasks between the levels of the hierarchy. The decisions are coordinated by the method of Lagrange multipliers. The convergence of the method has been studied by analytical modeling.
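    Goal coordination by Lagrange multipliers can be illustrated with a toy two-level scheme: each queue solves its own subproblem for a given link price, and the upper level adjusts the price until the shared-link capacity constraint is respected (the quadratic costs and all names here are illustrative, not from the paper):

```python
def coordinate_allocation(demands, capacity, step=0.5, iters=500):
    """Two-level goal coordination (dual decomposition): minimize
    sum_i (x_i - d_i)^2 subject to sum_i x_i <= capacity, x_i >= 0.
    Lower level: each queue's subproblem has the closed form x_i = d_i - lam/2.
    Upper level: the price lam is raised while the link is overloaded."""
    lam = 0.0
    for _ in range(iters):
        x = [max(0.0, d - lam / 2.0) for d in demands]    # local optima
        lam = max(0.0, lam + step * (sum(x) - capacity))  # subgradient price update
    return x, lam
```

    With demands (3, 5) and capacity 6 the price converges to lam = 2 and the allocation to (2, 4): each queue gives up half the price, exactly the coordination the upper level enforces without ever seeing the other queue's cost.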

  19. Smart City Platform Development for an Automated Waste Collection System

    Directory of Open Access Journals (Sweden)

    Cicerone Laurentiu Popa

    2017-11-01

    Full Text Available Nowadays, governments and companies are looking for solutions to increase the collection level of various waste types by using new technologies and devices such as smart sensors, the Internet of Things (IoT), cloud platforms, etc. In order to fulfil this need, this paper presents solutions provided by a research project involving the design, development and implementation of fully automated waste collection systems with an increased usage degree, productivity and storage capacity. The paper will focus on the main results of this research project in turning the automated waste collection system into a smart system so that it can be easily integrated in any smart city infrastructure. For this purpose, the Internet of Things platform for the automated waste collection system provided by the project will allow real-time monitoring and communication with central systems. Details about each module are sent to the central systems: module status (working, blocked, needs repairs or maintenance, etc.); equipment status; storage system status (allowing full reports for all waste types); the amount of waste in each module, allowing optimal discharging; route optimization for waste discharging; etc. To do that, we describe here an IoT cloud solution integrating device connection, data processing, analytics and management.

  20. Advances in Automation and Robotics

    CERN Document Server

    International conference on Automation and Robotics ICAR2011

    2012-01-01

    The international conference on Automation and Robotics, ICAR2011, was held during December 12-13, 2011 in Dubai, UAE. The proceedings of ICAR2011 have been published in the Springer Lecture Notes in Electrical Engineering series and include 163 excellent papers selected from more than 400 submissions.   The conference is intended to bring together researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, production and management.   This part of the proceedings includes 81 papers contributed by researchers in the relevant topic areas covered at ICAR2011 from various countries such as France, Japan, USA, Korea and China.     Many papers present recent advanced research work; some give new solutions to problems in the field, with strong evidence and detailed demonstration. Others state the application of their designed and...

  1. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, and a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, by using any conventional optimization algorithm. In the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear chemical stirred tank reactor (CSTR), where its solution is also compared with the ones obtained using the centralized approach. The simulation results show the efficiency and the capability of the proposed hierarchical approach, in finding the optimal solution, over the centralized method.

  2. Distributed Coordination of Electric Vehicle Charging in a Community Microgrid Considering Real-Time Price

    DEFF Research Database (Denmark)

    Li, Chendan; Schaltz, Erik; Quintero, Juan Carlos Vasquez

    2016-01-01

    The predictable increase in EV adoption by residential users imposes the necessity of electric vehicle charging coordination, in order to charge effectively while minimizing the impact on the grid. In this paper, a two-stage distributed coordination algorithm for electric vehicle charging...... management in a community microgrid is proposed. Each local EV charging controller is taken as an agent, which can manage the charging to achieve the optimization of the whole community by communicating over a sparse network. The proposed algorithm aims at real-time optimization, which manages the charging...

  3. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  4. Automation and control trends in the upstream sector of the oil industry

    Energy Technology Data Exchange (ETDEWEB)

    Plucenio, Agustinho; Pagano, Daniel J. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Programa de Recursos Humanos da ANP em Automacao, Controle e Instrumentacao para a Industria do Petroleo e Gas, PRH-34

    2004-07-01

    The need to continuously improve health, safety and environment conditions for operators, installation security, and the optimization of oil reservoir recovery in wells operating with different artificial lift methods and subject to different secondary recovery techniques has motivated the development of automation and control technologies for the upstream sector of the oil industry. While the application of control and automation techniques is well established in the downstream sector of the oil industry, that is not the case in the upstream sector. One tendency in this sector is the utilization of control via fieldbus networks. This technology uses equipment that communicates over a two-wire digital network and can be programmed to execute function block algorithms designed to perform a given control strategy. The most noticeable benefits are improvements in process performance and in equipment reusability and interoperability. Proprietary solutions can be replaced by systems composed of equipment supplied by different manufacturers connected to the same network. This equipment operates according to a strategy designed by automation and control engineers under the supervision of professionals working at computer terminals located in different company departments. Other gains are a better understanding of the industry's processes, the application of optimization techniques, fault detection, equipment maintenance follow-up, and improved operator working conditions and worker qualification. Other tendencies are: permanent well monitoring, either with the installation of downhole sensors based on fiber grating or surface sensors using embedded electronic processors; development of instrumentation technology for low-cost multiphase flow measurement; and application of control techniques for flow regime control and optimization of reservoir recovery through better identification, optimization and Model Based Predictive Control

  5. AIRCRAFT POWER SUPPLY SYSTEM DESIGN PROCESS AS AN AUTOMATION OBJECT

    Directory of Open Access Journals (Sweden)

    Boris V. Zhmurov

    2018-01-01

    aircraft and take into account all the requirements of the customer and the regulatory and technical documentation is its automation. Automation of the design of EPS aircraft as an optimization task involves the formalization of the object of optimization, as well as the choice of the criterion of efficiency and control actions. Under the object of optimization in this case we mean the design process of the EPS, the formalization of which includes formalization of the design object – the aircraft power supply system.

  6. Challenges in Gaining Large Scale Carbon Reductions through Wireless Home Automation Systems

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Rovsing, Poul Ejnar; Toftegaard, Thomas Skjødeberg

    2010-01-01

    Buildings account for more than 35% of the energy consumption in Europe. Therefore a step towards a more sustainable lifestyle is to use home automation to optimize the energy consumption “automatically”. This paper reports on the usage and some of the remaining challenges of especially...... wireless but also powerline communication in a home automation setting. For many years, home automation has been visible to many, but accessible to only a few, because of inadequate integration of systems. A vast number of both standard and proprietary communication protocols are used, and systems...... are often difficult to install and configure, so professional assistance is needed. In this paper we report on our experience in constructing an open universal home automation framework enabling interoperability of multiple communication protocols. The framework can easily be expanded in order to support...

  7. Examining the impact of harmonic correlation on vibrational frequencies calculated in localized coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Hanson-Heine, Magnus W. D., E-mail: magnus.hansonheine@nottingham.ac.uk [School of Chemistry, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom)

    2015-10-28

    Carefully choosing a set of optimized coordinates for performing vibrational frequency calculations can significantly reduce the anharmonic correlation energy from the self-consistent field treatment of molecular vibrations. However, moving away from normal coordinates also introduces an additional source of correlation energy arising from mode-coupling at the harmonic level. The impact of this new component of the vibrational energy is examined for a range of molecules, and a method is proposed for correcting the resulting self-consistent field frequencies by adding the full coupling energy from connected pairs of harmonic and pseudoharmonic modes, termed vibrational self-consistent field (harmonic correlation). This approach is found to lift the vibrational degeneracies arising from coordinate optimization and provides better agreement with experimental and benchmark frequencies than uncorrected vibrational self-consistent field theory without relying on traditional correlated methods.

  8. A network approach to decentralized coordination of energy production-consumption grids.

    Science.gov (United States)

    Omodei, Elisa; Arenas, Alex

    2018-01-01

    Energy grids are facing a relatively new paradigm consisting in the formation of local distributed energy sources and loads that can operate in parallel independently from the main power grid (usually called microgrids). One of the main challenges in microgrid-like networks management is that of self-adapting to the production and demands in a decentralized coordinated way. Here, we propose a stylized model that allows to analytically predict the coordination of the elements in the network, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The methodology proposed opens a new way of confronting the analysis of energy demand-side management in networked systems.
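The local-interaction coordination the abstract describes can be sketched with a simple gossip-averaging model on a ring network; the topology, weights, and values below are illustrative assumptions, not the paper's stylized model:

```python
# Decentralized coordination by local averaging (toy sketch).
# Each node on a ring repeatedly averages its value with its two
# neighbours only; all values converge to the global mean, so
# near-global coordination emerges from purely local interactions.

def gossip_ring(values, rounds=500):
    n = len(values)
    for _ in range(rounds):
        values = [
            (values[(i - 1) % n] + values[i] + values[(i + 1) % n]) / 3.0
            for i in range(n)
        ]
    return values

# Initial production/demand imbalances at six nodes; global mean is 5.0.
vals = gossip_ring([10.0, 0.0, 4.0, 2.0, 9.0, 5.0])
```

Because each update only uses a node's immediate neighbourhood, the scheme avoids the cost of all-to-all communication while still reaching a network-wide agreement, which mirrors the paper's main observation.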

  9. Reduce operational cost and extend the life of pipeline infrastructure by automating remote cathodic protection systems

    Energy Technology Data Exchange (ETDEWEB)

    Rosado, Elroy [Freewave Technologies, Inc., Boulder, CO (United States). Latin America

    2009-07-01

    Energy and Pipeline Companies wrestle to control operating costs largely affected by new government regulations, ageing buried metal assets, rising steel prices, expanding pipeline operations, new interference points, HCA encroachment, restrictive land use policies, heightened network security, and an ageing soon-to-retire work force. With operating costs on the rise, seemingly out of control, many CP and Operations Professionals look to past best practices in cost containment through automation. Many companies achieve solid business results through deployment of telemetry and SCADA automation of remote assets and now hope to expand this success to further optimize operations by automating remote cathodic protection systems. This presentation will provide examples of how new remote cathodic protection systems are helping energy and pipeline companies address the growing issue of the aging pipeline infrastructure and reduce their costs while optimizing their operations. (author)

  10. Introduction of an automated mine surveying system - a method for effective control of mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Mazhdrakov, M.

    1987-04-01

    Reviews developments in automated processing of mine survey data in Bulgaria for 1965-1970. This development has occurred in three phases. In the first phase, computers calculated coordinates of mine survey points; in the second phase, these data were electronically processed; in the third phase, surface and underground mine development is controlled by electronic data processing equipment. Centralized and decentralized electronic processing of data has been introduced at major coal mines. The Bulgarian Pravets 82 microcomputer and the ASMO-MINI program package are in current use at major coal mines. A lack of plotters, due to financial limitations, handicaps large-scale application of automated mine surveying in Bulgaria.

  11. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    Science.gov (United States)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  12. Minimising life cycle costs of automated valves in offshore platforms

    Energy Technology Data Exchange (ETDEWEB)

    Yli-Petays, Juha [Metso Automation do Brasil Ltda., Rio de Janeiro, RJ (Brazil); Niemela, Ismo [Metso Automation, Imatra (Finland)

    2012-07-01

    Automated process valves play an essential role in offshore platform operation. If you are able to optimize their operation and maintenance activities, you can obtain extensive operational savings with minimal investment. Valves used on offshore platforms do not differ much from the valves used in downstream applications, but there are certain particularities which make operations more challenging offshore: process valves are more difficult to access and maintain because of space limitations, and spare part inventories and deliveries are challenging because of the platform's remote location. To overcome these challenges, the usage of digital positioners with diagnostic features has become more common, because predictive maintenance capabilities make it possible to plan maintenance activities and in this way optimize spare part orders for valves. Intelligent controllers are available for control valves, automated on/off valves and ESD valves, and the whole network of automated valves on a platform can be controlled by intelligent valve controllers. This creates many new opportunities from an optimized process performance or predictive maintenance point of view. By means of intelligent valve controllers and predictive diagnostics, condition monitoring and maintenance planning can also be performed remotely from an onshore location. Thus, intelligent valve controllers provide a good way to minimize spending related to the total cost of ownership of automated process valves. When the purchase value of a control valve represents 20% of TCO, an intelligent positioner and predictive maintenance methods can enable savings as high as 30% over the life cycle of the asset, so they can yield savings higher than the whole investment in the monitored asset over its life cycle. This is mainly achieved through optimized maintenance activities, since real-life examples have shown that with a time-based (preventive) maintenance approach 70% of

  13. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Qing [Univ. of Colorado, Colorado Springs, CO (United States); Whaley, Richard Clint [Univ. of Texas, San Antonio, TX (United States); Qasem, Apan [Texas State Univ., San Marcos, TX (United States); Quinlan, Daniel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-11-23

    This report summarizes our effort and results of building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully-automated tuning to semi-automated development and to manual programmable control.

  14. Human-Robot Teaming: Communication, Coordination, and Collaboration

    Science.gov (United States)

    Fong, Terry

    2017-01-01

    In this talk, I will describe how NASA Ames has been studying how human-robot teams can increase the performance, reduce the cost, and increase the success of a variety of endeavors. The central premise of our work is that humans and robots should support one another in order to compensate for limitations of automation and manual control. This principle has broad applicability to a wide range of domains, environments, and situations. At the same time, however, effective human-robot teaming requires communication, coordination, and collaboration -- all of which present significant research challenges. I will discuss some of the ways that NASA Ames is addressing these challenges and present examples of our work involving planetary rovers, free-flying robots, and self-driving cars.

  15. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    will respond to this form of automation for CPP. (4) Evaluate what type of DR shifting and shedding strategies can be automated. (5) Explore how automation of control strategies can increase participation rates and DR saving levels with CPP. (6) Identify optimal demand response control strategies. (7) Determine occupant and tenant response.

  16. Coordinating a Supply Chain with Risk-Averse Agents under Demand and Consumer Returns Uncertainty

    Directory of Open Access Journals (Sweden)

    Jian Liu

    2013-01-01

    Full Text Available This paper examines the optimal order decision in a supply chain facing uncertain demand and uncertain consumer returns. We build a consumer returns model that incorporates the decision-makers’ risk preference within a mean-variance objective framework and discuss the supply chain coordination problem under a wholesale-price-only policy and under the manufacturer’s buyback policy, respectively. We find that, with the wholesale price policy, the supply chain cannot be coordinated whether the supply chain agents are risk-neutral or risk-averse. However, with the buyback policy, the supply chain can be coordinated and the profit of the supply chain can be arbitrarily allocated between the manufacturer and the retailer. Through numerical examples, we illustrate the impact of stochastic consumer returns and of the supply chain agents’ risk attitudes on the optimal order decision.
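The effect of a buyback contract on the retailer's order can be illustrated with the classical risk-neutral newsvendor fractile; the prices and uniform demand below are toy numbers, and the paper's mean-variance treatment adds a risk term not modeled here:

```python
# Risk-neutral newsvendor under a buyback contract (toy numbers).
# Retail price p, wholesale price w, buyback price b, demand ~ Uniform(0, D).
# Critical fractile: F(q*) = (p - w) / (p - b), so q* = D * (p - w) / (p - b).

def optimal_order(p, w, b, demand_max):
    assert p > w > b >= 0
    return demand_max * (p - w) / (p - b)

q_wholesale = optimal_order(p=10.0, w=6.0, b=0.0, demand_max=100.0)  # b = 0: no buyback
q_buyback = optimal_order(p=10.0, w=6.0, b=2.0, demand_max=100.0)    # manufacturer buys back at 2
```

Raising the buyback price lowers the retailer's overage cost and so raises the optimal order (here from 40 to 50 units), which is the lever that lets the manufacturer steer the chain toward the coordinated quantity.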

  17. Autonomous guided vehicles methods and models for optimal path planning

    CERN Document Server

    Fazlollahtabar, Hamed

    2015-01-01

      This book provides readers with extensive information on path planning optimization for both single and multiple Autonomous Guided Vehicles (AGVs), and discusses practical issues involved in advanced industrial applications of AGVs. After discussing previously published research in the field and highlighting the current gaps, it introduces new models developed by the authors with the goal of reducing costs and increasing productivity and effectiveness in the manufacturing industry. The new models address the increasing complexity of manufacturing networks, due for example to the adoption of flexible manufacturing systems that involve automated material handling systems, robots, numerically controlled machine tools, and automated inspection stations, while also considering the uncertainty and stochastic nature of automated equipment such as AGVs. The book discusses and provides solutions to important issues concerning the use of AGVs in the manufacturing industry, including material flow optimization with A...

  18. Anatomy guided automated SPECT renal seed point estimation

    Science.gov (United States)

    Dwivedi, Shekhar; Kumar, Sailendra

    2010-04-01

    Quantification of SPECT (Single Photon Emission Computed Tomography) images can be more accurate if correct segmentation of the region of interest (ROI) is achieved. Segmenting ROIs from SPECT images is challenging due to poor image resolution. SPECT is utilized to study kidney function, and the challenge involved is to accurately locate the kidneys and bladder for analysis. This paper presents an automated method for generating the seed point location of both kidneys using the anatomical locations of the kidneys and bladder. The motivation for this work is the premise that the anatomical location of the bladder relative to the kidneys does not differ much between patients. A model is generated based on manual segmentation of the bladder and both kidneys on 10 patient datasets (including sum and max images). The centroid is estimated for the manually segmented bladder and kidneys. The relatively easier bladder segmentation is followed by feeding the bladder centroid coordinates into the model to generate seed points for the kidneys. The percentage errors observed in organ centroid coordinates between ground truth and the estimates from our approach are acceptable. Percentage errors of approximately 1%, 6% and 2% are observed in the X coordinates and approximately 2%, 5% and 8% in the Y coordinates of the bladder, left kidney and right kidney, respectively. Using a regression model and the location of the bladder, ROI generation for the kidneys is facilitated. The model-based seed point estimation will enhance the robustness of kidney ROI estimation for noisy cases.
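The anatomical-prior idea of predicting a kidney seed from the bladder centroid can be sketched as a mean-offset model fitted on training centroids. All coordinates, function names, and training pairs below are hypothetical stand-ins for the paper's regression model:

```python
# Predict a kidney seed point from the bladder centroid using the mean
# bladder-to-kidney offset observed in training data (illustrative only).

def fit_mean_offset(bladder_centroids, kidney_centroids):
    n = len(bladder_centroids)
    dx = sum(k[0] - b[0] for b, k in zip(bladder_centroids, kidney_centroids)) / n
    dy = sum(k[1] - b[1] for b, k in zip(bladder_centroids, kidney_centroids)) / n
    return dx, dy

def predict_seed(bladder_centroid, offset):
    return bladder_centroid[0] + offset[0], bladder_centroid[1] + offset[1]

# Hypothetical training pairs: bladder centroid -> left-kidney centroid.
bladders = [(64.0, 90.0), (66.0, 92.0), (62.0, 88.0)]
kidneys = [(44.0, 40.0), (46.0, 42.0), (42.0, 38.0)]
offset = fit_mean_offset(bladders, kidneys)  # mean offset (-20.0, -50.0)
seed = predict_seed((65.0, 91.0), offset)    # predicted seed (45.0, 41.0)
```

A real pipeline would regress each kidney's centroid on the bladder centroid across patients rather than use a single averaged offset, but the idea of anchoring the harder organ to the easier one is the same.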

  19. Flexible Measurement of Bioluminescent Reporters Using an Automated Longitudinal Luciferase Imaging Gas- and Temperature-optimized Recorder (ALLIGATOR).

    Science.gov (United States)

    Crosby, Priya; Hoyle, Nathaniel P; O'Neill, John S

    2017-12-13

    Luciferase-based reporters of cellular gene expression are in widespread use for both longitudinal and end-point assays of biological activity. In circadian rhythms research, for example, clock gene fusions with firefly luciferase give rise to robust rhythms in cellular bioluminescence that persist over many days. Technical limitations associated with photomultiplier tubes (PMT) or conventional microscopy-based methods for bioluminescence quantification have typically demanded that cells and tissues be maintained under quite non-physiological conditions during recording, with a trade-off between sensitivity and throughput. Here, we report a refinement of prior methods that allows long-term bioluminescence imaging with high sensitivity and throughput which supports a broad range of culture conditions, including variable gas and humidity control, and that accepts many different tissue culture plates and dishes. This automated longitudinal luciferase imaging gas- and temperature-optimized recorder (ALLIGATOR) also allows the observation of spatial variations in luciferase expression across a cell monolayer or tissue, which cannot readily be observed by traditional methods. We highlight how the ALLIGATOR provides vastly increased flexibility for the detection of luciferase activity when compared with existing methods.

  20. Coordinated Control of PV Generation and EVs Charging Based on Improved DECell Algorithm

    Directory of Open Access Journals (Sweden)

    Guo Zhao

    2015-01-01

    Full Text Available Recently, the coordination of EV charging and renewable energy has become a hot research topic around the globe. Considering the requirements of EV owners and the influence of PV output fluctuation on the power grid, a three-objective optimization model was established by controlling the EV charging power during the charging process. By integrating the meshing method into the differential evolution cellular (DECell) genetic algorithm, an improved differential evolution cellular (IDECell) genetic algorithm was presented to solve the multiobjective optimization model. Compared to NSGA-II and DECell, the IDECell algorithm showed better performance in convergence and uniform distribution. Furthermore, the IDECell algorithm was applied to obtain the Pareto front of nondominated solutions. After normalized sorting of the nondominated solutions, the optimal solution was chosen to arrive at the optimized coordinated control strategy for PV generation and EV charging. Compared to a typical charging pattern, the optimized charging pattern could reduce the fluctuations of the PV generation output power, satisfy the demand for EV charging quantity, and save the total charging cost.
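The final selection step the abstract describes, filtering nondominated solutions and then picking one by normalized sorting, can be sketched on a toy two-objective problem. The sample points and the sum-of-normalized-objectives rule are illustrative assumptions, not the paper's exact criterion:

```python
# Extract the Pareto front (minimization) and pick one compromise solution
# by normalizing each objective to [0, 1] and minimizing the sum.

def pareto_front(points):
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p for q in points
        )
        if not dominated:
            front.append(p)
    return front

def pick_compromise(front):
    lo = [min(p[i] for p in front) for i in range(len(front[0]))]
    hi = [max(p[i] for p in front) for i in range(len(front[0]))]
    def norm_sum(p):
        return sum((p[i] - lo[i]) / ((hi[i] - lo[i]) or 1.0) for i in range(len(p)))
    return min(front, key=norm_sum)

pts = [(1.0, 9.0), (3.0, 4.0), (5.0, 5.0), (8.0, 1.0)]
front = pareto_front(pts)      # (5, 5) is dominated by (3, 4)
best = pick_compromise(front)  # balanced point (3, 4)
```

The same two-step pattern (nondominated filtering, then a scalarized ranking) underlies the IDECell post-processing, with the genetic algorithm supplying the candidate points.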

  1. The PDB_REDO server for macromolecular structure model optimization

    Directory of Open Access Journals (Sweden)

    Robbie P. Joosten

    2014-07-01

    Full Text Available The refinement and validation of a crystallographic structure model is the last step before the coordinates and the associated data are submitted to the Protein Data Bank (PDB). The success of the refinement procedure is typically assessed by validating the models against geometrical criteria and the diffraction data, and is an important step in ensuring the quality of the PDB public archive [Read et al. (2011), Structure, 19, 1395–1412]. The PDB_REDO procedure aims for 'constructive validation', aspiring to consistent and optimal refinement parameterization and pro-active model rebuilding, not only correcting errors but striving for optimal interpretation of the electron density. A web server for PDB_REDO has been implemented, allowing thorough, consistent and fully automated optimization of the refinement procedure in REFMAC and partial model rebuilding. The goal of the web server is to help practicing crystallographers improve their model prior to submission to the PDB. For this, additional steps were implemented in the PDB_REDO pipeline, both in the refinement procedure, e.g. testing of resolution limits and k-fold cross-validation for small test sets, and as new validation criteria, e.g. the density-fit metrics implemented in EDSTATS and ligand validation as implemented in YASARA. Innovative ways to present the refinement and validation results to the user are also described, which together with auto-generated Coot scripts can guide users to subsequent model inspection and improvement. It is demonstrated that using the server can lead to substantial improvement of structure models before they are submitted to the PDB.

  2. An optimization design proposal of automated guided vehicles for mixed type transportation in hospital environments.

    Science.gov (United States)

    González, Domingo; Romero, Luis; Espinosa, María Del Mar; Domínguez, Manuel

    2017-01-01

    The aim of this paper is to present an optimization proposal in the automated guided vehicles design used in hospital logistics, as well as to analyze the impact of its implementation in a real environment. This proposal is based on the design of those elements that would allow the vehicles to deliver an extra cart by the towing method. So, the proposal's intention is to improve the productivity and the performance of the current vehicles by using a transportation method of combined carts. The study has been developed following concurrent engineering premises from three different viewpoints. First, the sequence of operations has been described, and second, a design proposal for the equipment has been undertaken. Finally, the impact of the proposal has been analyzed according to real data from the Hospital Universitario Rio Hortega in Valladolid (Spain). In this particular case, by the implementation of the analyzed proposal in the hospital a reduction of over 35% of the current time of use can be achieved. This result may allow adding new tasks to the vehicles, and according to this, both a new kind of vehicle and a specific module can be developed in order to get a better performance.

  3. An optimization design proposal of automated guided vehicles for mixed type transportation in hospital environments.

    Directory of Open Access Journals (Sweden)

    Domingo González

    Full Text Available The aim of this paper is to present an optimization proposal in the automated guided vehicles design used in hospital logistics, as well as to analyze the impact of its implementation in a real environment. This proposal is based on the design of those elements that would allow the vehicles to deliver an extra cart by the towing method. So, the proposal's intention is to improve the productivity and the performance of the current vehicles by using a transportation method of combined carts. The study has been developed following concurrent engineering premises from three different viewpoints. First, the sequence of operations has been described, and second, a design proposal for the equipment has been undertaken. Finally, the impact of the proposal has been analyzed according to real data from the Hospital Universitario Rio Hortega in Valladolid (Spain). In this particular case, by the implementation of the analyzed proposal in the hospital a reduction of over 35% of the current time of use can be achieved. This result may allow adding new tasks to the vehicles, and according to this, both a new kind of vehicle and a specific module can be developed in order to get a better performance.

  4. Patient Dose Optimization in Fluoroscopically Guided Interventional Procedures. Final Report of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2010-01-01

    In recent years, many surgical procedures have increasingly been replaced by interventional procedures that guide catheters into the arteries under X ray fluoroscopic guidance to perform a variety of operations such as ballooning, embolization, implantation of stents etc. The radiation exposure to patients and staff in such procedures is much higher than in simple radiographic examinations like X ray of chest or abdomen, such that radiation induced skin injuries to patients and eye lens opacities among workers have been reported in the 1990s and after. Interventional procedures have grown both in frequency and importance during the last decade. This Coordinated Research Project (CRP) and TECDOC were developed within the International Atomic Energy Agency's (IAEA) framework of statutory responsibility to provide for the worldwide application of the standards for the protection of people against exposure to ionizing radiation. The CRP took place between 2003 and 2005 in six countries, with a view of optimizing the radiation protection of patients undergoing interventional procedures. The Fundamental Safety Principles and the International Basic Safety Standards for Protection against Ionizing Radiation (BSS) issued by the IAEA and co-sponsored by the Food and Agriculture Organization of the United Nations (FAO), the International Labour Organization (ILO), the World Health Organization (WHO), the Pan American Health Organization (PAHO) and the Nuclear Energy Agency (NEA), among others, require the radiation protection of patients undergoing medical exposures through justification of the procedures involved and through optimization. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients encourages the reduction of patient doses. To facilitate this, it has issued specific advice on the application of the BSS in the field of radiology in Safety Reports Series No. 39 and the three volumes on Radiation

  5. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C{sub 8}MIM]NTf{sub 2}) is formed through the reaction between [C{sub 8}MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf{sub 2}) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL{sup −1}. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL{sup −1}. The proposed

  6. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C 8 MIM]NTf 2 ) is formed through the reaction between [C 8 MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf 2 ) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL −1 . The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL −1 . The proposed method opens a new avenue

  7. Microgrids and distributed generation systems: Control, operation, coordination and planning

    Science.gov (United States)

    Che, Liang

    Distributed Energy Resources (DERs), which include distributed generations (DGs), distributed energy storage systems, and adjustable loads, are key components in microgrid operations. A microgrid is a small electric power system integrated with on-site DERs to serve all or some portion of the local load and connected to the utility grid through the point of common coupling (PCC). Microgrids can operate in both grid-connected mode and island mode. The structure and components of hierarchical control for a microgrid at Illinois Institute of Technology (IIT) are discussed and analyzed. Case studies address the reliable and economic operation of the IIT microgrid. The simulation results of IIT microgrid operation demonstrate that hierarchical control and the coordination strategy for DERs are an effective way of optimizing the economic operation and the reliability of microgrids. The benefits and challenges of DC microgrids are addressed with a DC model for the IIT microgrid. We presented the hierarchical control strategy, including the primary, secondary, and tertiary controls, for the economic operation and resilience of a DC microgrid. The simulation results verify that the proposed coordinated strategy is an effective way of ensuring the resilient response of DC microgrids to emergencies and optimizing their economic operation at steady state. The concept and prototype of a community microgrid that interconnects multiple microgrids in a community are proposed. Two studies are conducted. For coordination, a novel three-level hierarchical coordination strategy to coordinate the optimal power exchanges among neighboring microgrids is proposed. For planning, a multi-microgrid interconnection planning framework using a probabilistic minimal cut-set (MCS) based iterative methodology is proposed for enhancing the economic, resilience, and reliability signals in multi-microgrid operations. The implementation of high-reliability microgrids

  8. About development of automation control systems

    Science.gov (United States)

    Myshlyaev, L. P.; Wenger, K. G.; Ivushkin, K. A.; Makarov, V. N.

    2018-05-01

    The shortcomings of approaches to the development of modern control automation systems are given, together with ways of improving them: the correct formation of objects for study and optimization; the joint synthesis of control objects and control systems; and an increase in the structural diversity of the elements of control systems. Diagrams of control systems with a purposefully variable structure of their elements are presented, as are structures of control algorithms for an object with a purposefully variable structure.

  9. Short run hydrothermal coordination with network constraints using an interior point method

    International Nuclear Information System (INIS)

    Lopez Lezama, Jesus Maria; Gallego Pareja, Luis Alfonso; Mejia Giraldo, Diego

    2008-01-01

    This paper presents a linear optimization model to solve the hydrothermal coordination problem. The main contribution of this work is the inclusion of the network constraints in the hydrothermal coordination problem and its solution using an interior point method. The proposed model allows working with a system that can be completely hydraulic, completely thermal, or mixed. Results are presented for the IEEE 14-bus test system.

  10. Automation of BESSY scanning tables

    International Nuclear Information System (INIS)

    Hanton, J.; Kesteman, J.

    1981-01-01

    A microprocessor M6800 is used for the automation of scanning and premeasuring BESSY tables. The tasks achieved by the microprocessor are: 1. control of the spooling of the four asynchronous film winding devices and switching the 4 projection lamps on and off, 2. pre-processing of the data coming from a bipolar coordinate measuring device, 3. bi-directional interchange of information between the operator, the BESSY table and the DEC PDP 11/34 minicomputer controlling the scanning operations, 4. control of the magnification on the table by swapping the projection lenses of appropriate focal lengths and the associated light boxes (under development). In connection with point 4, a study is being made of the use of BESSY tables for accurate measurements (+/-5 microns), by encoding the displacements of the projection lenses. (orig.)

  11. Concrete Plant Operations Optimization Using Combined Simulation and Genetic Algorithms

    NARCIS (Netherlands)

    Cao, Ming; Lu, Ming; Zhang, Jian-Ping

    2004-01-01

    This work presents a new approach for concrete plant operations optimization by combining a ready mixed concrete (RMC) production simulation tool (called HKCONSIM) with a genetic algorithm (GA) based optimization procedure. A revamped HKCONSIM computer system can be used to automate the simulation

  12. A Clustering Method for Data in Cylindrical Coordinates

    Directory of Open Access Journals (Sweden)

    Kazuhisa Fujita

    2017-01-01

    Full Text Available We propose a new clustering method for data in cylindrical coordinates based on the k-means. The goal of the k-means family is to maximize an optimization function, which requires a similarity. Thus, we need a new similarity to obtain the new clustering method for data in cylindrical coordinates. In this study, we first derive a new similarity for the new clustering method by assuming a particular probabilistic model. A data point in cylindrical coordinates has radius, azimuth, and height. We assume that the azimuth is sampled from a von Mises distribution and the radius and the height are independently generated from isotropic Gaussian distributions. We derive the new similarity from the log likelihood of the assumed probability distribution. Our experiments demonstrate that the proposed method using the new similarity can appropriately partition synthetic data defined in cylindrical coordinates. Furthermore, we apply the proposed method to color image quantization and show that the method successfully quantizes a color image with respect to the hue element.
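The similarity described above can be sketched from the stated model: a von Mises term for the azimuth plus Gaussian terms for radius and height, taken from the log likelihood (constants dropped). This is an illustrative sketch, not the paper's code; `kappa` and `sigma` are assumed hyperparameters, whereas the paper derives them from the data.

```python
import math

def similarity(point, center, kappa=2.0, sigma=1.0):
    """Log-likelihood-style similarity for a point (r, theta, z) in
    cylindrical coordinates against a cluster center, up to constants."""
    r, theta, z = point
    cr, ctheta, cz = center
    gaussian = -((r - cr) ** 2 + (z - cz) ** 2) / (2.0 * sigma ** 2)
    von_mises = kappa * math.cos(theta - ctheta)   # periodic azimuth term
    return gaussian + von_mises

def assign(points, centers, **kw):
    """k-means-style assignment: each point goes to its most similar center."""
    return [max(range(len(centers)),
                key=lambda k: similarity(p, centers[k], **kw)) for p in points]

points = [(1.0, 0.1, 0.0), (1.0, math.pi, 0.1)]
centers = [(1.0, 0.0, 0.0), (1.0, math.pi, 0.0)]
print(assign(points, centers))  # each point matches the center sharing its azimuth
```

The cosine term is what makes the azimuth periodic: angles 0.1 and 2π − 0.1 are treated as close, which a plain Euclidean k-means would get wrong.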

  13. A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM

    Energy Technology Data Exchange (ETDEWEB)

    Le Blanc, Katya; Spielman, Zach; Hill, Rachael

    2017-06-01

    Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds a fundamental tradeoff: increased automation can improve system performance while degrading human performance. There is a need to address the question of how to achieve the performance and efficiency benefits of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper will describe this concept in detail and will describe an experimental test of the concept. The benefits and challenges of the approach will be discussed.

  14. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes methods for building an automation plan and designing automation facilities. It covers automation of chip-producing processes, including the basics of cutting, NC machining and chip handling; automation units such as drilling, tapping, boring, milling and slide units; the application of hydraulics, with its characteristics and basic hydraulic circuits; the application of pneumatics; and the kinds and applications of automation in processing, assembly, transportation, automatic machines and factory automation.

  15. "The Doctor Needs to Know": Acceptability of Smartphone Location Tracking for Care Coordination.

    Science.gov (United States)

    Liss, David T; Serrano, Eloisa; Wakeman, Julie; Nowicki, Christine; Buchanan, David R; Cesan, Ana; Brown, Tiffany

    2018-05-04

    Care coordination can be highly challenging to carry out. When care is fragmented across health systems and providers, there is an increased likelihood of hospital readmissions and wasteful health care spending. During and after care transitions, smartphones have the potential to bolster information transfer and care coordination. However, little research has examined patients' perceptions of using smartphones to coordinate care. This study's primary objective was to explore patient acceptability of a smartphone app that could facilitate care coordination in a safety net setting. Our secondary objective was to identify how clinicians and other members of primary care teams could use this app to coordinate care. This qualitative study was conducted at a federally qualified health center in metropolitan Chicago, IL. We conducted four focus groups (two in English, two in Spanish) with high-risk adults who owned a smartphone and received services from an organizational care management program. We also conducted structured interviews with clinicians and a group interview with care managers. Focus groups elicited patients' perceptions of a smartphone app designed to: (1) identify emergency department (ED) visits and inpatient stays using real-time location data; (2) send automated notifications (ie, alerts) to users' phones, asking whether they were a patient in the hospital; and (3) send automated messages to primary care teams to notify them about patients' confirmed ED visits and inpatient stays. Focus group transcripts were coded based on emergent themes. Clinicians and care managers were asked about messages they would like to receive from the app. Five main themes emerged in patient focus group discussions. First, participants expressed a high degree of willingness to use the proposed app during inpatient stays. Second, participants expressed varying degrees of willingness to use the app during ED visits, particularly for low acuity ED visits. Third, participants

  16. Airfoil shape optimization using non-traditional optimization technique and its validation

    Directory of Open Access Journals (Sweden)

    R. Mukesh

    2014-07-01

    Full Text Available Computational fluid dynamics (CFD) is one of the computer-based solution methods most widely employed in aerospace engineering. The computational power and time required to carry out the analysis increase as the fidelity of the analysis increases. Aerodynamic shape optimization has become a vital part of aircraft design in recent years. Generally, to optimize an airfoil we first have to describe it, and for that we need at least a hundred points of x and y coordinates. It is very difficult to optimize airfoils with this large number of coordinates. Nowadays many different parameterization schemes are used to describe a general airfoil, such as B-spline and PARSEC. The main goal of these parameterization schemes is to keep the number of needed parameters as small as possible while controlling the important aerodynamic features effectively. Here the work has been done on the PARSEC geometry representation method. The objective of this work is to introduce the description of a general airfoil using twelve parameters by representing its shape as a polynomial function. We have also introduced a genetic algorithm to optimize the aerodynamic characteristics of a general airfoil for specific conditions. A MATLAB program has been developed to implement PARSEC, the panel technique and the genetic algorithm. This program has been tested on a standard NACA 2411 airfoil and optimized to improve its coefficient of lift. Pressure distribution and coefficient of lift for airfoil geometries have been calculated using the panel method. The optimized airfoil has an improved coefficient of lift compared to the original one. The optimized airfoil is validated using wind tunnel data.
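The polynomial representation mentioned above can be sketched briefly. PARSEC describes each airfoil surface as y(x) = Σₙ aₙ·x^(n−1/2) for n = 1..6, where the six coefficients per surface are obtained by solving a small linear system built from the twelve geometric parameters (leading-edge radius, crest position, trailing-edge data, and so on). The coefficient values below are illustrative only, not a real airfoil.

```python
def parsec_surface(coeffs, x):
    """Evaluate y(x) on [0, 1] for one surface from its six PARSEC
    coefficients: y = sum over n of a_n * x ** (n - 1/2), n = 1..6."""
    return sum(a * x ** (n + 0.5) for n, a in enumerate(coeffs))

coeffs = [0.15, -0.10, 0.05, -0.02, 0.01, -0.005]   # illustrative values
ys = [parsec_surface(coeffs, x / 10.0) for x in range(11)]
print(ys[0])  # 0.0: every term vanishes at x = 0, so the surface passes
              # through the leading edge by construction
```

The half-integer exponents give the surface an infinite slope at x = 0, which is what lets the parameterization capture a rounded leading edge with so few parameters.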

  17. Switching coordination of distributed dc-dc converters for highly efficient photovoltaic power plants

    Science.gov (United States)

    Agamy, Mohammed; Elasser, Ahmed; Sabate, Juan Antonio; Galbraith, Anthony William; Harfman Todorovic, Maja

    2014-09-09

    A distributed photovoltaic (PV) power plant includes a plurality of distributed dc-dc converters. The dc-dc converters are configured to switch in coordination with one another such that at least one dc-dc converter transfers power to a common dc-bus based upon the total system power available from one or more corresponding strings of PV modules. Due to the coordinated switching of the dc-dc converters, each dc-dc converter transferring power to the common dc-bus continues to operate within its optimal efficiency range as well as to optimize the maximum power point tracking in order to increase the energy yield of the PV power plant.
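The coordination rule described above can be sketched simply: enable only as many converters as the available string power requires, so that each active unit stays inside its high-efficiency loading band. This is an illustrative sketch of the idea, not the patented control logic; the rating and efficiency band below are assumed values.

```python
import math

def converters_to_enable(p_avail_kw, p_rated_kw, eff_band=(0.3, 1.0)):
    """Return how many identical dc-dc converters should transfer power,
    given available PV power and a per-unit loading band for high efficiency."""
    if p_avail_kw <= 0:
        return 0
    # fewest units that can carry the power without exceeding rated load
    n = max(1, math.ceil(p_avail_kw / (p_rated_kw * eff_band[1])))
    # back off if per-unit loading would fall below the efficiency band
    while n > 1 and p_avail_kw / (n * p_rated_kw) < eff_band[0]:
        n -= 1
    return n

print(converters_to_enable(25.0, 10.0))  # 3 units, each loaded to ~83%
```

At low irradiance the rule parks all but one converter, keeping the remaining unit well loaded instead of spreading a trickle of power across many lightly loaded (inefficient) units.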

  18. SAE2.py: a python script to automate parameter studies using SCREAMER with application to magnetic switching on Z

    International Nuclear Information System (INIS)

    Orndorff-Plunkett, Franklin

    2011-01-01

    The SCREAMER simulation code is widely used at Sandia National Laboratories for designing and simulating pulsed power accelerator experiments on super power accelerators. A preliminary parameter study of Z with a magnetic switching retrofit illustrates the utility of the automating script for optimizing pulsed power designs. SCREAMER is a circuit-based code commonly used in pulsed-power design and requires numerous iterations to find optimal configurations. System optimization using simulations like SCREAMER is by nature inefficient and incomplete when done manually. This is especially the case when the system has many interactive elements whose emergent effects may be unforeseeable and complicated. For increased completeness, efficiency and robustness, investigators should probe a suitably confined parameter space using deterministic, genetic, cultural, or ant-colony algorithms or other computational intelligence methods. I have developed SAE2, a user-friendly, deterministic script that automates the search for optima of pulsed-power designs with SCREAMER. This manual demonstrates how to make input decks for SAE2 and optimize any pulsed-power design that can be modeled using SCREAMER. Application of SAE2 to magnetic switching on a model of a potential Z refurbishment illustrates the power of SAE2. With respect to the manual optimization, the automated optimization resulted in 5% greater peak current (10% greater energy) and a 25% increase in the safety factor for the most highly stressed element.
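The kind of deterministic sweep SAE2 automates can be sketched generically: generate one input deck per parameter combination from a template, run the simulator on each, and keep the configuration with the best figure of merit. `run_sim` below is a hypothetical stand-in for invoking SCREAMER and parsing its output, not SAE2's actual interface; the toy "simulator" just scores deck parameters analytically.

```python
import itertools

def sweep(template, grid, run_sim):
    """Exhaustive deterministic sweep: best (score, params) over a grid."""
    best = None
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid, combo))
        deck = template.format(**params)   # render one input deck
        score = run_sim(deck)              # e.g. peak load current
        if best is None or score > best[0]:
            best = (score, params)
    return best

def fake_run(deck):
    # toy stand-in: score peaks when the product of the two values hits 40
    l_nh, c_uf = map(float, deck.split())
    return -abs(l_nh * c_uf - 40.0)

grid = {"L_nH": [10, 20, 30], "C_uF": [1, 2]}
best_score, best_params = sweep("{L_nH} {C_uF}", grid, fake_run)
print(best_params)
```

In a real study the template would be a full SCREAMER deck with `{...}` placeholders for circuit elements, and `run_sim` would launch the code and extract the peak current from its output files.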

  19. Automated Detection of Sepsis Using Electronic Medical Record Data: A Systematic Review.

    Science.gov (United States)

    Despins, Laurel A

    Severe sepsis and septic shock are global issues with high mortality rates. Early recognition and intervention are essential to optimize patient outcomes. Automated detection using electronic medical record (EMR) data can assist this process. This review describes automated sepsis detection using EMR data. PubMed was searched for publications between January 1, 2005 and January 31, 2015. Thirteen studies met the study criteria: they described an automated detection approach with the potential to detect sepsis or sepsis-related deterioration in real or near-real time; focused on emergency department and hospitalized neonatal, pediatric, or adult patients; and provided performance measures or results indicating the impact of automated sepsis detection. Detection algorithms incorporated systemic inflammatory response and organ dysfunction criteria. Systems in nine studies generated study or care team alerts. Care team alerts did not consistently lead to earlier interventions. Earlier interventions did not consistently translate to improved patient outcomes. Performance measures were inconsistent. Automated sepsis detection is potentially a means to enable early sepsis-related therapy, but current performance variability highlights the need for further research.
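Detection algorithms of the kind reviewed typically screen EMR vitals against systemic inflammatory response syndrome (SIRS) criteria. A minimal rule-based sketch, using the standard adult thresholds (temperature, heart rate, respiratory rate, white blood cell count) with assumed field names; real systems add organ dysfunction criteria and much more careful data handling:

```python
def sirs_count(vitals):
    """Count how many of the four adult SIRS criteria a record meets."""
    criteria = [
        vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0,
        vitals["heart_rate"] > 90,
        vitals["resp_rate"] > 20,
        vitals["wbc_k_per_ul"] > 12.0 or vitals["wbc_k_per_ul"] < 4.0,
    ]
    return sum(criteria)

def sirs_alert(vitals):
    """Flag a record when two or more SIRS criteria are met."""
    return sirs_count(vitals) >= 2

record = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc_k_per_ul": 9.5}
print(sirs_alert(record))  # True: temperature and heart rate criteria met
```

The review's finding that alerts did not consistently improve outcomes is unsurprising for rules this simple: SIRS criteria are sensitive but not specific, so alert fatigue is a real risk.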

  20. Optimized mine ventilation on demand (OMVOD)

    International Nuclear Information System (INIS)

    Anderson, M.

    2009-01-01

    This paper provided an overview of the Optimized Mine Ventilation on Demand (OMVOD) system that is being installed at the Xstrata Nickel Rim South Project and at Vale Inco's Totten Mine in Sudbury. The OMVOD system is designed to dynamically monitor and control air quality and quantity in real time and to dilute and remove hazardous substances including diesel particulate matter (DPM), carbon monoxide (CO) and nitrogen dioxide (NO 2 ). It is also designed to control the thermal environment and provide ventilation for humans as well as for mobile equipment engine combustion according to regulatory standards. The paper highlighted the OMVOD system's optimization of energy, air quality measurement and control, and production management of the mines through real-time dynamic automation. Topics of discussion included real-time tracking and monitoring of diesel equipment; real-time tracking of underground miners; real-time evaluation of mine ventilation networks; and real-time control and optimization of ventilation equipment. ABB and Simsmart Technologies have joined forces to provide underground mining customers with a ventilation optimization solution. Simsmart's OMVOD provides proven real-time/dynamic automation technology to significantly reduce energy costs and provide health and safety benefits as well as major capital cost savings, while realizing an increase in production.

  1. Number crunchers : instrumentation sector treads path of selective automation

    International Nuclear Information System (INIS)

    Budd, G.

    2006-01-01

    Automation guided by adequate monitoring and control instrumentation is playing an increasingly important role in the oil and gas sector. This article presented an overview of new instruments in the Western Canada Sedimentary Basin (WCSB), where a niche market for instrumentation and automation has grown in tandem with increased drilling activity. Fluctuating oil and gas prices have also meant that available production methods must be optimized in order to ensure bottom-line profits. However, economies of production scale can exclude extensive monitoring techniques in coalbed methane (CBM) activities. Compressor stations are the site of most monitoring and control instrumentation in CBM activities. Compressor packages at the stations include an engine panel that monitors suction pressure and water temperature. Alarm points on all monitoring instrumentation can shut down operations or assist in slight adjustments to machinery to optimize production. In addition, acoustical flow meters are fitted to headers to identify drops in a station's overall volumetric flow of natural gas. Instrumentation at the stations monitors and controls boilers that heat glycol for the gas dehydration process through the use of a pneumatic control loop that communicates with the motor control centre. The system is capable of ensuring shutdowns in emergencies. The combination of automation and carbon dioxide (CO 2 ) flooding has dramatically improved production in the Weyburn oilfield in southeastern Saskatchewan, where data transfer is now completed over Ethernet to a SCADA system that communicates with 5 satellite sites and a central main plant. It was estimated that the life expectancy of the Weyburn oilfield has been extended by almost 25 years. It was concluded that, when harnessed to other technologies and combined with a user-friendly interface, automation can make a huge difference in the production profile of a field. 2 figs

  2. Mathematical model as means of optimization of the automation system of the process of incidents of information security management

    Directory of Open Access Journals (Sweden)

    Yulia G. Krasnozhon

    2018-03-01

    Full Text Available Modern information technologies have an increasing importance for the development dynamics and management structure of an enterprise. The efficiency of implementing modern information technologies is directly related to the quality of information security incident management. However, the impact of information security incident management on the quality and efficiency of the enterprise management system is not sufficiently covered in either Russian or foreign literature. The main approach to these problems is the optimization of the automation system for the process of information security incident management. Today, special attention is paid to IT technologies for dealing with information security incidents at mission-critical facilities in the Russian Federation, such as the Federal Tax Service of Russia (FTS). It is proposed to use the mathematical apparatus of queueing theory in order to build a mathematical model for optimizing the system. The developed model allows the quality of management to be estimated, taking into account the rules and restrictions imposed on the system by the effects of information security incidents. An example is given to demonstrate the system at work, and the statistical data obtained are shown. An implementation of the system discussed here will improve the quality of the Russian FTS services and speed up responses to information security incidents.
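The abstract does not state which queueing model is used; as a minimal illustration of the apparatus it invokes, an M/M/1 incident queue with arrival rate lam (incidents per hour) and handling rate mu gives the classic steady-state estimates one would use to size a response team:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 estimates for an incident-handling queue."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                # analyst utilization
    l_q = rho ** 2 / (1 - rho)    # mean number of incidents waiting
    w = 1.0 / (mu - lam)          # mean time an incident spends in the system
    return rho, l_q, w

rho, l_q, w = mm1_metrics(lam=4.0, mu=5.0)
print(rho, l_q, w)  # roughly 0.8 utilization, 3.2 waiting, 1.0 h in system
```

The nonlinearity is the practical point: at 80% utilization an incident already waits far longer than its raw handling time, which is the kind of restriction such a model lets one quantify.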

  3. RF Gun Optimization Study

    International Nuclear Information System (INIS)

    Alicia Hofler; Pavel Evtushenko

    2007-01-01

    Injector gun design is an iterative process where the designer optimizes a few nonlinearly interdependent beam parameters to achieve the required beam quality for a particle accelerator. Few tools exist to automate the optimization process and thoroughly explore the parameter space. The challenging beam requirements of new accelerator applications such as light sources and electron cooling devices drive the development of RF and SRF photoinjectors. A genetic algorithm (GA) has been successfully used to optimize DC photoinjector designs at Cornell University [1] and Jefferson Lab [2]. We propose to apply GA techniques to the design of RF and SRF gun injectors. In this paper, we report on the initial phase of the study, where we model and optimize a system that has been benchmarked with beam measurements and simulation.
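The GA-based optimization described above can be sketched as a minimal single-objective loop. This is a generic illustration, not the authors' code: real injector studies evaluate each candidate with a beam-dynamics simulation and optimize several parameters at once, and the quadratic objective below is a stand-in.

```python
import random

def genetic_minimize(objective, bounds, pop=20, gens=40, seed=1):
    """Minimal elitist GA: truncation selection, arithmetic crossover,
    single-gene Gaussian mutation clipped to the parameter bounds."""
    rng = random.Random(seed)
    rand_ind = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    population = [rand_ind() for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=objective)
        parents = population[: pop // 2]            # keep the best half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
            i = rng.randrange(len(child))                # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        population = parents + children
    return min(population, key=objective)

# Toy objective: distance to a known optimum (e.g. launch phase, field scale)
obj = lambda p: (p[0] - 30.0) ** 2 + (p[1] - 0.6) ** 2
best = genetic_minimize(obj, bounds=[(0.0, 90.0), (0.0, 1.0)])
print(best)  # converges near [30, 0.6]
```

Because the sorted parents survive each generation, the best candidate is never lost, which is what makes such loops robust on noisy, nonlinearly interdependent parameter spaces like injector settings.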

  4. Automation and efficiency in the operational processes: a case study in a logistics operator

    OpenAIRE

    Nascimento, Dener Gomes do; Silva, Giovanni Henrique da

    2017-01-01

    Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics is a major beneficiary of all this development, because it operates in an extremely competitive environment in which being efficient is a requirement for staying alive in the market. In this context, this article analyzes the processes in a distribution center in order to identify opportunities to automate operations to gai...

  5. Automated sensor networks to advance ocean science

    Science.gov (United States)

    Schofield, O.; Orcutt, J. A.; Arrott, M.; Vernon, F. L.; Peach, C. L.; Meisinger, M.; Krueger, I.; Kleinert, J.; Chao, Y.; Chien, S.; Thompson, D. R.; Chave, A. D.; Balasuriya, A.

    2010-12-01

    The National Science Foundation has funded the Ocean Observatories Initiative (OOI), which over the next five years will deploy infrastructure to expand scientists’ ability to remotely study the ocean. The deployed infrastructure will be linked by a robust cyberinfrastructure (CI) that will integrate marine observatories into a coherent system-of-systems. OOI is committed to engaging the ocean sciences community during the construction phase. For the CI, this is being enabled by a “spiral design strategy” allowing for input throughout the construction phase. In Fall 2009, the OOI CI development team used an existing ocean observing network in the Mid-Atlantic Bight (MAB) to test OOI CI software. The objective of this CI test was to aggregate data from ships, autonomous underwater vehicles (AUVs), shore-based radars, and satellites and make it available to five different data-assimilating ocean forecast models. Scientists used these multi-model forecasts to automate future glider missions in order to demonstrate the feasibility of two-way interactivity between the sensor web and predictive models. The CI software coordinated and prioritized the shared resources, allowing the semi-automated reconfiguration of asset-tasking and thus enabling autonomous execution of observation plans for the fixed and mobile observation platforms. Efforts were coordinated through a web portal that provided an access point for the observational data and model forecasts. Researchers could use the CI software in tandem with the web data portal to assess the performance of individual numerical model results, or multi-model ensembles, through real-time comparisons with satellite, shore-based radar, and in situ robotic measurements. The resulting sensor net will enable a new means to explore and study the world’s oceans by providing scientists a responsive network that can be accessed via any wireless network.

  6. Euler's fluid equations: Optimal control vs optimization

    Energy Technology Data Exchange (ETDEWEB)

    Holm, Darryl D., E-mail: d.holm@ic.ac.u [Department of Mathematics, Imperial College London, SW7 2AZ (United Kingdom)

    2009-11-23

    An optimization method used in image-processing (metamorphosis) is found to imply Euler's equations for incompressible flow of an inviscid fluid, without requiring that the Lagrangian particle labels exactly follow the flow lines of the Eulerian velocity vector field. Thus, an optimal control problem and an optimization problem for incompressible ideal fluid flow both yield the same Euler fluid equations, although their Lagrangian parcel dynamics are different. This is a result of the gauge freedom in the definition of the fluid pressure for an incompressible flow, in combination with the symmetry of fluid dynamics under relabeling of their Lagrangian coordinates. Similar ideas are also illustrated for SO(N) rigid body motion.

  7. Optimal control of raw timber production processes

    Science.gov (United States)

    Ivan Kolenka

    1978-01-01

    This paper demonstrates the possibility of optimal planning and control of timber harvesting activ-ities with mathematical optimization models. The separate phases of timber harvesting are represented by coordinated models which can be used to select the optimal decision for the execution of any given phase. The models form a system whose components are connected and...

  8. Coordinated Direct and Relay Transmission with Linear Non-Regenerative Relay Beamforming

    DEFF Research Database (Denmark)

    Sun, Fan; De Carvalho, Elisabeth; Popovski, Petar

    2012-01-01

    Joint processing of multiple communication flows in wireless systems has given rise to a number of novel transmission techniques, notably the two-way relaying, but also more general traffic scenarios, such as coordinated direct and relay (CDR) transmissions. In a CDR scheme the relay has a central role in managing the interference and boosting the overall system performance. In this letter we consider the case in which an amplify-and-forward relay has multiple antennas and can use beamforming to support the coordinated transmissions. We focus on one representative traffic type with one uplink user and one downlink user. Two different criteria for relay beamforming are analyzed: maximal weighted sum-rate and maximization of the worst-case weighted SNR. We propose iterative optimal solutions, as well as low-complexity near-optimal solutions.

  9. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  10. Incremental learning for automated knowledge capture

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Basilico, Justin Derrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davis, Warren Leon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dixon, Kevin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Brian S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Nathaniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wendt, Jeremy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, require models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.

  11. Cassini-Huygens maneuver automation for navigation

    Science.gov (United States)

    Goodson, Troy; Attiyah, Amy; Buffington, Brent; Hahn, Yungsun; Pojman, Joan; Stavert, Bob; Strange, Nathan; Stumpf, Paul; Wagner, Sean; Wolff, Peter; hide

    2006-01-01

    Many times during the Cassini-Huygens mission to Saturn, propulsive maneuvers must be spaced so closely together that there isn't enough time or workforce to execute the maneuver-related software manually, one subsystem at a time. Automation is required. Automating the maneuver design process has involved close cooperation between teams. We present the contribution from the Navigation system. In scope, this includes trajectory propagation and search, generation of ephemerides, general tasks such as email notification and file transfer, and presentation materials. The software has been used to help understand maneuver optimization results, Huygens probe delivery statistics, and Saturn ring-plane crossing geometry. The Maneuver Automation Software (MAS), developed for the Cassini-Huygens program, enables frequent maneuvers by handling mundane tasks such as creation of deliverable files, file delivery, generation and transmission of email announcements, and generation of presentation material and other supporting documentation. By hand, these tasks took up hours, if not days, of work for each maneuver. Automated, these tasks may be completed in under an hour. During the cruise trajectory the spacing of maneuvers was such that development of a maneuver design could span about a month, involving several other processes in addition to that described above. Often, about the last five days of this process covered the generation of a final design using an updated orbit-determination estimate. To support the tour trajectory, the orbit determination data cut-off of five days before the maneuver needed to be reduced to approximately one day, and the whole maneuver development process needed to be reduced to less than a week.

  12. Data-Driven Assistance Functions for Industrial Automation Systems

    International Nuclear Information System (INIS)

    Windmann, Stefan; Niggemann, Oliver

    2015-01-01

    The increasing amount of data in industrial automation systems overburdens the user in process control and diagnosis tasks. One possibility to cope with these challenges consists of using smart assistance systems that automatically monitor and optimize processes. This article deals with aspects of data-driven assistance systems such as assistance functions, process models and data acquisition. The paper describes novel approaches for self-diagnosis and self-optimization, and shows how these assistance functions can be integrated in different industrial environments. The considered assistance functions are based on process models that are automatically learned from process data. Fault detection and isolation is based on the comparison of observations of the real system with predictions obtained by application of the process models. The process models are further employed for energy efficiency optimization of industrial processes. Experimental results are presented for fault detection and energy efficiency optimization of a drive system. (paper)
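
    The fault-detection principle described above (comparing observations of the real system with predictions from a learned process model) can be sketched minimally. The signal, the model output, and the threshold below are invented for illustration.

```python
# Sketch of residual-based fault detection: a learned process model
# predicts each sample of a signal, and deviations beyond a threshold
# are flagged as faults. Model, data and threshold are illustrative.

def detect_faults(observed, predicted, threshold):
    """Return indices where |observation - prediction| exceeds threshold."""
    return [i for i, (o, p) in enumerate(zip(observed, predicted))
            if abs(o - p) > threshold]

# Example: a drive current the model expects to stay near 2.0 A
predicted = [2.0] * 8
observed = [2.01, 1.98, 2.02, 3.50, 2.00, 1.97, 3.80, 2.03]
faults = detect_faults(observed, predicted, threshold=0.5)
print(faults)  # → [3, 6]
```

In practice the prediction would come from the automatically learned process model, and the threshold would be set from the residual statistics of fault-free operation.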

  13. A Novel adaptative Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    loubna benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users draw on expert knowledge and must try many combinations by hand to find the best one. As performance, time, and accuracy are important, it is necessary to automate parameter optimization, at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the setting of algorithms and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the parameter optimization procedure. Experiments on real examples and comparisons with other metaheuristic approaches: particle swarm optimization (PSO), reinforcement learning (RL), and ant colony optimization (ACO), show the efficiency of this novel method.

  14. An Automated Analysis-Synthesis Package for Design Optimization ...

    African Journals Online (AJOL)

    90 standards is developed for the design optimization of framed structures - continuous beams, plane and space trusses and rigid frames, grids and composite truss-rigid frames. The package will enable the structural engineer to effectively and ...

  15. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States.

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-02-01

    Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. A set of command line-based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion.

  16. Pilot/Controller Coordinated Decision Making in the Next Generation Air Transportation System

    Science.gov (United States)

    Bearman, Chris; Miller, Ronald c.; Orasanu, Judith M.

    2011-01-01

    Introduction: NextGen technologies promise to provide considerable benefits in terms of enhancing operations and improving safety. However, there needs to be a thorough human factors evaluation of the way these systems will change how pilots and controllers share information. The likely impact of these new technologies on pilot/controller coordinated decision making is considered in this paper using the "operational, informational and evaluative disconnect" framework. Method: Five participant focus groups were held. Participants were four experts in human factors, between x and x research students, and a technical expert. The focus groups evaluated five key NextGen technologies to identify issues that made different disconnects more or less likely. Results: The issues identified were: decision making will not necessarily improve simply because pilots and controllers possess the same information; having a common information source does not mean pilots and controllers are looking at the same information; high levels of automation may lead to disconnects between the technology and pilots/controllers; common information sources may become the definitive source for information; and overconfidence in the automation may lead to situations where appropriate breakdowns are not initiated. Discussion: The issues identified lead to recommendations that need to be considered in the development of NextGen technologies. The current state of development of these technologies provides a good opportunity to apply these recommendations at an early stage so that NextGen technologies do not lead to difficulties in resolving breakdowns in coordinated decision making.

  17. Bilateral Coordination Strategy of Supply Chain with Bidirectional Option Contracts under Inflation

    Directory of Open Access Journals (Sweden)

    Nana Wan

    2015-01-01

    Full Text Available To address the price increases and demand contraction caused by inflation, we establish a Stackelberg game model that incorporates bidirectional option contracts and the effect of inflation, and we derive the optimal ordering and production policies for a one-period two-stage supply chain composed of one supplier and one retailer. Using the wholesale price contract model as the benchmark, we find that the introduction of bidirectional option contracts can benefit both the supplier and the retailer under inflation scenarios. Based on these conclusions, we design a bilateral coordination mechanism from the different perspectives of the two members involved and discuss how bidirectional option contracts should be set to achieve channel coordination under inflation scenarios. Through sensitivity analysis, we illustrate the effect of inflation on the optimal decision variables and the optimal expected profits of the two parties with bidirectional option contracts.

  18. Development of the automated circulating tumor cell recovery system with microcavity array.

    Science.gov (United States)

    Negishi, Ryo; Hosokawa, Masahito; Nakamura, Seita; Kanbara, Hisashige; Kanetomo, Masafumi; Kikuhara, Yoshihito; Tanaka, Tsuyoshi; Matsunaga, Tadashi; Yoshino, Tomoko

    2015-05-15

    Circulating tumor cells (CTCs) are well recognized as a useful biomarker for cancer diagnosis and a potential target of drug discovery for metastatic cancer. Efficient and precise recovery of extremely low concentrations of CTCs from blood is required to increase detection sensitivity. Here, an automated system equipped with a microcavity array (MCA) was demonstrated for highly efficient and reproducible CTC recovery. The use of the MCA allows selective recovery of cancer cells from whole blood on the basis of differences in size between tumor and blood cells. Intra- and inter-assays revealed that the automated system achieved efficiency and reproducibility equal to the assay manually performed by a well-trained operator. Under the optimized assay workflow, the automated system allows efficient and precise recovery of non-small cell lung cancer cells spiked into whole blood. The automated CTC recovery system will contribute to high-throughput analysis in further clinical studies on large cohorts of cancer patients. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Optimal Coordinated EV Charging with Reactive Power Support in Constrained Distribution Grids

    Energy Technology Data Exchange (ETDEWEB)

    Paudyal, Sumit; Ceylan, Oğuzhan; Bhattarai, Bishnu P.; Myers, Kurt S.

    2017-07-01

    Electric vehicle (EV) charging/discharging can take place in any P-Q quadrant, which means EVs could supply reactive power to the grid while charging the battery. In controlled charging schemes, the distribution system operator (DSO) coordinates the charging of EV fleets to ensure the grid’s operating constraints are not violated. In effect, the DSO sets upper bounds on power limits for EV charging. In this work, we demonstrate that if EVs inject reactive power into the grid while charging, the DSO can issue higher upper bounds on the active power limits for the EVs under the same set of grid constraints. We demonstrate the concept on a 33-node test feeder with 1,500 EVs. Case studies show that in constrained distribution grids with coordinated charging, the average cost of EV charging can be reduced if charging takes place in the fourth P-Q quadrant compared to charging at unity power factor.
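
    The intuition behind fourth-quadrant charging can be sketched with a linearized (LinDistFlow-style) voltage-drop approximation. The impedance values and limits below are assumed for illustration and are not taken from the paper's 33-node study.

```python
# Sketch of why fourth-quadrant (P drawn, Q injected) EV charging relaxes
# grid constraints. In a radial feeder the voltage drop is roughly
# proportional to R*P + X*Q; injecting reactive power (Q < 0) offsets
# part of the drop caused by active charging power, so the same voltage
# limit admits a larger P. All values are illustrative (per unit).

def voltage_drop(p, q, r=0.04, x=0.02):
    """Approximate per-unit feeder voltage drop for a load drawing P, Q."""
    return r * p + x * q

def max_charging_power(q, drop_limit=0.05, r=0.04, x=0.02):
    """Largest active power P that keeps the voltage drop within limit."""
    return (drop_limit - x * q) / r

p_unity = max_charging_power(0.0)    # unity power factor charging
p_q4 = max_charging_power(-0.5)      # injecting 0.5 p.u. reactive power
print(p_unity, p_q4)  # the DSO can issue a higher active power bound
```

This is only the voltage-constraint mechanism; the actual coordination problem in the paper optimizes charging cost over all feeder constraints simultaneously.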

  20. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    Liu Malin; Shao Youlin; Liu Bing

    2013-01-01

    With the trend toward large-scale production of HTR coated fuel particles, the original manual control system can no longer meet requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive four-layer coating process for TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states. The establishment of a DCS-type (distributed control system) automation system was proposed. According to the rigorous requirements of the preparation process for coated particles, the design considerations of the DCS were proposed, including the principles of coordinated control, safety and reliability, integration specification, practicality and ease of use, and openness and ease of updating. A complete automation control system for the coated fuel particle preparation process was manufactured in accordance with these principles. The automated control system was put into operation in the production of irradiated samples for the HTRPM demonstration project. The experimental results prove that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)

  1. A Casting Yield Optimization Case Study: Forging Ram

    DEFF Research Database (Denmark)

    Kotas, Petr; Tutum, Cem Celal; Hattel, Jesper Henri

    2010-01-01

    This work summarizes the findings of multi-objective optimization of a gravity sand-cast steel part, for which an increase of the casting yield via riser optimization was considered. This was accomplished by coupling a casting simulation software package with an optimization module. The benefits of this approach, recently adopted in the foundry industry worldwide and based on fully automated computer optimization, were demonstrated. First, analyses of filling and solidification of the original casting design were conducted in the standard simulation environment to determine potential flaws and inadequacies...

  2. Coordination of a Random Yield Supply Chain with a Loss-Averse Supplier

    Directory of Open Access Journals (Sweden)

    Jiarong Luo

    2015-01-01

    Full Text Available This paper investigates the coordination of a supply chain consisting of a loss-averse supplier and a risk-neutral buyer, who orders products from the supplier, whose production suffers from random yield, to meet a deterministic demand. We derive the risk-neutral buyer’s optimal order policy and the loss-averse supplier’s optimal production policy under shortage-penalty-surplus-subsidy (SPSS) contracts. We also analyze the impact of loss aversion on the loss-averse supplier’s production decisions and find that the loss-averse supplier may produce less than, equal to, or more than the risk-neutral supplier. Then, we provide explicit conditions under which the random yield supply chain with a loss-averse supplier can be coordinated under SPSS contracts. Finally, using numerical examples, we find that when the shortage penalty is low, the buyer’s optimal order quantity will increase, while the supplier’s optimal production quantity will first decrease and then increase as the loss aversion level increases. When the shortage penalty is high, the buyer’s optimal order quantity will decrease but the supplier’s optimal production quantity will always increase as the loss aversion level increases. Furthermore, the numerical examples provide strong evidence that SPSS contracts can effectively improve the performance of the whole supply chain.
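
    The effect of loss aversion on a production decision under random yield can be illustrated with a small Monte Carlo sketch. The yield distribution, prices, and the piecewise-linear loss-averse utility below are assumptions for illustration only, not the paper's SPSS-contract model.

```python
import random

# Illustrative Monte Carlo comparison: a risk-neutral vs. a loss-averse
# supplier choosing a production quantity under random yield. All
# parameters are invented; losses are weighed more heavily than gains
# via a piecewise-linear utility.

def expected_utility(quantity, loss_aversion, price=10.0, cost=7.0,
                     demand=100.0, trials=2000, seed=7):
    # the same seed for every candidate quantity gives common random
    # numbers, so candidates are compared on identical yield draws
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        yield_rate = rng.uniform(0.5, 1.0)         # random yield
        delivered = min(quantity * yield_rate, demand)
        profit = price * delivered - cost * quantity
        total += profit if profit >= 0 else loss_aversion * profit
    return total / trials

def best_quantity(loss_aversion, grid=range(60, 201, 5)):
    return max(grid, key=lambda q: expected_utility(q, loss_aversion))

q_neutral = best_quantity(1.0)   # risk-neutral supplier
q_averse = best_quantity(3.0)    # strongly loss-averse supplier
print(q_neutral, q_averse)       # here the loss-averse supplier produces less
```

Under these particular parameters the loss-averse supplier retreats to a much smaller quantity, matching one of the three cases ("may produce less than") identified in the abstract; with a shortage penalty in the contract the ordering can reverse.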

  3. Supply Chain Coordination with Carbon Trading Price and Consumers’ Environmental Awareness Dependent Demand

    Directory of Open Access Journals (Sweden)

    Qinghua Pang

    2018-01-01

    Full Text Available Carbon emissions reduction in the supply chain is an effective method to reduce the greenhouse effect. This paper investigates the impacts of the carbon trading price and consumers’ environmental awareness on carbon emissions in the supply chain under the cap-and-trade system. Firstly, it analyzes the centralized decision structure and obtains the requirements to coordinate carbon emissions reduction and order quantity in the supply chain. Secondly, it proposes a supply chain coordination mechanism with a revenue-sharing contract based on a quantity discount policy, and the requirements that the contract parameters need to satisfy are also given. Thirdly, assuming the market demand is affected by consumers’ environmental awareness in additive form, the paper proposes methods to determine the optimal order quantity and the optimal level of carbon emissions through model optimization. Finally, it investigates the impacts of the carbon trading price on carbon emissions in the supply chain. The results show that the clean manufacturer’s optimal per-unit carbon emissions increase as the carbon trading price increases, while the non-green manufacturer’s optimal per-unit carbon emissions decrease as the carbon trading price increases. For the middle-emissions manufacturer, the optimal per-unit carbon emissions depend on the relationship between the carbon trading price and the carbon reduction coefficient.

  4. Automated Surveillance of Fruit Flies

    Science.gov (United States)

    Potamitis, Ilyas; Rigakis, Iraklis; Tatlas, Nicolaos-Alexandros

    2017-01-01

    Insects of the Diptera order of the Tephritidae family cause costly, annual crop losses worldwide. Monitoring traps are important components of integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap in order to detect, time-stamp, GPS tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualization of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at product level of maturity as it has solved many pending issues presented in a previously reported study. PMID:28075346

  5. Automated Surveillance of Fruit Flies

    Directory of Open Access Journals (Sweden)

    Ilyas Potamitis

    2017-01-01

    Full Text Available Insects of the Diptera order of the Tephritidae family cause costly, annual crop losses worldwide. Monitoring traps are important components of integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap in order to detect, time-stamp, GPS tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualization of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at product level of maturity as it has solved many pending issues presented in a previously reported study.

  6. Coordinated Control of Cross-Flow Turbines

    Science.gov (United States)

    Strom, Benjamin; Brunton, Steven; Polagye, Brian

    2016-11-01

    Cross-flow turbines, also known as vertical-axis turbines, have several advantages over axial-flow turbines for a number of applications including urban wind power, high-density arrays, and marine or fluvial currents. By controlling the angular velocity applied to the turbine as a function of angular blade position, we have demonstrated a 79 percent increase in cross-flow turbine efficiency over constant-velocity control. This strategy uses the downhill simplex method to optimize control parameter profiles during operation of a model turbine in a recirculating water flume. This optimization method is extended to a set of two turbines, where the blade motions and position of the downstream turbine are optimized to beneficially interact with the coherent structures in the wake of the upstream turbine. This control scheme has the potential to enable high-density arrays of cross-flow turbines to operate at cost-effective efficiency. Turbine wake and force measurements are analyzed for insight into the effect of a coordinated control strategy.
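
    The downhill simplex method named above can be sketched as follows. The two "control profile" parameters and the quadratic stand-in objective are illustrative; in the experiments described, each function evaluation would instead be a measured turbine efficiency in the flume.

```python
# Minimal downhill simplex (Nelder-Mead) optimizer applied to a stand-in
# objective. A quadratic with a known optimum substitutes for negative
# turbine efficiency as a function of two control-profile parameters.

def nelder_mead(f, x0, step=0.5, iters=200):
    n = len(x0)
    # initial simplex: x0 plus one vertex perturbed along each dimension
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [2 * centroid[j] - worst[j] for j in range(n)]      # reflect
        if f(refl) < f(best):
            expd = [3 * centroid[j] - 2 * worst[j] for j in range(n)]  # expand
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(worst):
            simplex[-1] = refl
        else:
            contr = [(centroid[j] + worst[j]) / 2 for j in range(n)]  # contract
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:
                # shrink the whole simplex toward the best vertex
                simplex = [[(p[j] + best[j]) / 2 for j in range(n)]
                           for p in simplex]
    return min(simplex, key=f)

# Stand-in for negative efficiency vs. two control-profile parameters
def neg_efficiency(x):
    return (x[0] - 1.2) ** 2 + 2.0 * (x[1] - 0.8) ** 2

best = nelder_mead(neg_efficiency, [0.0, 0.0])
```

The method needs no gradients, only function values, which is why it suits noisy experimental objectives such as efficiency measured during turbine operation.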

  7. Rapid and convenient semi-automated microwave-assisted solid-phase synthesis of arylopeptoids

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Ewald; Boccia, Marcello Massimo; Nielsen, John

    2014-01-01

    A facile and expedient route to the synthesis of arylopeptoid oligomers (N-alkylated aminomethyl benz-amides) using semi-automated microwave-assisted solid-phase synthesis is presented. The synthesis was optimized for the incorporation of side chains derived from sterically hindered or unreactive...

  8. ToTem: a tool for variant calling pipeline optimization.

    Science.gov (United States)

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is freely available as a web application at https://totem.software.
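
    The scoring that drives such a benchmark can be sketched as follows. Variant call sets are simplified here to (chromosome, position) tuples and the fold structure is invented; ToTem's actual internals are not described in the abstract beyond precision/recall/F-measure with cross-validation.

```python
# Sketch of the benchmarking core: a candidate pipeline's variant calls
# are scored by precision, recall and F-measure against a truth set, and
# the F-measure is averaged over cross-validation folds to discourage
# over-fitting of pipeline parameters. The real tool compares VCF output
# against validated truth variants.

def precision_recall_f1(called, truth):
    tp = len(called & truth)                       # true positives
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def cross_validated_f1(folds):
    """Mean F-measure over (called, truth) pairs, one per fold."""
    return sum(precision_recall_f1(c, t)[2] for c, t in folds) / len(folds)

truth = {("chr1", 100), ("chr1", 250), ("chr2", 40)}
called = {("chr1", 100), ("chr2", 40), ("chr2", 99)}  # one FP, one FN
print(precision_recall_f1(called, truth))
```

A parameter sweep then simply ranks candidate pipeline settings by `cross_validated_f1`, or by precision or recall alone, depending on the user's priorities.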

  9. Simulation of a Production Line with Automated Guided Vehicle: A Case Study

    Directory of Open Access Journals (Sweden)

    Luiz Felipe Verpa Leite

    2015-06-01

    Full Text Available Currently, companies increasingly need to improve their processes and make production more flexible in order to reduce waiting times and increase productivity through shorter time intervals. To achieve these objectives, efficient and automated material transport and handling systems are required. Therefore, AGV (Automated Guided Vehicle) systems are often used to optimize the flow of materials within production systems. In this paper, the author evaluates the usage of an AGV system in an industrial environment and analyzes the advantages and disadvantages of the project. Furthermore, the author uses the simulation software Promodel® 7.0 to develop a model, based on data collected from the real production system, in order to analyze and optimize the use of AGVs. Throughout this paper, problems are identified, as well as the solutions adopted by the author and the results obtained from the simulations.

  10. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time home automation research project, designed and implemented in both software and hardware, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  11. Automated Design and Optimization of Pebble-bed Reactor Cores

    International Nuclear Information System (INIS)

    Gougar, Hans D.; Ougouag, Abderrafi M.; Terry, William K.

    2010-01-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
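
    The abstract describes a stochastic search that drives a small set of core parameters toward a user-defined fitness. A generic genetic algorithm of that flavor (an illustrative sketch, not PEBBED; the two "traits" and their preferred values are invented) can be written compactly:

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=60, seed=1):
    """Generic genetic algorithm: elitist selection of the top half,
    midpoint crossover, and Gaussian mutation of one coordinate."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]           # survivors
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            i = rng.randrange(dim)                        # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.05 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy "core design" fitness: two traits with preferred values, combined
# into one score (a stand-in for PEBBED's user-specified trait functions).
def fitness(params):
    enrichment, pebble_flow = params
    return -((enrichment - 8.0) ** 2 + (pebble_flow - 2.5) ** 2)

best = evolve(fitness, bounds=[(4.0, 12.0), (1.0, 5.0)])
```

In PEBBED the fitness evaluation is of course a full asymptotic-burnup neutronics solve rather than a closed-form expression, but the selection loop has the same shape.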

  12. Price schedules coordination for electricity pool markets

    Science.gov (United States)

    Legbedji, Alexis Motto

    2002-04-01

    We consider the optimal coordination of a class of mathematical programs with equilibrium constraints, formally interpreted as a resource-allocation problem. Many decomposition techniques have been proposed to circumvent the difficulty of solving large systems with limited computer resources. The considerable improvement in computer architecture has allowed the solution of large-scale problems with increasing speed, and interest in decomposition techniques has consequently waned. Nonetheless, there is an important class of applications for which decomposition techniques remain relevant, among them distributed systems (the Internet, perhaps, being the most conspicuous example) and competitive economic systems. Conceptually, a competitive economic system is a collection of agents that have similar or different objectives while sharing the same system resources. In theory, such a system of agents can be optimized by constructing a large-scale mathematical program and solving it centrally using currently available computing power. In practice, however, because agents are self-interested and unwilling to reveal sensitive corporate data, one cannot solve these kinds of coordination problems by simply maximizing the sum of the agents' objective functions subject to their constraints. An iterative price decomposition, or Lagrangian dual, method is considered best suited because it can operate with limited information. A price-directed strategy, however, can only work successfully when coordinating or equilibrium prices exist, which is not generally the case when a weak duality is unavoidable. Showing when such prices exist and how to compute them is the main subject of this thesis. Among our results, we show that, if the Lagrangian function of a primal program is additively separable, price schedules coordination may be attained. The prices are Lagrange multipliers, and are also the decision variables of a dual program. In addition, we propose a new form of
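
    The price-directed coordination the thesis studies can be made concrete with a toy separable problem. The sketch below is the standard Lagrangian dual/subgradient scheme, not the thesis's own algorithm, and the quadratic agent utilities are invented: a coordinator announces a price for a shared resource, each agent responds using only its private data, and the price is updated from aggregate demand alone.

```python
def best_response(a, price):
    """Agent maximizes a*x - x**2/2 - price*x privately; the coordinator
    never sees `a`, only the demand the agent reports back."""
    return max(0.0, a - price)

def coordinate(a_values, capacity, steps=500, step_size=0.05):
    """Subgradient ascent on the dual: raise the price while total demand
    exceeds capacity, lower it (never below 0) otherwise."""
    price = 0.0
    for _ in range(steps):
        demand = sum(best_response(a, price) for a in a_values)
        price = max(0.0, price + step_size * (demand - capacity))
    return price, [best_response(a, price) for a in a_values]

# Two agents sharing 3 units of capacity:
price, allocation = coordinate([4.0, 2.0], capacity=3.0)
```

At the equilibrium price of 1.5, demand exactly matches the capacity of 3 without any agent revealing its objective to the coordinator, which is precisely why price-directed methods suit competitive settings; the thesis's concern is that such coordinating prices need not exist when strong duality fails.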

  13. AUTOMATION OF CONVEYOR BELT TRANSPORT

    Directory of Open Access Journals (Sweden)

    Nenad Marinović

    1990-12-01

    Full Text Available Belt conveyor transport, although one of the most economical mining transport systems, introduces many problems in maintaining continuity of operation. Every stoppage causes economic losses. Optimal operation requires correct belt tension, correct belt position and velocity, and faultless idler rolls; together these are the input conditions for automation. Detection and localization of faults are essential for safety, to eliminate fire hazards, and for efficient maintenance. The detection and location of idler roll faults remains an open problem that has not yet been solved successfully (the paper is published in Croatian).

  14. Automated treatment planning engine for prostate seed implant brachytherapy

    International Nuclear Information System (INIS)

    Yu Yan; Zhang, J.B.Y.; Brasacchio, Ralph A.; Okunieff, Paul G.; Rubens, Deborah J.; Strang, John G.; Soni, Arvind; Messing, Edward M.

    1999-01-01

    Purpose: To develop a computer-intelligent planning engine for automated treatment planning and optimization of ultrasound- and template-guided prostate seed implants. Methods and Materials: The genetic algorithm was modified to reflect the 2D nature of the implantation template. A multi-objective decision scheme was used to rank competing solutions, taking into account dose uniformity and conformity to the planning target volume (PTV), dose-sparing of the urethra and the rectum, and the sensitivity of the resulting dosimetry to seed misplacement. Optimized treatment plans were evaluated using selected dosimetric quantifiers, dose-volume histogram (DVH), and sensitivity analysis based on simulated seed placement errors. These dosimetric planning components were integrated into the Prostate Implant Planning Engine for Radiotherapy (PIPER). Results: PIPER has been used to produce a variety of plans for prostate seed implants. In general, maximization of the minimum peripheral dose (mPD) for given implanted total source strength tended to produce peripherally weighted seed patterns. Minimization of the urethral dose further reduced the loading in the central region of the PTV. Isodose conformity to the PTV was achieved when the set of objectives did not reflect seed positioning uncertainties; the corresponding optimal plan generally required fewer seeds and higher source strength per seed compared to the manual planning experience. When seed placement uncertainties were introduced into the set of treatment planning objectives, the optimal plan tended to reach a compromise between the preplanned outcome and the likelihood of retaining the preferred outcome after implantation. The reduction in the volatility of such seed configurations optimized under uncertainty was verified by sensitivity studies. Conclusion: An automated treatment planning engine incorporating real-time sensitivity analysis was found to be a useful tool in dosimetric planning for prostate

  15. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects pushed forward the development of automated crystallization platforms that are now commonly used. This created an urgent need for adapted and automated equipment for crystal analysis. However, first these crystals have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact on the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. In this article, a new method to accelerate this process, by recording accurately the local geometry coordinates for each crystal in the crystallization plate, is presented. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for an automated crystal centering in the beam, in situ screening or data collection. Here the preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)
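
    The "local geometry coordinates" recorded for each crystal only become useful once the plate is registered in the beamline frame. One simple way to do this (a hedged sketch of generic fiducial-based registration, not necessarily the authors' method) is to fit a 2D affine transform from a few reference points by least squares:

```python
import numpy as np

def fit_affine(plate_pts, stage_pts):
    """Least-squares 2D affine transform mapping plate-local coordinates
    to stage/robot coordinates, fitted from >= 3 fiducial points."""
    A = np.hstack([plate_pts, np.ones((len(plate_pts), 1))])  # rows [x y 1]
    coeffs, *_ = np.linalg.lstsq(A, stage_pts, rcond=None)    # 3x2 matrix
    return coeffs

def to_stage(coeffs, plate_xy):
    """Map one recorded crystal position into the stage frame."""
    x, y = plate_xy
    return np.array([x, y, 1.0]) @ coeffs

# Three fiducials: plate frame shifted by (10, -5) mm and scaled by 2.
plate = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
stage = plate * 2.0 + np.array([10.0, -5.0])
T = fit_affine(plate, stage)
crystal_stage = to_stage(T, (0.5, 0.5))  # where to drive the six-axis robot
```

With more than three fiducials the same least-squares fit averages out small measurement errors, which is why registration points are usually over-determined in practice.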

  16. The PWR loading pattern optimization in X-IMAGE

    International Nuclear Information System (INIS)

    Stevens, J.G.; Smith, K.S.; Rempe, K.R.; Downar, T.J.

    1993-01-01

    The design of reactor core loading patterns is difficult because of the staggering number of possible patterns. The integer nature and nonlinear neutronic response of the core design problem preclude simple prescriptions for generating feasible patterns, much less for optimizing among feasible candidates. Fortunately, recent developments in optimization, graphical user interfaces (GUIs), and the speed and low cost of engineering workstations combine to make loading pattern automation possible. The optimization module SIMAN has been added to X-IMAGE to automatically generate high-quality core loadings.

  17. PR-PR: cross-platform laboratory automation system.

    Science.gov (United States)

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  18. Using an integrated automated system to optimize retention and increase frequency of blood donations.

    Science.gov (United States)

    Whitney, J Garrett; Hall, Robert F

    2010-07-01

    This study examines the impact of an integrated, automated phone system designed to reinforce retention and increase the frequency of donations among blood donors. Built on data accumulated over the past 7 years, the system uses computerized phone messaging to contact blood donors with individualized, multilevel notifications. Donors are contacted at planned intervals to acknowledge and recognize their donations, informed where their blood was sent, asked to participate in a survey, and reminded when they are eligible to donate again. The report statistically evaluates the impact of the various components of the system on donor retention and blood donations and quantifies the fiscal advantages to blood centers. By using the information and support systems provided by the automated services, and then involving the phlebotomists and recruiters in reinforcing donor retention, both retention and donations can be increased. © 2010 American Association of Blood Banks.

  19. Operational optimization in the downstream; Otimizacao operacional no downstream

    Energy Technology Data Exchange (ETDEWEB)

    Silberman, Luis; Cunha, Filipe Silveira Ramos da [Petroleo Ipiranga, Porto Alegre, RS (Brazil)

    2004-07-01

    In the present competitive downstream market, there is a great need for optimization to guarantee the best price and quality for our clients. Our goal is to meet these expectations while guaranteeing efficient operations. The central question is how far we are from the ideal model. Accordingly, many projects have been executed in recent years aimed at the operational optimization of all our activities. We divide the projects into four areas: Logistics (new distribution modes), Transport (transport optimization: quality and more deliveries with fewer trucks), Client Support (Internet Ipiranga and the Support Center), and Distribution Terminal Productivity (automation and environment). This work presents our ideal, complete Downstream Operation model, discusses how close we are to it, and presents the projects already developed and implemented in terminal automation and in the logistics area. (author)

  20. The Dynamic Coordinated Development of a Regional Environment-Tourism-Economy System: A Case Study from Western Hunan Province, China

    Directory of Open Access Journals (Sweden)

    Yaoqing Yuan

    2014-08-01

    Full Text Available Based on regional coordination theory and system theory, the authors constructed an evaluation index system for the coordinated development of a regional environment-tourism-economy system with a pressure-state-response (PSR) model. With a coordinated development model, the study then empirically analyzed the coordinated development state of the environment-tourism-economy system in western Hunan from 2001 to 2012. The results showed that, although this system failed to achieve a high benefit index, inter-subsystem coupling extent, or coordinated development index, all three indices presented an increasing overall trend. This outcome suggested that the sub-systems were developing towards their optimal proportions, but that the development of the three sub-systems (environmental, tourism, and economic) remained unbalanced in western Hunan. The environment there is developing only slowly despite a favorable ecological foundation, while economic development, which has long lagged, acts as the main factor restricting the coordinated development of the regional environment-tourism-economy system. To promote coordinated development in western Hunan, the following recommendations were proposed: strengthen prediction of, and warnings about, the evolution of the whole system; optimize the industry's structure; and reinforce environmental management.

  1. System floorplanning optimization

    KAUST Repository

    Browning, David W.

    2012-12-01

    Notebook and Laptop Original Equipment Manufacturers (OEMs) place great emphasis on creating unique system designs to differentiate themselves in the mobile market. These systems are developed from the 'outside in' with the focus on how the system is perceived by the end-user. As a consequence, very little consideration is given to the interconnections or power of the devices within the system, with a mentality of 'just make it fit'. In this paper we discuss the challenges of Notebook system design and the steps by which system floor-planning tools and algorithms can be used to provide an automated method to optimize this process to ensure all required components most optimally fit inside the Notebook system. © 2012 IEEE.

  2. System floorplanning optimization

    KAUST Repository

    Browning, David W.

    2013-01-10

    Notebook and Laptop Original Equipment Manufacturers (OEMs) place great emphasis on creating unique system designs to differentiate themselves in the mobile market. These systems are developed from the 'outside in' with the focus on how the system is perceived by the end-user. As a consequence, very little consideration is given to the interconnections or power of the devices within the system, with a mentality of 'just make it fit'. In this paper we discuss the challenges of Notebook system design and the steps by which system floor-planning tools and algorithms can be used to provide an automated method to optimize this process to ensure all required components most optimally fit inside the Notebook system.

  3. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  4. Learning optimal embedded cascades.

    Science.gov (United States)

    Saberian, Mohammad Javad; Vasconcelos, Nuno

    2012-10-01

    The problem of automatic and optimal design of embedded object detector cascades is considered. Two main challenges are identified: optimization of the cascade configuration and optimization of individual cascade stages, so as to achieve the best tradeoff between classification accuracy and speed, under a detection rate constraint. Two novel boosting algorithms are proposed to address these problems. The first, RCBoost, formulates boosting as a constrained optimization problem which is solved with a barrier penalty method. The constraint is the target detection rate, which is met at all iterations of the boosting process. This enables the design of embedded cascades of known configuration without extensive cross validation or heuristics. The second, ECBoost, searches over cascade configurations to achieve the optimal tradeoff between classification risk and speed. The two algorithms are combined into an overall boosting procedure, RCECBoost, which optimizes both the cascade configuration and its stages under a detection rate constraint, in a fully automated manner. Extensive experiments in face, car, pedestrian, and panda detection show that the resulting detectors achieve an accuracy versus speed tradeoff superior to those of previous methods.

  5. Searching for globally optimal functional forms for interatomic potentials using genetic programming with parallel tempering.

    Science.gov (United States)

    Slepoy, A; Peters, M D; Thompson, A P

    2007-11-30

    Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
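
    The inner loop of such a search, parameter fitting by Metropolis sampling, fits in a few lines. The sketch below is illustrative only: it fixes the Lennard-Jones functional form and recovers its parameters from synthetic energies, rather than searching over functional forms by genetic programming as the paper does.

```python
import math, random

def lj(r, eps, sigma):
    """Lennard-Jones pair energy."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

# Reference energies generated from the "true" potential (eps=1, sigma=1):
distances = [0.95 + 0.05 * i for i in range(12)]
reference = [lj(r, 1.0, 1.0) for r in distances]

def cost(params):
    eps, sigma = params
    return sum((lj(r, eps, sigma) - e) ** 2 for r, e in zip(distances, reference))

def metropolis_fit(steps=20000, temp=1e-4, seed=7):
    """Metropolis importance sampling over (eps, sigma); at low temperature
    this behaves like a stochastic minimizer of the fitting cost."""
    rng = random.Random(seed)
    current = [0.5, 1.3]
    best = list(current)
    for _ in range(steps):
        trial = [max(0.1, p + rng.gauss(0, 0.02)) for p in current]
        dE = cost(trial) - cost(current)
        if dE < 0 or rng.random() < math.exp(-dE / temp):
            current = trial            # accept downhill, or uphill rarely
            if cost(current) < cost(best):
                best = list(current)
    return best

eps_fit, sigma_fit = metropolis_fit()
```

The paper wraps this kind of sampler in genetic programming over expression trees, with parallel tempering running chains at several temperatures so that occasional uphill moves can escape local minima.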

  6. Automated experimentation in ecological networks.

    Science.gov (United States)

    Lurgi, Miguel; Robertson, David

    2011-05-09

    In ecological networks, natural communities are studied from a complex systems perspective: interactions among species are represented as a graph, which is then analysed using mathematical tools. Topological features encountered in complex networks have been shown to provide the systems they represent with attributes such as robustness and stability, which in ecological systems translates into the ability of communities to resist perturbations of different kinds. A focus of research in community ecology is on understanding the mechanisms by which these complex networks of interactions among species in a community arise. We employ an agent-based approach to model ecological processes operating at the species-interaction level in order to study the emergence of organisation in ecological networks. We have designed protocols of interaction among agents in a multi-agent system based on ecological processes occurring at the interaction level between species in plant-animal mutualistic communities. Interaction models for agent coordination engineered in this way facilitate the emergence, in our artificial societies of agents, of network features such as those found in ecological networks of interacting species. Agent-based models developed in this way facilitate the automation of the design and execution of simulation experiments, allowing the exploration of diverse behavioural mechanisms believed to be responsible for community organisation in ecological communities. This automated way of conducting experiments empowers the study of ecological networks by exploiting the expressive power of interaction-model specification in agent systems.

  7. OPTIMIZATION OF ATM AND BRANCH CASH OPERATIONS USING AN INTEGRATED CASH REQUIREMENT FORECASTING AND CASH OPTIMIZATION MODEL

    OpenAIRE

    Canser BİLİR

    2018-01-01

    In this study, an integrated cash requirement forecasting and cash inventory optimization model is implemented in both the branch and automated teller machine (ATM) networks of a mid-sized bank in Turkey to optimize the bank’s cash supply chain. The implemented model’s objective is to minimize the idle cash levels at both branches and ATMs without decreasing the customer service level (CSL) by providing the correct amount of cash at the correct location and time. To the best of our knowledge,...
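
    The abstract is truncated in the source record, but the two coupled pieces it names, cash requirement forecasting and cash inventory optimization, can be sketched generically. The moving-average forecast, the order-up-to rule, and all figures below are invented for the example, not taken from the paper:

```python
def forecast(history, window=7):
    """Naive moving-average forecast of next-day cash demand."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def replenish(balance, history, days_cover=2, safety=0.25):
    """Order-up-to rule: hold enough forecast demand for `days_cover` days
    plus a safety fraction; ship only the shortfall to the ATM/branch."""
    target = forecast(history) * days_cover * (1 + safety)
    return max(0.0, target - balance)

# One ATM: steady weekday demand around 40k with a weekend spike.
demand = [40_000, 42_000, 39_000, 41_000, 40_000, 65_000, 70_000]
order = replenish(balance=30_000, history=demand)
```

Tightening `days_cover` and `safety` lowers idle cash but risks stock-outs; an integrated model like the paper's tunes that trade-off against the target customer service level across the whole network.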

  8. Laboratory automation: a challenge for the 1990s.

    Science.gov (United States)

    Mordini, C

    1994-01-01

    There is tremendous pressure on industry and laboratories to develop increasingly complex products (for example catalysts, chiral chemicals, drugs and ceramics), conform to regulations, cope with increasingly severe competition, and meet steadily increasing costs. It is difficult, in this situation, to remain productive and competitive. It is vital to be equipped with, and be able to use appropriately, all the suitable methodologies and technologies. Working methods and personnel have to be appropriate. The future depends on three interdependent domains: automation in the broadest sense of the word, instrumentation, and information systems. The easy work has already been done: between 1984 and 1990, it was a question of going from nothing to something; now, it is necessary to increase and optimize. The crucial question is therefore: 'how can we go quicker in experimentation and acquire more knowledge, while spending less money?' One solution is to use all the aspects of automation (robotics, instrumentation, data). Successful laboratory automation depends on: shortened time to market; improved efficiency/cost ratio; motivation, competence and expertise; communication; and knowledge acquisition. This paper examines some of the major technological areas of application.

  9. Coordination and collective performance: Cooperative goals boost interpersonal synchrony and task outcomes

    Directory of Open Access Journals (Sweden)

    Jamie S. Allsop

    2016-09-01

    Full Text Available Whether it be a rugby team or a rescue crew, ensuring peak group performance is a primary goal during collective activities. In reality however, groups often suffer from productivity losses that can lead to less than optimal outputs. Where researchers have focused on this problem, inefficiencies in the way team members coordinate their efforts has been identified as one potent source of productivity decrements. Here we set out to explore whether performance on a simple object movement task is shaped by the spontaneous emergence of interpersonally coordinated behavior. Forty-six pairs of participants were instructed to either compete or cooperate in order to empty a container of approximately 100 small plastic balls as quickly and accurately as possible. Each trial was recorded to video and a frame-differencing approach was employed to estimate between-person coordination. The results revealed that cooperative pairs coordinated to a greater extent than their competitive counterparts. Furthermore, coordination, as well as movement regularity were positively related to accuracy, an effect that was most prominent when the task was structured such that opportunities to coordinate were restricted. These findings are discussed with regard to contemporary theories of coordination and collective performance.
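
    The frame-differencing measure used in the study can be sketched directly. In the illustration below (a generic construction, not the authors' code; the signals are synthetic), each person's movement is summarized as the total pixel change between successive video frames, and coordination is scored as the peak correlation between the two movement series over small lags:

```python
import numpy as np

def movement_series(frames):
    """Frame-differencing: total absolute pixel change between successive
    frames, one value per transition, for one person's image region."""
    frames = np.asarray(frames, dtype=float)
    return np.abs(np.diff(frames, axis=0)).sum(axis=(1, 2))

def coordination(series_a, series_b, max_lag=10):
    """Peak Pearson correlation between two movement series over small
    lags; higher values indicate more tightly coordinated movement."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        a = series_a[max(0, lag):len(series_a) + min(0, lag)]
        b = series_b[max(0, -lag):len(series_b) + min(0, -lag)]
        n = min(len(a), len(b))
        best = max(best, np.corrcoef(a[:n], b[:n])[0, 1])
    return best

# Two in-phase movers versus one mover paired with noise:
t = np.linspace(0, 8 * np.pi, 200)
sync = coordination(np.sin(t), np.sin(t + 0.1))
rng = np.random.default_rng(0)
unsync = coordination(np.sin(t), rng.normal(size=200))
```

A cooperative pair acting in phase scores near 1 on this measure while unrelated movement scores near 0, mirroring the contrast the study reports between cooperative and competitive pairs.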

  10. Method for semi-automated microscopy of filtration-enriched circulating tumor cells.

    Science.gov (United States)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-07-14

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, ERG-rearrangement were detected by filter-adapted-FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45(-) cells, cytomorphological staining, then scanning and analysis of CD45(-) cell phenotypical and cytomorphological characteristics. CD45(-) cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm(2). The second assay sequentially combined fluorescent staining, automated selection of CD45(-) cells, FISH scanning on CD45(-) cells, then analysis of CD45(-) cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82 %, 91 %, and 95 % of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here

  11. Automated oil spill detection with multispectral imagery

    Science.gov (United States)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
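
    The processing chain described, band combination, thresholding, segmentation, and coverage statistics, can be miniaturized as follows. This is an illustrative sketch (the spectral rule and reflectance numbers are invented, not the authors' algorithm): a threshold on a red/green-versus-blue band combination produces the raster mask, and a flood fill measures contiguous oil regions.

```python
import numpy as np

def oil_mask(red, green, blue, thresh=0.15):
    """Crude spectral rule: a surface sheen tends to raise red/green
    reflectance relative to blue water; threshold the normalized excess.
    (The band combination here is illustrative, not the paper's.)"""
    total = red + green + blue + 1e-9
    excess = (red + green - 2 * blue) / total
    return excess > thresh

def region_areas(mask):
    """Connected-component areas (4-connectivity) via BFS flood fill."""
    mask = mask.copy()
    areas = []
    rows, cols = mask.shape
    for seed in zip(*np.nonzero(mask)):
        if not mask[seed]:
            continue                     # already absorbed by a region
        stack, area = [seed], 0
        mask[seed] = False
        while stack:
            r, c = stack.pop()
            area += 1
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and mask[rr, cc]:
                    mask[rr, cc] = False
                    stack.append((rr, cc))
        areas.append(area)
    return sorted(areas, reverse=True)

# Synthetic scene: dark-blue water with one brighter 3x4 slick patch.
r = np.full((8, 8), 0.2); g = np.full((8, 8), 0.2); b = np.full((8, 8), 0.5)
r[2:5, 1:5] = 0.6; g[2:5, 1:5] = 0.5; b[2:5, 1:5] = 0.3
mask = oil_mask(r, g, b)
areas = region_areas(mask)
```

Multiplying each region's pixel count by the squared ground sample distance then gives the square-footage estimate reported to response crews.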

  12. Automation bias: a systematic review of frequency, effect mediators, and mitigators.

    Science.gov (United States)

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2012-01-01

    Automation bias (AB)--the tendency to over-rely on automation--has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, its effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject-specific and free-text terms around the themes of automation, human-automation interaction, and task performance and error was used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task-specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which strain cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners.

  13. Production-process optimization algorithm: Application to fed-batch bioprocess

    Czech Academy of Sciences Publication Activity Database

    Pčolka, M.; Čelikovský, Sergej

    2017-01-01

    Roč. 354, č. 18 (2017), s. 8529-8551 ISSN 0016-0032 R&D Projects: GA ČR(CZ) GA17-04682S Institutional support: RVO:67985556 Keywords : Optimal control * Bioprocess * Optimization Subject RIV: BC - Control Systems Theory OBOR OECD: Automation and control systems Impact factor: 3.139, year: 2016 https://doi.org/10.1016/j.jfranklin.2017.10.012

  14. Joint force protection advanced security system (JFPASS) "the future of force protection: integrate and automate"

    Science.gov (United States)

    Lama, Carlos E.; Fagan, Joe E.

    2009-09-01

The United States Department of Defense (DoD) defines 'force protection' as "preventive measures taken to mitigate hostile actions against DoD personnel (to include family members), resources, facilities, and critical information." Advanced technologies enable significant improvements in automating and distributing situation awareness, optimizing operator time, and improving sustainability, which enhance protection and lower costs. The JFPASS Joint Capability Technology Demonstration (JCTD) demonstrates a force protection environment that combines physical security and Chemical, Biological, Radiological, Nuclear, and Explosive (CBRNE) defense through the application of integrated command and control and data fusion. The JFPASS JCTD provides a layered approach to force protection by integrating traditional sensors used in physical security, such as video cameras, battlefield surveillance radars, and unmanned and unattended ground sensors. The optimization of human participation and automation of processes is achieved by the employment of unmanned ground vehicles, along with remotely operated lethal and less-than-lethal weapon systems. These capabilities are integrated via a tailorable, user-defined common operational picture display through a data fusion engine operating in the background. The combined systems automate the screening of alarms, manage the information displays, and provide assessment and response measures. The data fusion engine links disparate sensors and systems, and applies tailored logic to focus the assessment of events. It enables timely responses by providing the user with automated and semi-automated decision support tools. The JFPASS JCTD uses standard communication/data exchange protocols, which allow the system to incorporate future sensor technologies or communication networks, while maintaining the ability to communicate with legacy or existing systems.

  15. Synthesis of Mechanisms

    DEFF Research Database (Denmark)

    Hansen, John Michael

    1999-01-01

    These notes describe an automated procedure for analysis and synthesis of mechanisms. The analysis method is based on the body coordinate formulation, and the synthesis is based on applying optimization methods, used to minimize the difference between an actual and a desired behaviour...

  16. Machine assisted reaction optimization: A self-optimizing reactor system for continuous-flow photochemical reactions

    KAUST Repository

    Poscharny, K.; Fabry, D.C.; Heddrich, S.; Sugiono, E.; Liauw, M.A.; Rueping, Magnus

    2018-01-01

    A methodology for the synthesis of oxetanes from benzophenone and furan derivatives is presented. UV-light irradiation in batch and flow systems allowed the [2 + 2] cycloaddition reaction to proceed and a broad range of oxetanes could be synthesized in manual and automated fashion. The identification of high-yielding reaction parameters was achieved through a new self-optimizing photoreactor system.

  17. Machine assisted reaction optimization: A self-optimizing reactor system for continuous-flow photochemical reactions

    KAUST Repository

    Poscharny, K.

    2018-04-07

    A methodology for the synthesis of oxetanes from benzophenone and furan derivatives is presented. UV-light irradiation in batch and flow systems allowed the [2 + 2] cycloaddition reaction to proceed and a broad range of oxetanes could be synthesized in manual and automated fashion. The identification of high-yielding reaction parameters was achieved through a new self-optimizing photoreactor system.

  18. A Composite Contract for Coordinating a Supply Chain with Price and Effort Dependent Stochastic Demand

    Directory of Open Access Journals (Sweden)

    Yu-Shuang Liu

    2016-01-01

As demand is increasingly sensitive to price and sales effort, this paper investigates channel coordination for a supply chain with one manufacturer and one retailer facing price- and effort-dependent stochastic demand. A composite contract based on quantity-restricted returns and a target sales rebate can achieve coordination in this setting. Two main problems are addressed: (1) how to coordinate the decentralized supply chain; (2) how to determine the optimal sales effort level, pricing, and inventory decisions under the additive demand case. Numerical examples are presented to verify the effectiveness of the combined contract in supply chain coordination and to highlight model sensitivities to parametric changes.

  19. Engineering applications of heuristic multilevel optimization methods

    Science.gov (United States)

    Barthelemy, Jean-Francois M.

    1989-01-01

    Some engineering applications of heuristic multilevel optimization methods are presented and the discussion focuses on the dependency matrix that indicates the relationship between problem functions and variables. Coordination of the subproblem optimizations is shown to be typically achieved through the use of exact or approximate sensitivity analysis. Areas for further development are identified.

  20. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

Documentation of compliance with the energy performance regulations at the end of the detailed design phase is mandatory for building owners in Denmark. Therefore, besides multidisciplinary input, the building design process requires various iterative analyses, so that the optimal solutions can... That has left the industry in constant pursuit of possibilities for integrating the tool within the Building Information Modelling environment, so that the potential provided by the latter can be harvested and the processes can be optimized. This paper presents a solution for automated data extraction... from building geometry created in Autodesk Revit and its translation to input for compliance check analysis...

  1. Multiobjective optimization framework for landmark measurement error correction in three-dimensional cephalometric tomography.

    Science.gov (United States)

    DeCesare, A; Secanell, M; Lagravère, M O; Carey, J

    2013-01-01

The purpose of this study is to minimize the errors that occur when a four-landmark, rather than six-landmark, superimposition method is used in the cranial base to define the co-ordinate system. Cone beam CT volumetric data from ten patients were used for this study. Co-ordinate system transformations were performed. A co-ordinate system was constructed using two planes defined by four anatomical landmarks located by an orthodontist. A second co-ordinate system was constructed using four anatomical landmarks that are corrected for any landmark location operator error using a numerical optimization algorithm and information from six landmarks. The optimization algorithm minimizes the relative distance and angle between the known fixed points in the two images to find the correction. Measurement errors and co-ordinates in all axes were obtained for each co-ordinate system. Significant improvement is observed after using the landmark correction algorithm to position the final co-ordinate system. The errors found in a previous study are significantly reduced. Errors found were between 1 mm and 2 mm. When analysing real patient data, it was found that the 6-point correction algorithm reduced errors between images and increased intrapoint reliability. A novel method of optimizing the overlay of three-dimensional images using a 6-point correction algorithm was introduced and examined. This method demonstrated greater reliability and reproducibility than the previous 4-point correction algorithm.
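The optimization step, minimizing the distances between corresponding landmarks in two images, can be sketched with a closed-form least-squares rigid alignment. This is an illustrative stand-in (the Kabsch algorithm on synthetic landmarks), not the authors' 6-point correction algorithm itself:

```python
import numpy as np

def kabsch_align(P, Q):
    """Least-squares rigid alignment (Kabsch algorithm): find rotation R and
    translation t minimizing the distances ||R p_i + t - q_i|| between
    corresponding landmark sets P and Q (each n x 3)."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflections
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Six synthetic landmarks; the "second image" is rotated 90 degrees about z
# and shifted, mimicking a mis-registered co-ordinate system.
rng = np.random.default_rng(0)
P = rng.normal(size=(6, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])
R, t = kabsch_align(P, Q)
print(np.allclose(P @ R.T + t, Q))  # the recovered transform maps P onto Q
```

With noisy operator-placed landmarks the recovered transform is a least-squares compromise rather than exact, which is precisely where using six points instead of four pays off.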

  2. Control strategies for wind farm power optimization: LES study

    Science.gov (United States)

    Ciri, Umberto; Rotea, Mario; Leonardi, Stefano

    2017-11-01

    Turbines in wind farms operate in off-design conditions as wake interactions occur for particular wind directions. Advanced wind farm control strategies aim at coordinating and adjusting turbine operations to mitigate power losses in such conditions. Coordination is achieved by controlling on upstream turbines either the wake intensity, through the blade pitch angle or the generator torque, or the wake direction, through yaw misalignment. Downstream turbines can be adapted to work in waked conditions and limit power losses, using the blade pitch angle or the generator torque. As wind conditions in wind farm operations may change significantly, it is difficult to determine and parameterize the variations of the coordinated optimal settings. An alternative is model-free control and optimization of wind farms, which does not require any parameterization and can track the optimal settings as conditions vary. In this work, we employ a model-free optimization algorithm, extremum-seeking control, to find the optimal set-points of generator torque, blade pitch and yaw angle for a three-turbine configuration. Large-Eddy Simulations are used to provide a virtual environment to evaluate the performance of the control strategies under realistic, unsteady incoming wind. This work was supported by the National Science Foundation, Grants No. 1243482 (the WINDINSPIRE project) and IIP 1362033 (I/UCRC WindSTAR). TACC is acknowledged for providing computational time.
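Extremum-seeking control, the model-free optimizer used in this study, can be illustrated in a few lines. The sketch below is a generic single-parameter loop on a made-up objective, not the LES wind-farm setup: a sinusoidal dither probes the unknown power curve, and correlating the measured output with the dither yields a gradient estimate that the loop then ascends.

```python
import math

def extremum_seeking(J, theta0, a=0.2, k=0.8, omega=1.0, dt=0.05, steps=4000):
    """Single-parameter extremum-seeking loop (illustrative sketch)."""
    theta = theta0
    ybar = J(theta0)                       # slow estimate of the mean output
    for n in range(steps):
        d = a * math.sin(omega * n * dt)   # dither perturbation
        y = J(theta + d)                   # measured objective (e.g. power)
        ybar += 0.05 * (y - ybar)          # low-pass: strip the DC component
        grad = (y - ybar) * d / a ** 2     # demodulation ~ gradient estimate
        theta += k * grad * dt             # climb the estimated gradient
    return theta

# Unknown single-peak objective with its optimum at theta = 2
# (think: total farm power as a function of an upstream yaw angle).
power = lambda theta: 5.0 - (theta - 2.0) ** 2
theta_opt = extremum_seeking(power, theta0=0.0)
print(round(theta_opt, 1))
```

The set-point settles near the optimum without any model of the objective, which is the property that lets the scheme track changing wind conditions.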

  3. Non-uniformly sampled grids in double pole coordinate system for freeform reflector construction

    Science.gov (United States)

    Ma, Donglin; Pacheco, Shaun; Feng, Zexin; Liang, Rongguang

    2015-08-01

    We propose a new method to design freeform reflectors by nonuniformly sampling the source intensity distribution in double pole coordinate system. In double pole coordinate system, there is no pole for the whole hemisphere because both poles of the spherical coordinate system are moved to southernmost point of the sphere and overlapped together. With symmetric definition of both angular coordinates in the modified double pole coordinate system, a better match between the source intensity distribution and target irradiance distribution can be achieved for reflectors with large acceptance solid angle, leading to higher light efficiency and better uniformity on the target surface. With non-uniform sampling of the source intensity, we can design circular freeform reflector to obtain uniform rectangular illumination pattern. Aided by the feedback optimization, the freeform reflector can achieve the collection efficiency for ideal point source over 0.7 and relative standard deviation (RSD) less than 0.1.

  4. Mean-field theory of spin-glasses with finite coordination number

    Science.gov (United States)

    Kanter, I.; Sompolinsky, H.

    1987-01-01

    The mean-field theory of dilute spin-glasses is studied in the limit where the average coordination number is finite. The zero-temperature phase diagram is calculated and the relationship between the spin-glass phase and the percolation transition is discussed. The present formalism is applicable also to graph optimization problems.

  5. Coordinated Energy Management in Heterogeneous Processors

    Directory of Open Access Journals (Sweden)

    Indrani Paul

    2014-01-01

This paper examines energy management in a heterogeneous processor consisting of an integrated CPU–GPU for high-performance computing (HPC) applications. Energy management for HPC applications is challenged by their uncompromising performance requirements and complicated by the need for coordinating energy management across distinct core types – a new and less understood problem. We examine the intra-node CPU–GPU frequency sensitivity of HPC applications on tightly coupled CPU–GPU architectures as the first step in understanding power and performance optimization for a heterogeneous multi-node HPC system. The insights from this analysis form the basis of a coordinated energy management scheme, called DynaCo, for integrated CPU–GPU architectures. We implement DynaCo on a modern heterogeneous processor and compare its performance to a state-of-the-art power- and performance-management algorithm. DynaCo improves the measured average energy-delay-squared (ED²) product by up to 30% with less than 2% average performance loss across several exascale and other HPC workloads.
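The ED² (energy-delay-squared) figure of merit used to evaluate DynaCo weights runtime more heavily than energy, so a frequency boost can win even when it costs extra joules. A toy comparison with hypothetical numbers:

```python
def ed2(energy_joules, time_seconds):
    """Energy-delay-squared product: lower is better; squaring the delay
    weights performance more heavily than energy."""
    return energy_joules * time_seconds ** 2

# Hypothetical workload: boosting the GPU frequency burns more energy
# but shortens the run, and ED^2 rewards the trade.
baseline = ed2(100.0, 10.0)   # 100 J * (10 s)^2 = 10000
boosted = ed2(120.0, 8.0)     # 120 J * (8 s)^2  = 7680
print(boosted < baseline)     # True: the boost wins on ED^2
```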

  6. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    Science.gov (United States)

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand of automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be controlled regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  7. Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.

    Science.gov (United States)

    Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian

    2011-01-13

    Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
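The key algorithmic feature described, an optimization loop that tolerates missing assay data from failed syntheses, can be sketched with a toy genetic algorithm. Everything below (the R-group alphabet, the activity landscape, the 5% failure rate) is hypothetical; the point is only that a None score loses tournaments instead of halting the loop:

```python
import random

def run_ga(fitness, alphabet, length, pop_size=20, gens=30, seed=1):
    """Tiny genetic algorithm over compound 'genes' (one letter per R-group).
    fitness may return None (failed synthesis or measurement); such
    individuals simply lose tournaments instead of halting the loop."""
    rng = random.Random(seed)
    population = [[rng.choice(alphabet) for _ in range(length)]
                  for _ in range(pop_size)]

    def better(a, b):
        sa, sb = fitness(a), fitness(b)
        if sb is None:
            return a
        if sa is None:
            return b
        return a if sa >= sb else b

    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1 = better(rng.choice(population), rng.choice(population))
            p2 = better(rng.choice(population), rng.choice(population))
            cut = rng.randrange(1, length)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                    # point mutation
                child[rng.randrange(length)] = rng.choice(alphabet)
            nxt.append(child)
        population = nxt
    return max(population, key=lambda g: fitness(g) or float("-inf"))

# Toy SAR landscape: "activity" is the number of 'A' substituents, and 5%
# of assays fail and report no data (the robustness case from the abstract).
assay_rng = random.Random(7)
def assay(gene):
    return None if assay_rng.random() < 0.05 else gene.count("A")

best = run_ga(assay, alphabet="ABCD", length=8)
print(best.count("A"))
```

The selection pressure concentrates the population in the active region of the toy library despite the missing measurements, mirroring the behavior reported for the MMP-12 experiment.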

  8. Coordination of bidding strategies in day-ahead energy and spinning reserve markets

    International Nuclear Information System (INIS)

    Fushuan Wen; David, A.K.

    2002-01-01

    In this paper, the problem of building optimally coordinated bidding strategies for competitive suppliers in day-ahead energy and spinning reserve markets is addressed. It is assumed that each supplier bids 24 linear energy supply functions and 24 linear spinning reserve supply functions, one for each hour, into the energy and spinning reserve markets, respectively, and each market is cleared separately and simultaneously for all the 24 delivery hours. Each supplier makes decisions on unit commitment and chooses the coefficients in the linear energy and spinning reserve supply functions to maximise total benefits, subject to expectations about how rival suppliers will bid in both markets. Two different bidding schemes have been suggested for each hour, and based on them an overall coordinated bidding strategy in the day-ahead energy and spinning reserve market is then developed. Stochastic optimisation models are first developed to describe these two different bidding schemes and a genetic algorithm (GA) is then used to build the optimally coordinated bidding strategies for each scheme and to develop an overall bidding strategy for the day-ahead energy and spinning reserve markets. A numerical example is utilised to illustrate the essential features of the method. (Author)

  9. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  10. Coordinating a Supply Chain When Manufacturer Makes Cost Reduction Investment in Supplier

    Directory of Open Access Journals (Sweden)

    Shilei Huang

    2016-01-01

We consider a supply chain consisting of an upstream supplier and a downstream manufacturer, in which the supplier provides a component to the manufacturer, which faces price-sensitive and uncertain demand. The manufacturer makes a cost reduction investment in the supplier to improve the supplier’s production efficiency, which benefits the entire supply chain. We derive the optimal investment and operating decisions. Both the centralized and decentralized supply chains are studied. We show that the optimal investment and operating decisions in the decentralized setting may deviate from those in the centralized setting. To avoid the profit loss caused by such a deviation, we develop a coordination mechanism by introducing a combined policy of revenue sharing and investment cost sharing. We also show that the developed coordination mechanism can achieve a Pareto improvement for the two players.

  11. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  12. Automated alignment of optical components for high-power diode lasers

    Science.gov (United States)

    Brecher, C.; Pyschny, N.; Haag, S.; Guerrero Lule, V.

    2012-03-01

    Despite major progress in developing brilliant laser sources a huge potential for cost reductions can be found in simpler setups and automated assembly processes, especially for large volume applications. In this presentation, a concept for flexible automation in optics assembly is presented which is based on standard micro assembly systems with relatively large workspace and modular micromanipulators to enhance the system with additional degrees of freedom and a very high motion resolution. The core component is a compact flexure-based micromanipulator especially designed for the alignment of micro optical components which will be described in detail. The manipulator has been applied in different scenarios to develop and investigate automated alignment processes. This paper focuses on the automated alignment of fast axis collimation (FAC) lenses which is a crucial step during the production of diode lasers. The handling and positioning system, the measuring arrangement for process feedback during active alignment as well as the alignment strategy will be described. The fine alignment of the FAC lens is performed with the micromanipulator under concurrent analysis of the far and the near field intensity distribution. An optimization of the image processing chains for the alignment of a FAC in front of a diode bar led to cycle times of less than 30 seconds. An outlook on other applications and future work regarding the development of automated assembly processes as well as new ideas for flexible assembly systems with desktop robots will close the talk.

  13. Development of Fully Automated Low-Cost Immunoassay System for Research Applications.

    Science.gov (United States)

    Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien

    2017-10-01

    Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable fully automated low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min with the ability of easy adaptation of new testing targets. The running cost is extremely low due to the nature of automation, as well as reduced material requirements. Details about system configuration, components selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.

  14. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States1

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-01-01

    Premise of the study: Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. Methods and Results: A set of command line–based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. Conclusions: WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion. PMID:26949580

  15. Decentralized Control Using Global Optimization (DCGO) (Preprint)

    National Research Council Canada - National Science Library

    Flint, Matthew; Khovanova, Tanya; Curry, Michael

    2007-01-01

    The coordination of a team of distributed air vehicles requires a complex optimization, balancing limited communication bandwidths, non-instantaneous planning times and network delays, while at the...

  16. Optimizing the response to surveillance alerts in automated surveillance systems.

    Science.gov (United States)

    Izadi, Masoumeh; Buckeridge, David L

    2011-02-28

Although much research effort has been directed toward refining algorithms for disease outbreak alerting, considerably less attention has been given to the response to alerts generated from statistical detection algorithms. Given the inherent inaccuracy in alerting, it is imperative to develop methods that help public health personnel identify optimal policies in response to alerts. This study evaluates the application of dynamic decision-making models to the problem of responding to outbreak detection methods, using anthrax surveillance as an example. Adaptive optimization through approximate dynamic programming is used to generate a policy for decision making following outbreak detection. We investigate theoretically the degree of noise the model can tolerate while keeping near-optimal behavior. We also evaluate the policy from our model empirically and compare it with current approaches in routine public health practice for investigating alerts. Timeliness of outbreak confirmation and total costs associated with the decisions made are used as performance measures. Using our approach, on average, 80 per cent of outbreaks were confirmed before the fifth day post-attack, at considerably lower cost than response strategies currently in use. Experimental results are also provided to illustrate the robustness of the adaptive optimization approach and to show the realization of the derived error bounds in practice. Copyright © 2011 John Wiley & Sons, Ltd.
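The decision model behind such a policy can be illustrated with a tiny alert-response MDP solved by exact value iteration (the paper uses approximate dynamic programming to cope with realistic state spaces; the states, probabilities, and costs below are invented). The optimal policy comes out as a threshold rule: wait on weak evidence, investigate once the alarm persists.

```python
# State s = number of consecutive days the surveillance alarm has stayed on.
P_REAL = [0.05, 0.15, 0.35, 0.65, 0.90]  # P(true outbreak | s alert days)
CLEAR = 0.7          # chance a false alarm clears overnight
SPREAD = 30.0        # expected daily damage while a real outbreak spreads
INVESTIGATE = 20.0   # cost of a full investigation (ends the episode)

def value_iteration(sweeps=200):
    n = len(P_REAL)
    V = [0.0] * n
    policy = [""] * n
    for _ in range(sweeps):
        for s in range(n):
            p = P_REAL[s]
            stay = p + (1 - p) * (1 - CLEAR)        # alarm still on tomorrow
            wait = p * SPREAD + stay * V[min(s + 1, n - 1)]
            V[s], policy[s] = min((INVESTIGATE, "investigate"), (wait, "wait"))
    return V, policy

V, policy = value_iteration()
print(policy)
```

Waiting is optimal while the alarm is young (false alarms usually clear on their own), and investigating becomes optimal once the persistence of the alarm makes a real outbreak likely.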

  17. An automated approach to the design of decision tree classifiers

    Science.gov (United States)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
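The by-product mentioned, computing the global probability of correct classification under independence of the decision rules, amounts to summing, over classes, the class prior times the product of node accuracies along that class's root-to-leaf path. A minimal sketch with hypothetical priors and node accuracies:

```python
def global_accuracy(priors, node_acc, paths):
    """Global probability of correct classification for a decision tree,
    assuming the node decision rules err independently: for each class,
    multiply the accuracies of the rules along its root-to-leaf path and
    weight the product by the class prior."""
    total = 0.0
    for cls, prior in priors.items():
        prod = 1.0
        for node in paths[cls]:
            prod *= node_acc[node]
        total += prior * prod
    return total

# Hypothetical 3-class tree for a Landsat-like scene: the root rule splits
# water from vegetation, and node "n1" separates crop from forest.
priors = {"water": 0.2, "crop": 0.5, "forest": 0.3}
node_acc = {"root": 0.95, "n1": 0.90}
paths = {"water": ["root"], "crop": ["root", "n1"], "forest": ["root", "n1"]}
print(round(global_accuracy(priors, node_acc, paths), 3))
```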

  18. Improved Solutions for the Optimal Coordination of DOCRs Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Muhammad Sulaiman

    2018-01-01

Nature-inspired optimization techniques are useful tools in electrical engineering problems to minimize or maximize an objective function. In this paper, we use the firefly algorithm to improve the optimal solution for the problem of directional overcurrent relays (DOCRs). It is a complex and highly nonlinear constrained optimization problem. In this problem, we have two types of design variables: the plug settings (PSs) and the time dial settings (TDSs) for each relay in the circuit. The objective function is to minimize the total operating time of all the basic relays to avoid unnecessary delays. We have considered four models in this paper: the IEEE 3-bus, 4-bus, 6-bus, and 8-bus models. From the numerical results, it is obvious that the firefly algorithm with certain parameter settings performs better than the other state-of-the-art algorithms.
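A minimal version of the DOCR setup can make the problem concrete. The sketch below assumes the common IEC standard-inverse relay characteristic t = 0.14·TDS / ((I/PS)^0.02 - 1) and a penalty for violated coordination margins; the fault currents, bounds, and constants are illustrative, not taken from the paper's IEEE bus models. It is optimized by a bare-bones firefly algorithm working in normalized coordinates:

```python
import math
import random

def op_time(tds, ps, i_fault):
    """IEC standard-inverse characteristic (assumed; common in DOCR studies)."""
    return 0.14 * tds / ((i_fault / ps) ** 0.02 - 1.0)

def objective(x, faults, cti=0.3, penalty=100.0):
    """Total primary operating time plus penalties for violated coordination
    constraints: the backup relay must operate at least CTI after the primary."""
    tds1, ps1, tds2, ps2 = x
    total = viol = 0.0
    for i_primary, i_backup in faults:
        tp = op_time(tds1, ps1, i_primary)   # primary relay trip time
        tb = op_time(tds2, ps2, i_backup)    # backup relay trip time
        total += tp
        viol += max(0.0, cti - (tb - tp))    # coordination margin shortfall
    return total + penalty * viol

def firefly(obj, bounds, n=15, iters=60, beta0=1.0, gamma=2.0, alpha=0.05, seed=3):
    """Bare-bones firefly algorithm in the unit hypercube: each firefly moves
    toward brighter (lower-objective) ones, with attraction decaying in
    distance, plus a small random walk."""
    rng = random.Random(seed)
    scale = lambda u: [l + v * (h - l) for v, (l, h) in zip(u, bounds)]
    pop = [[rng.random() for _ in bounds] for _ in range(n)]
    for _ in range(iters):
        pop.sort(key=lambda u: obj(scale(u)))    # brightest first (elitist)
        for i in range(1, n):
            for j in range(i):                   # j is brighter than i
                r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                beta = beta0 * math.exp(-gamma * r2)
                pop[i] = [min(1.0, max(0.0, a + beta * (b - a)
                                        + alpha * (rng.random() - 0.5)))
                          for a, b in zip(pop[i], pop[j])]
    return scale(min(pop, key=lambda u: obj(scale(u))))

# TDS in [0.05, 1.0] and PS in [100, 600] A for each of two relays;
# (primary, backup) fault-current pairs are made up for illustration.
faults = [(4000.0, 3000.0), (5000.0, 3500.0)]
bounds = [(0.05, 1.0), (100.0, 600.0), (0.05, 1.0), (100.0, 600.0)]
best = firefly(lambda x: objective(x, faults), bounds)
print(round(objective(best, faults), 2))
```

Working in normalized coordinates matters here: PS spans hundreds of amperes while TDS spans less than one unit, so computing attractiveness on raw values would make the distance term collapse.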

  19. Synthesis Study on Transitions in Signal Infrastructure and Control Algorithms for Connected and Automated Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, H. M. Abdul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Hong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Young, Stan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperling, Joshua [National Renewable Energy Lab. (NREL), Golden, CO (United States); Beck, John [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    Documenting the existing state of practice is an initial step in developing future control infrastructure to be co-deployed for a heterogeneous mix of connected and automated vehicles and human drivers while leveraging benefits to safety, congestion, and energy. With advances in information technology and extensive deployment of connected and automated vehicle technology anticipated over the coming decades, cities globally are making efforts to plan and prepare for these transitions. CAVs not only offer opportunities to improve transportation systems through enhanced safety and more efficient vehicle operations; there are also significant needs in terms of exploring how best to leverage vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-everything (V2X) technology. Both the Connected Vehicle (CV) and the Connected and Automated Vehicle (CAV) paradigms feature bi-directional connectivity and share similar applications in terms of signal control algorithms and infrastructure implementation. The discussion in our synthesis study assumes the CAV/CV context, where connectivity exists with or without automated vehicles. Our synthesis study explores the current state of signal control algorithms and infrastructure, reports the completed and newly proposed CV/CAV deployment studies regarding signal control schemes, reviews the deployment costs for CAV/AV signal infrastructure, and concludes with a discussion of opportunities, such as detector-free signal control schemes and dynamic performance management for intersections, and challenges, such as dependency on market adoption and the need to build a fault-tolerant signal system deployment in a CAV/CV environment. The study will serve as an initial critical assessment of existing signal control infrastructure (devices, control instruments, and firmware) and control schemes (actuated, adaptive, and coordinated green wave). Also, the report will help to identify the future needs for the signal

  20. Poisson Coordinates.

    Science.gov (United States)

    Li, Xian-Ying; Hu, Shi-Min

    2013-02-01

    Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
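    The 2D discrete mean value coordinates that Poisson coordinates extend can be sketched as follows; this is the standard MVC construction (tangent half-angle weights), shown only to illustrate the linear-precision property the abstract mentions, not the authors' Poisson formula:

```python
import math

def mean_value_coords(x, poly):
    """2D mean value coordinates of interior point x w.r.t. CCW polygon
    vertices: w_i = (tan(a_{i-1}/2) + tan(a_i/2)) / |v_i - x|, normalized."""
    n = len(poly)
    e, r = [], []
    for vx, vy in poly:
        dx, dy = vx - x[0], vy - x[1]
        d = math.hypot(dx, dy)
        r.append(d)
        e.append((dx / d, dy / d))  # unit vector from x to vertex i

    def ang(u, v):  # signed angle between unit vectors u and v
        return math.atan2(u[0] * v[1] - u[1] * v[0], u[0] * v[0] + u[1] * v[1])

    alpha = [ang(e[i], e[(i + 1) % n]) for i in range(n)]
    w = [(math.tan(alpha[i - 1] / 2) + math.tan(alpha[i] / 2)) / r[i]
         for i in range(n)]
    s = sum(w)
    return [wi / s for wi in w]

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
lam = mean_value_coords((0.3, 0.6), square)
# Linear precision: the coordinates reproduce the point itself.
px = sum(l * v[0] for l, v in zip(lam, square))
py = sum(l * v[1] for l, v in zip(lam, square))
print(round(px, 6), round(py, 6))  # → 0.3 0.6
```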

  1. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises a description of what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  2. COORDINATION OF THE WORK OF BUSES IN CITY ROUTES

    Directory of Open Access Journals (Sweden)

    Fuad DASHDAMIROV

    2013-12-01

    Full Text Available The paper studies the operation of bus routes passing through a shared street. An optimality criterion was chosen for developing appropriate models of effective bus operation on this section. The paper proposes a new model for estimating the time spent by passengers at bus stops. A technique was developed to coordinate the buses running on the combined section of the route.

  3. Optimization of pressurized water reactor shuffling by simulated annealing with heuristics

    International Nuclear Information System (INIS)

    Stevens, J.G.; Smith, K.S.; Rempe, K.R.; Downar, T.J.

    1995-01-01

    Simulated-annealing optimization of reactor core loading patterns is implemented with support for design heuristics during candidate pattern generation. The SIMAN optimization module uses the advanced nodal method of SIMULATE-3 and the full cross-section detail of CASMO-3 to evaluate accurately the neutronic performance of each candidate, resulting in high-quality patterns. The use of heuristics within simulated annealing is explored. Heuristics improve the consistency of optimization results for both fast- and slow-annealing runs with no penalty from the exclusion of unusual candidates. Thus, the heuristic application of designer judgment during automated pattern generation is shown to be effective. The capability of the SIMAN module to find and evaluate families of loading patterns that satisfy design constraints and have good objective performance within practical run times is demonstrated. The use of automated evaluations of successive cycles to explore multicycle effects of design decisions is discussed.
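    The heuristic-screening idea — rejecting candidate patterns that violate designer rules before any costly evaluation — can be sketched on a toy arrangement problem. Everything here (the peak-sum cost proxy, the adjacency rule) is invented for illustration and is not the SIMAN/SIMULATE-3 workflow:

```python
import math
import random

def anneal(values, iters=5000, t0=1.0, cool=0.999, seed=1):
    """Toy simulated annealing for a loading-pattern-like problem: arrange
    'values' on a line to minimize the peak adjacent-pair sum (a stand-in for
    power peaking). A heuristic filter discards candidate swaps that place
    the two largest values next to each other, mimicking designer rules."""
    rng = random.Random(seed)
    state = values[:]
    hot = sorted(values)[-2:]  # the two "hottest" assemblies

    def cost(s):
        return max(s[i] + s[i + 1] for i in range(len(s) - 1))

    def violates_heuristic(s):
        return any(s[i] in hot and s[i + 1] in hot for i in range(len(s) - 1))

    c, t = cost(state), t0
    for _ in range(iters):
        i, j = rng.sample(range(len(state)), 2)
        cand = state[:]
        cand[i], cand[j] = cand[j], cand[i]
        if violates_heuristic(cand):
            continue  # heuristic screening before the (costly) evaluation
        cc = cost(cand)
        if cc < c or rng.random() < math.exp((c - cc) / t):
            state, c = cand, cc
        t *= cool
    return state, c

vals = [1, 2, 3, 4, 5, 6, 7, 8]
pattern, peak = anneal(vals)
print(peak)  # 9 is provably optimal here (8 plus its smallest neighbor, 1)
```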

  4. Rapid Optimal Generation Algorithm for Terrain Following Trajectory Based on Optimal Control

    Institute of Scientific and Technical Information of China (English)

    杨剑影; 张海; 谢邦荣; 尹健

    2004-01-01

    Based on optimal control theory, a 3-dimensional direct generation algorithm is proposed for anti-ground low-altitude penetration tasks over complex terrain. By optimizing the terrain-following (TF) objective function, terrain coordinate system, missile dynamic model and control vector, the TF issue is turned into an improved optimal control problem whose mathematical model is simple and does not require solving for the second-order terrain derivative. Simulation results prove that this method is reasonable and feasible. The TF precision is in the range of 0.3 m to 3.0 m, and the planning time is less than 30 min. The method has the strong points of rapidity and precision, and has great application value.

  5. Research on Supply Chain Coordination of Fresh Agricultural Products under Agricultural Insurance

    Directory of Open Access Journals (Sweden)

    Zhang Pei

    2017-01-01

    Full Text Available Motivated by the fact that fresh agricultural products are susceptible to natural risks and that supply chain coordination is poor, this paper constructs supply chain profit models under two scenarios: natural risk alone, and natural risk with agricultural insurance. First, it studies the coordinating function of a two-part tariff in the supply chain system; then it discusses the setting and claim mechanism of agricultural insurance, and compares the influence of agricultural insurance on supply chain profit and coordination; finally, it gives an example to validate the model results and offers decision-making advice. The research shows that the supply chain of fresh agricultural products can be coordinated under a two-part tariff, but supply chain cooperation is poor under natural risk, so the supply chain needs to be further stabilized and optimized. When the risk coefficient is less than the non-participation insurance coefficient, not participating in agricultural insurance is conducive to maintaining the coordination of the supply chain system; when the risk coefficient exceeds the non-participation insurance coefficient, the introduction of agricultural insurance can not only effectively manage natural risks but also help to improve the coordination of the supply chain system.

  6. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  7. Engineering systems for novel automation methods

    International Nuclear Information System (INIS)

    Fischer, H.D.

    1997-01-01

    Modern automation methods for optimal control, state reconstruction or parameter identification require a discrete dynamic path model. This is established, among other ways, by time and location discretisation of a system of partial differential equations. The digital wave filter principle is particularly suitable for this purpose, since the numeric stability of the derived algorithms can be easily guaranteed, and their robustness against the effects of word-length limitations can be proven. This principle is also particularly attractive in that it can be excellently integrated into currently existing engineering systems for instrumentation and control. (orig./CB) [de

  8. Automation and Intensity Modulated Radiation Therapy for Individualized High-Quality Tangent Breast Treatment Plans

    International Nuclear Information System (INIS)

    Purdie, Thomas G.; Dinniwell, Robert E.; Fyles, Anthony; Sharpe, Michael B.

    2014-01-01

    Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented

  9. Influence factor on automated synthesis yield of 3'-deoxy-3'-[18F] fluorothymidine

    International Nuclear Information System (INIS)

    Zhang Jinming; Tian Jiahe; Liu Changbin; Liu Jian; Luo Zhigang

    2009-01-01

    3'-deoxy-3'-[18F]fluorothymidine (18F-FLT) was prepared from the N-BOC precursor to improve the synthesis yield, chemical purity and radiochemical purity of 18F-FLT using a home-made automated synthesis module. The results showed that residual water in the synthesis system and the amount of precursor could affect the synthesis yield dramatically. The greater the amount of the N-BOC precursor, the higher the synthesis yield; residual water decreased the yield. In the presence of excess base, the precursor was consumed by elimination before substitution was completed; the optimal precursor-to-base ratio was 1:1. The equilibration of the semi-preparative HPLC column affected the purification of the final 18F-FLT product, and the chemical purity of 18F-FLT decreased with 8% EtOH as the mobile phase in semi-preparative HPLC. High chemical purity, radiochemical purity and synthesis yield could be obtained by optimizing the synthesis parameters with the home-made automated synthesis module. (authors)

  10. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  11. The use of transducers for automated radiopharmaceutical synthesis procedures

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.; Tyldesley, S.

    1991-01-01

    There are essentially two reasons why a synthetic procedure for producing a radiopharmaceutical is automated. The first is to reduce radiation exposure and the second is to increase reliability. Reducing radiation exposure can be accomplished in a number of ways. The most common approaches include the use of: hotcells with manipulators, remotely controlled solenoid valves behind shielding, either a PC or a PLC to control the solenoid valves, or robotics. The question of reliability impacts each of these methods differently. The use of a hotcell with manipulators requires a highly skilled operator and in general is not suitable for microchemistry and very short half-lives. The remotely controlled system is prone to operator error, for example activating the wrong valving sequence. The computer-controlled system is dependent on a feedback system if it is to operate "intelligently"; and finally the robotic system is dependent on feedback, as well as careful set-up within the robotic coordinate system. The remainder of this paper will discuss the feedback loops required for the automated/robotic chemistry associated with the synthesis of positron-emitting radiopharmaceuticals.

  12. Automated tracking for advanced satellite laser ranging systems

    Science.gov (United States)

    McGarry, Jan F.; Degnan, John J.; Titterton, Paul J., Sr.; Sweeney, Harold E.; Conklin, Brion P.; Dunn, Peter J.

    1996-06-01

    NASA's Satellite Laser Ranging Network was originally developed during the 1970's to track satellites carrying corner cube reflectors. Today eight NASA systems, achieving millimeter ranging precision, are part of a global network of more than 40 stations that track 17 international satellites. To meet the tracking demands of a steadily growing satellite constellation within existing resources, NASA is embarking on a major automation program. While manpower on the current systems will be reduced to a single operator, the fully automated SLR2000 system is being designed to operate for months without human intervention. Because SLR2000 must be eyesafe and operate in daylight, tracking is often performed in a low probability of detection and high noise environment. The goal is to automatically select the satellite, set up the tracking and ranging hardware, verify acquisition, and close the tracking loop to optimize data yield. To accomplish the autotracking tasks, we are investigating (1) improved satellite force models, (2) more frequent updates of orbital ephemerides, (3) lunar laser ranging data processing techniques to distinguish satellite returns from noise, and (4) angular detection and search techniques to acquire the satellite. A Monte Carlo simulator has been developed to allow optimization of the autotracking algorithms by modeling the relevant system errors and then checking performance against system truth. A combination of simulator and preliminary field results will be presented.
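    The Monte Carlo approach to checking acquisition performance can be illustrated with a toy range-gate model: signal returns pile up in one (unknown) range bin while noise events spread uniformly across the gate, and acquisition succeeds when the signal bin holds the histogram maximum. All rates and bin counts below are invented for illustration, not SLR2000 parameters:

```python
import random

def trial(rng, n_shots=200, p_signal=0.1, noise_per_shot=3, gate_bins=100):
    """One Monte Carlo trial of range-gated acquisition in noise."""
    true_bin = rng.randrange(gate_bins)
    hist = [0] * gate_bins
    for _ in range(n_shots):
        if rng.random() < p_signal:
            hist[true_bin] += 1          # photon echo from the satellite
        for _ in range(noise_per_shot):  # background/noise events this shot
            hist[rng.randrange(gate_bins)] += 1
    # Acquisition succeeds if the signal bin is the histogram peak.
    return max(range(gate_bins), key=hist.__getitem__) == true_bin

def acquisition_probability(trials=200, **kw):
    rng = random.Random(42)
    return sum(trial(rng, **kw) for _ in range(trials)) / trials

p = acquisition_probability()
print(p)  # near 1.0 for these (favorable) illustrative rates
```

    Sweeping `p_signal` or `noise_per_shot` in such a simulator is one way to map where an autotracking rule breaks down before committing to hardware.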

  13. Optimization of technological planning of the equipment in innovative project of modernization of machine-building production

    Directory of Open Access Journals (Sweden)

    Nasibullin D.R.

    2016-11-01

    Full Text Available this article describes the ways to improve the automated system of technological preparation of manufacturing. The method for optimizing the planning of technological equipment based on the use of artificial neural networks was developed for the automated system of technological preparation of manufacturing.

  14. On the use of polar coordinate system in the projective graphic drawings

    Directory of Open Access Journals (Sweden)

    Ivashchenko Andrey Viktorovich

    2016-11-01

    Full Text Available Projective graphics is a polyhedra simulation method based on the use of trace diagrams of the initial polyhedron. Previously developed computer software allows using Cartesian coordinates. In some cases it is advisable to use the polar coordinate system for the description of projective graphics drawings. Using the example of the icosahedron, the authors analyzed the advantages of describing projective graphics drawings in the polar coordinate system. The transition to the polar coordinate system is a tool that allows using certain patterns of projective graphics drawings in the course of calculation. When using the polar coordinate system, the search for the polar correspondence of the lines is simplified: in order to compare two lines in the polar coordinate system, it is enough to compare the corresponding coefficients of the equations of these lines. The authors consider a diagram of the icosahedron in polar coordinates, and a corresponding fragment of a calculation program in the Mathematica system. Some examples of form generation based on icosahedra are offered. Optimization of computer programs using the polar coordinate system will simplify the calculation of projective graphics drawings and accelerate the process of constructing three-dimensional models, which expands the possibilities of selecting original solutions. Finally, the authors conclude that it is appropriate to use the polar coordinate system only in the construction of projective graphics diagrams of plane systems having rich symmetry. All Platonic and Archimedean solids, as well as Catalan solids, possess this property.
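    The coefficient comparison described above can be made concrete: writing a line a·x + b·y = c in polar normal form r·cos(θ − φ) = d reduces line comparison to comparing the pair (d, φ). A small sketch (not the authors' Mathematica code):

```python
import math

def polar_line(a, b, c):
    """Polar normal form of the line a*x + b*y = c: returns (d, phi) with
    r*cos(theta - phi) = d and d >= 0. Two lines coincide exactly when these
    two coefficients match."""
    n = math.hypot(a, b)
    d, phi = c / n, math.atan2(b, a)
    if d < 0:  # normalize so the perpendicular distance d is non-negative
        d, phi = -d, phi + math.pi
    return d, phi % (2 * math.pi)

def same_line(l1, l2, tol=1e-9):
    d1, p1 = polar_line(*l1)
    d2, p2 = polar_line(*l2)
    dphi = (p1 - p2) % (2 * math.pi)           # compare angles on the circle
    return abs(d1 - d2) < tol and min(dphi, 2 * math.pi - dphi) < tol

print(polar_line(1, 0, 2))               # (2.0, 0.0): the vertical line x = 2
print(same_line((1, 1, 2), (2, 2, 4)))   # True: same line, scaled coefficients
```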

  15. Conjugate gradient optimization programs for shuttle reentry

    Science.gov (United States)

    Powers, W. F.; Jacobson, R. A.; Leonard, D. A.

    1972-01-01

    Two computer programs for shuttle reentry trajectory optimization are listed and described. Both programs use the conjugate gradient method as the optimization procedure. The Phase 1 Program is developed in cartesian coordinates for a rotating spherical earth, and crossrange, downrange, maximum deceleration, total heating, and terminal speed, altitude, and flight path angle are included in the performance index. The programs make extensive use of subroutines so that they may be easily adapted to other atmospheric trajectory optimization problems.
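    The conjugate gradient procedure named above can be shown in its simplest (linear) form, solving A·x = b for symmetric positive definite A, i.e. minimizing ½xᵀAx − bᵀx. This is a generic sketch of the method, unrelated to the actual reentry dynamics or performance index:

```python
def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=100):
    """Linear conjugate gradient for A x = b (A symmetric positive definite),
    written with plain lists for self-containment."""
    n = len(b)
    x = x0[:] if x0 else [0.0] * n
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]   # initial residual
    p, rs = r[:], dot(r, r)
    for _ in range(max_iter):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                        # exact step along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # conjugate dir.
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
print(x)  # ≈ [1/11, 7/11] ≈ [0.0909, 0.6364]
```

    The nonlinear variant used for trajectory optimization replaces the exact step with a line search and the residual with the cost gradient, but the conjugate-direction update is the same idea.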

  16. Development of accurate standardized algorithms for conversion between SRP grid coordinates and latitude/longitude

    International Nuclear Information System (INIS)

    Looney, B.B.; Marsh, J.T. Jr.; Hayes, D.W.

    1987-01-01

    The Savannah River Plant (SRP) is a nuclear production facility operated by E.I. du Pont de Nemours and Co. for the United States Department of Energy. SRP is located along the Savannah River in South Carolina. Construction of SRP began in the early 1950's. At the time the plant was built, a local coordinate system was developed to assist in defining the locations of plant facilities. Over the years, large quantities of data have been developed using ''SRP Coordinates.'' These data include: building locations, plant boundaries, environmental sampling locations, waste disposal area locations, and a wide range of other geographical information. Currently, staff persons at SRP are organizing these data into automated information systems to allow more rapid, more robust and higher quality interpretation, interchange and presentation of spatial data. A key element in this process is the ability to incorporate outside databases into the systems, as well as to share SRP data with interested organizations outside of SRP. Most geographical information outside of SRP is organized using latitude and longitude. Thus, straightforward, accurate and consistent algorithms to convert SRP Coordinates to/from latitude and longitude are needed. Appropriate algorithms are presented in this document
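    A conversion of the kind described — local grid to latitude/longitude via rotation, translation, and scaling — can be sketched as below. The origin, rotation angle, and flat-earth scale are HYPOTHETICAL placeholders for illustration only; they are not the actual SRP grid parameters or the document's algorithms, which would use a proper geodetic datum:

```python
import math

# Assumed (hypothetical) grid parameters: origin, rotation from true north,
# and a flat-earth meters-per-degree scale.
LAT0, LON0 = 33.28, -81.67
THETA = math.radians(36.0)
M_PER_DEG_LAT = 110_900.0

def grid_to_latlon(e, n):
    """Rotate local (easting, northing) into true east/north, then scale."""
    de = e * math.cos(THETA) - n * math.sin(THETA)
    dn = e * math.sin(THETA) + n * math.cos(THETA)
    lat = LAT0 + dn / M_PER_DEG_LAT
    lon = LON0 + de / (M_PER_DEG_LAT * math.cos(math.radians(LAT0)))
    return lat, lon

def latlon_to_grid(lat, lon):
    """Inverse transform: scale back to meters, then rotate into the grid."""
    dn = (lat - LAT0) * M_PER_DEG_LAT
    de = (lon - LON0) * M_PER_DEG_LAT * math.cos(math.radians(LAT0))
    return (de * math.cos(THETA) + dn * math.sin(THETA),
            -de * math.sin(THETA) + dn * math.cos(THETA))

# Round trip should be consistent to well under a meter.
lat, lon = grid_to_latlon(1000.0, 2000.0)
e, n = latlon_to_grid(lat, lon)
print(round(e, 3), round(n, 3))
```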

  17. Multi-agent System for Off-line Coordinated Motion Planning of Multiple Industrial Robots

    Directory of Open Access Journals (Sweden)

    Shital S. Chiddarwar

    2011-03-01

    Full Text Available This article presents an agent-based framework for coordinated motion planning of multiple robots. The emerging paradigm of agent-based systems is implemented to address various issues related to safe and fast task execution when multiple robots share a common workspace. In the proposed agent-based framework, each issue vital for coordinated motion planning of multiple robots, and every robot participating in the coordinated task, is considered as an agent. The identified agents are interfaced with each other in order to incorporate the desired flexibility in the developed framework. This framework gives a complete strategy for determining optimal trajectories of robots working in coordination, with due consideration of their kinematic, dynamic and payload constraints. The complete architecture of the proposed framework and a detailed discussion of its various modules are covered in this paper.

  18. Decision and coordination of low-carbon supply chain considering technological spillover and environmental awareness.

    Science.gov (United States)

    Xu, Lang; Wang, Chuanxu; Li, Hui

    2017-06-08

    We focus on the impacts of technological spillovers and environmental awareness on carbon emission reduction in a two-echelon supply chain with a single supplier and a single manufacturer. In this supply chain, carbon abatement investment by the players is a key means of cutting costs and improving profits, since it reduces the production costs of components and products. On the basis of optimality theory, centralized and decentralized models are established to investigate the optimal decisions and profits. Further, setting the players' profits in the decentralized scenario as the disagreement points, we propose a bargaining-coordination contract based on revenue-cost sharing to enhance performance. Finally, through theoretical comparison and numerical analysis, the results show that: (i) the optimal profits of the players and the supply chain improve as technological spillovers and environmental awareness increase, and the profits under the bargaining-coordination contract are higher than those in the decentralized scenario; (ii) technological spillovers between the players amplify the impact of "free-ride" behavior, in which the supplier always incentivizes the manufacturer to improve carbon emission intensity, but cooperation is achieved and profits improve only when technological spillovers and environmental awareness are large; (iii) the contract can effectively coordinate the supply chain and improve carbon abatement investment.

  19. Optimization in the design and control of robotic manipulators: A survey

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    1989-01-01

    Robotics is a relatively new and evolving technology being applied to manufacturing automation and is fast replacing the special-purpose machines or hard automation as it is often called. Demands for higher productivity, better and uniform quality products, and better working environments are primary reasons for its development. An industrial robot is a multifunctional and computer-controlled mechanical manipulator exhibiting a complex and highly nonlinear behavior. Even though most current robots have anthropomorphic configurations, they have far inferior manipulating abilities compared to humans. A great deal of research effort is presently being directed toward improving their overall performance by using optimal mechanical structures and control strategies. The optimal design of robot manipulators can include kinematic performance characteristics such as workspace, accuracy, repeatability, and redundancy. The static load capacity as well as dynamic criteria such as generalized inertia ellipsoid, dynamic manipulability, and vibratory response have also been considered in the design stages. The optimal control problems typically involve trajectory planning, time-optimal control, energy-optimal control, and mixed-optimal control. The constraints in a robot manipulator design problem usually involve link stresses, actuator torques, elastic deformation of links, and collision avoidance. This paper presents a review of the literature on the issues of optimum design and control of robotic manipulators and also the various optimization techniques currently available for application to robotics

  20. Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration

    Science.gov (United States)

    Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin

    2011-01-01

    The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and

  1. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs

  2. Matlab enhanced multi-threaded tomography optimization sequence (MEMTOS)

    International Nuclear Information System (INIS)

    Lum, Edward S.; Pope, Chad L.

    2016-01-01

    Highlights: • Monte Carlo simulation of spent nuclear fuel assembly neutron computed tomography. • Optimized parallel calculations conducted from within the MATLAB environment. • Projection difference technique used to identify anomalies in spent nuclear fuel assemblies. - Abstract: One challenge associated with spent nuclear fuel assemblies is the lack of non-destructive analysis techniques to determine if fuel pins have been removed or replaced or if there are significant defects associated with fuel pins deep within a fuel assembly. Neutron computed tomography is a promising technique for addressing these qualitative issues. Monte Carlo simulation of spent nuclear fuel neutron computed tomography allows inexpensive process investigation and optimization. The main purpose of this work is to provide a fully automated advanced simulation framework for the analysis of spent nuclear fuel inspection using neutron computed tomography. The simulation framework, called Matlab Enhanced Multi-Threaded Tomography Optimization Sequence (MEMTOS), not only automates the simulation process, but also generates superior tomography image results. MEMTOS is written in the MATLAB scripting language and addresses file management, parallel Monte Carlo execution, results extraction, and tomography image generation. This paper describes the mathematical basis for neutron computed tomography, the Monte Carlo technique used to simulate neutron computed tomography, and the overall tomography simulation optimization algorithm. Sequence results presented include overall simulation speed enhancement, and tomography and image results obtained for Experimental Breeder Reactor II spent fuel assemblies and light water reactor fuel assemblies. Optimization using a projection difference technique is also described.
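    The projection difference idea — comparing projections of a reference assembly against a measured one to localize anomalies — can be sketched with simple row/column sums standing in for tomographic projections. The 4×4 pin map is hypothetical, not EBR-II data, and a real sinogram would use many view angles:

```python
def projections(grid):
    """Row and column sums: a crude two-angle stand-in for a sinogram."""
    rows = [sum(r) for r in grid]
    cols = [sum(c) for c in zip(*grid)]
    return rows + cols

# Reference assembly: 4x4 grid of pins (1 = pin present), hypothetical data.
reference = [[1] * 4 for _ in range(4)]
test = [row[:] for row in reference]
test[2][1] = 0  # a pin removed deep inside the assembly

diff = [a - b for a, b in zip(projections(reference), projections(test))]
# Nonzero projection differences localize the anomaly to row 2, column 1.
anomalous_rows = [i for i, d in enumerate(diff[:4]) if d != 0]
anomalous_cols = [i - 4 for i, d in enumerate(diff[4:], start=4) if d != 0]
print(anomalous_rows, anomalous_cols)  # → [2] [1]
```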

  3. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses

  4. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems from various application fields. It integrates databases with a common data architecture, a generic model to represent the processing steps, and appropriate optimization tools. A special software interface has been created to automate the steps in the methodology workflow, allow the transfer of data between tools, and obtain the mathematical representation of the problem as required...

  5. Adaptation and performance of the Cartesian coordinates fast multipole method for nanomagnetic simulations

    International Nuclear Information System (INIS)

    Zhang Wen; Haas, Stephan

    2009-01-01

    An implementation of the fast multipole method (FMM) is performed for magnetic systems with long-ranged dipolar interactions. The expansion in spherical harmonics of the original FMM is replaced by an expansion of polynomials in Cartesian coordinates, which is considerably simpler. Under open boundary conditions, an expression for the multipole moments of point dipoles in a cell is derived. This makes the program appropriate for nanomagnetic simulations, including magnetic nanoparticles and ferrofluids. The performance is optimized in terms of cell size and parameter set (expansion order and opening angle), and the trade-off between computing time and accuracy is quantitatively studied. A rule of thumb is proposed to decide the appropriate average number of dipoles in the smallest cells, and an optimal choice of parameter set is suggested. Finally, the superiority of the Cartesian coordinate FMM is demonstrated by comparison to the spherical harmonics FMM and FFT.
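
    The lowest-order step of such a multipole expansion can be sketched as follows (a minimal illustration with made-up dipoles, not the paper's implementation): all dipoles in a well-separated cell are lumped into a single net dipole at the cell center, and the resulting far-field potential is compared with direct summation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cell of point dipoles near the origin (positions in a unit
# cell, moments biased along x so the net moment does not vanish).
positions = rng.uniform(-0.5, 0.5, size=(20, 3))
moments = rng.normal(loc=(3.0, 0.0, 0.0), scale=0.5, size=(20, 3))

def dipole_potential(p, r, x):
    """Scalar potential of a point dipole with moment p at position r."""
    d = x - r
    return p @ d / np.linalg.norm(d) ** 3

x = np.array([50.0, 0.0, 0.0])  # well-separated far-field evaluation point

# Direct summation over all dipoles in the cell.
direct = sum(dipole_potential(p, r, x) for p, r in zip(moments, positions))

# Lowest-order approximation: lump the cell into one net dipole at the
# cell center (higher multipole corrections are dropped in this sketch).
center = positions.mean(axis=0)
approx = dipole_potential(moments.sum(axis=0), center, x)

rel_err = abs(approx - direct) / abs(direct)
```

The full method keeps higher-order Cartesian moments and a hierarchical cell tree; this sketch only shows why a distant cell can be summarized by a few moments.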

  6. Clinical implementation of stereotaxic brain implant optimization

    International Nuclear Information System (INIS)

    Rosenow, U.F.; Wojcicka, J.B.

    1991-01-01

    This optimization method for stereotaxic brain implants is based on seed/strand configurations of the basic type developed for the National Cancer Institute (NCI) atlas of regular brain implants. Irregular target volume shapes are determined by delineation in a stack of contrast-enhanced computed tomography scans. The neurosurgeon may then select up to ten directions, or entry points, of surgical approach, of which the program finds the optimal one under the criterion of smallest target volume diameter. Target volume cross sections are then reconstructed in 5-mm-spaced planes perpendicular to the implantation direction defined by the entry point and the target volume center. This information is used to define a closed line in an implant cross section along which peripheral seed strands are positioned and which now has an irregular shape. Optimization points are defined opposite peripheral seeds on the target volume surface, to which the treatment dose rate is prescribed. Three different optimization algorithms are available: linear least-squares programming, quadratic programming with constraints, and a simplex method. The optimization routine is implemented in a commercial treatment planning system. It generates coordinate and source strength information for the optimized seed configurations for further dose rate distribution calculation with the treatment planning system, and also the coordinate settings for the stereotaxic Brown-Roberts-Wells (BRW) implantation device.
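
    The linear least-squares variant can be sketched as follows (geometry, kernel, and prescription values are invented, and the constrained quadratic programming of the paper is replaced by a simple clipped least-squares fit): seed strengths are fitted so that the dose rate at the optimization points matches the prescription.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical geometry: seed positions inside the target, optimization
# points on a surrounding surface; a ~1/r^2 kernel stands in for the real
# dose-rate kernel of the planning system.
seeds = rng.uniform(-1.0, 1.0, size=(8, 3))   # seed positions (cm)
points = rng.uniform(1.5, 2.5, size=(12, 3))  # surface optimization points
r = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2)
A = 1.0 / r**2  # dose rate at point i per unit strength of seed j

prescribed = np.full(12, 50.0)  # prescribed dose rate (cGy/h) at each point

# Linear least-squares fit of seed strengths, clipped to be nonnegative
# (a crude stand-in for quadratic programming with constraints).
strengths, *_ = np.linalg.lstsq(A, prescribed, rcond=None)
strengths = np.clip(strengths, 0.0, None)

achieved = A @ strengths  # dose rates delivered by the fitted strengths
```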

  7. Multi-objective optimization design method of radiation shielding

    International Nuclear Information System (INIS)

    Yang Shouhai; Wang Weijin; Lu Daogang; Chen Yixue

    2012-01-01

    Because shielding design goals are diverse and many factors in the process are uncertain, it is necessary to develop an intelligent shielding optimization method by which the shielding scheme selection is achieved automatically and the uncertainty introduced by human judgment is reduced. To achieve an economically feasible, automated radiation shielding design, a multi-objective shielding optimization code combining a genetic algorithm with the discrete-ordinates method was developed to minimize cost, size, weight, and so on. This work has practical significance for obtaining optimized shielding designs. (authors)
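
    The genetic-algorithm layer of such a method can be sketched with a toy two-layer shield (all coefficients below are illustrative, and the discrete-ordinates transport solve is replaced by a crude mu*t attenuation penalty):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy two-layer shield: choose thicknesses x (cm) minimizing a weight-like
# cost, with a penalty enforcing a minimum total attenuation mu*t >= 6.
MU = np.array([0.5, 1.2])   # attenuation coefficients per layer (1/cm)
RHO = np.array([1.0, 7.8])  # density-like weight proxies

def cost(x):
    shortfall = max(0.0, 6.0 - MU @ x)
    return RHO @ x + 100.0 * shortfall  # penalized objective

pop = rng.uniform(0.0, 10.0, size=(40, 2))
for _ in range(200):
    order = np.argsort([cost(ind) for ind in pop])
    parents = pop[order[:20]]                                  # truncation selection
    children = 0.5 * (parents + parents[rng.permutation(20)])  # blend crossover
    children += rng.normal(0.0, 0.3, size=children.shape)      # mutation
    pop = np.clip(np.vstack([parents, children]), 0.0, 10.0)

best = pop[np.argmin([cost(ind) for ind in pop])]
```

In the actual code each fitness evaluation would invoke a discrete-ordinates transport calculation rather than the one-line penalty used here.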

  8. Automated and dynamic scheduling for geodetic VLBI - A simulation study for AuScope and global networks

    Science.gov (United States)

    Iles, E. J.; McCallum, L.; Lovell, J. E. J.; McCallum, J. N.

    2018-02-01

    As we move into the next era of geodetic VLBI, the scheduling process is one focus for improvement, in terms of both increased flexibility and the ability to react to changing conditions. A range of simulations were conducted to ascertain the impact of scheduling on geodetic results such as Earth Orientation Parameters (EOPs) and station coordinates. The potential capabilities of new automated scheduling modes were also simulated, using the so-called 'dynamic scheduling' technique. The primary aim was to improve efficiency in both cost and time without losing geodetic precision, particularly to maximise the use of the Australian AuScope VLBI array. We show that short breaks in observation will not significantly degrade the results of a typical 24 h experiment, whereas simply shortening observing time degrades precision exponentially. We also confirm that the new automated, dynamic scheduling mode is capable of producing the same standard of result as a traditional schedule, with close to real-time flexibility. Further, it is possible to use the dynamic scheduler to augment the 3-station Australian AuScope array and thereby attain EOPs of the current global precision with only intermittent contributions from 2 additional stations. We thus confirm that automated, dynamic scheduling holds great potential for flexibility and automation, in line with aims for future continuous VLBI operations.

  9. - GEONET - A Realization of an Automated Data Flow for Data Collecting, Processing, Storing, and Retrieving

    International Nuclear Information System (INIS)

    Friedsam, Horst; Pushor, Robert; Ruland, Robert; SLAC

    2005-01-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers, through processing, storing, and retrieving data, to finally adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XTs.

  10. 'Outbreak Gold Standard' selection to provide optimized threshold for infectious diseases early-alert based on China Infectious Disease Automated-alert and Response System.

    Science.gov (United States)

    Wang, Rui-Ping; Jiang, Yong-Gen; Zhao, Gen-Ming; Guo, Xiao-Qin; Michael, Engelgau

    2017-12-01

    The China Infectious Disease Automated-alert and Response System (CIDARS) was successfully implemented and became operational nationwide in 2008. The CIDARS plays an important role in, and has been integrated into, the routine outbreak monitoring efforts of the Center for Disease Control (CDC) at all levels in China. In the CIDARS, thresholds were initially determined using the "Mean+2SD" method, which has limitations. This study compared the performance of optimized thresholds defined using the "Mean+2SD" method with that of 5 novel algorithms, to select the optimal "Outbreak Gold Standard" (OGS) and corresponding thresholds for outbreak detection. Data for infectious disease were organized by calendar week and year. The "Mean+2SD", C1, C2, moving average (MA), seasonal model (SM), and cumulative sum (CUSUM) algorithms were applied. Outbreak signals for the predicted value (Px) were calculated using a percentile-based moving window. When the outbreak signals generated by an algorithm were in line with a Px-generated outbreak signal for each week, this Px was defined as the optimized threshold for that algorithm. In this study, six infectious diseases were selected and classified into TYPE A (chickenpox and mumps), TYPE B (influenza and rubella) and TYPE C [hand, foot and mouth disease (HFMD) and scarlet fever]. Optimized thresholds for chickenpox (P55), mumps (P50), influenza (P40, P55, and P75), rubella (P45 and P75), HFMD (P65 and P70), and scarlet fever (P75 and P80) were identified. The C1, C2, CUSUM, SM, and MA algorithms were appropriate for TYPE A. All 6 algorithms were appropriate for TYPE B. The C1 and CUSUM algorithms were appropriate for TYPE C. It is critical to incorporate more flexible algorithms as the OGS into the CIDARS and to identify the proper OGS and corresponding recommended optimized threshold for different infectious disease types.
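
    The contrast between a "Mean+2SD" threshold and a percentile-based Px threshold can be sketched on synthetic weekly counts (the data, the P75 choice, and the injected outbreak below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical weekly case counts: 5 historical years x 52 calendar weeks.
history = rng.poisson(lam=20, size=(5, 52))
current = rng.poisson(lam=20, size=52)
current[30] = 60  # an injected outbreak week

# "Mean+2SD" threshold, computed per calendar week across historical years.
mean2sd = history.mean(axis=0) + 2 * history.std(axis=0)

# A percentile-based threshold Px (here P75 of each week's historical
# distribution), the form whose level the study tunes against the OGS.
p75 = np.percentile(history, 75, axis=0)

signals_mean2sd = np.flatnonzero(current > mean2sd)
signals_p75 = np.flatnonzero(current > p75)
```

Both detectors flag the injected week; they differ in how many quieter weeks also trip the threshold, which is exactly the sensitivity/false-alarm trade-off the optimized Px is meant to balance.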

  11. Design and implementation of an automated email notification system for results of tests pending at discharge.

    Science.gov (United States)

    Dalal, Anuj K; Schnipper, Jeffrey L; Poon, Eric G; Williams, Deborah H; Rossi-Roh, Kathleen; Macleay, Allison; Liang, Catherine L; Nolido, Nyryan; Budris, Jonas; Bates, David W; Roy, Christopher L

    2012-01-01

    Physicians are often unaware of the results of tests pending at discharge (TPADs). The authors designed and implemented an automated system to notify the responsible inpatient physician of the finalized results of TPADs using secure, network email. The system coordinates a series of electronic events triggered by the discharge time stamp and sends an email to the identified discharging attending physician once finalized results are available. A carbon copy is sent to the primary care physicians in order to facilitate communication and the subsequent transfer of responsibility. Logic was incorporated to suppress selected tests and to limit notification volume. The system was activated for patients with TPADs discharged by randomly selected inpatient-attending physicians during a 6-month pilot. They received approximately 1.6 email notifications per discharged patient with TPADs. Eighty-four per cent of inpatient-attending physicians receiving automated email notifications stated that they were satisfied with the system in a brief survey (59% survey response rate). Automated email notification is a useful strategy for managing results of TPADs.
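
    The core selection logic described above (results finalized after the discharge time stamp, suppression of selected tests, grouping by the responsible attending) might be sketched as follows; the field names and suppression list are assumptions, not the authors' schema:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

# Hypothetical suppression list used to limit notification volume.
SUPPRESSED = {"wbc-differential"}

@dataclass
class Result:
    test: str
    finalized: datetime
    discharged: datetime
    attending: str

def pending_at_discharge(results):
    """Group notifiable TPAD results by the responsible attending."""
    by_attending = defaultdict(list)
    for r in results:
        if r.test in SUPPRESSED:
            continue
        if r.finalized > r.discharged:  # finalized only after discharge
            by_attending[r.attending].append(r.test)
    return dict(by_attending)

results = [
    Result("urine-culture", datetime(2011, 5, 3), datetime(2011, 5, 1), "dr_a"),
    Result("cbc", datetime(2011, 4, 30), datetime(2011, 5, 1), "dr_a"),
    Result("wbc-differential", datetime(2011, 5, 2), datetime(2011, 5, 1), "dr_a"),
]
batches = pending_at_discharge(results)  # one email batch per attending
```

In the deployed system each batch would be rendered into a secure network email to the attending, with a carbon copy to the primary care physician.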

  12. Divertor design through shape optimization

    International Nuclear Information System (INIS)

    Dekeyser, W.; Baelmans, M.; Reiter, D.

    2012-01-01

    Due to the conflicting requirements, complex physical processes and large number of design variables, divertor design for next step fusion reactors is a challenging problem, often relying on large numbers of computationally expensive numerical simulations. In this paper, we attempt to partially automate the design process by solving an appropriate shape optimization problem. Design requirements are incorporated in a cost functional which measures the performance of a certain design. By means of changes in the divertor shape, which in turn lead to changes in the plasma state, this cost functional can be minimized. Using advanced adjoint methods, optimal solutions are computed very efficiently. The approach is illustrated by designing divertor targets for optimal power load spreading, using a simplified edge plasma model (copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  13. High precision 3D coordinates location technology for pellet

    International Nuclear Information System (INIS)

    Fan Yong; Zhang Jiacheng; Zhou Jingbin; Tang Jun; Xiao Decheng; Wang Chuanke; Dong Jianjun

    2010-01-01

    In the inertial confinement fusion (ICF) system, the pellet has traditionally been collimated manually, which is time-consuming and poorly automated. A new method based on binocular vision is proposed, in which the surveying apparatus is placed on the public diagnosis platform to reach the relevant engineering targets and a high-precision two-dimensional calibration board is used. An iterative method is adopted to achieve 0.1-pixel corner extraction precision. Furthermore, SVD decomposition is used to remove singular corners, and an improved Zhang calibration method is applied to increase camera calibration precision. Experiments indicate that the RMS of the three-dimensional coordinate measurement precision is 25 μm, and the maximum system RMS of distance measurement is better than 100 μm, satisfying the system index requirement. (authors)

  14. Automation of POST Cases via External Optimizer and "Artificial p2" Calculation

    Science.gov (United States)

    Dees, Patrick D.; Zwack, Mathew R.

    2017-01-01

    During early conceptual design of complex systems, speed and accuracy are often at odds with one another. While many characteristics of the design fluctuate rapidly during this phase, there is nonetheless a need to acquire accurate data from which to down-select designs, as these decisions will have a large impact upon program life-cycle cost. Therefore, enabling the conceptual designer to produce accurate data in a timely manner is vital to program viability. For conceptual design of launch vehicles, trajectory analysis and optimization is a large hurdle. Tools such as the industry-standard Program to Optimize Simulated Trajectories (POST) have traditionally required an expert in the loop for setting up inputs, running the program, and analyzing the output. The solution space for trajectory analysis is in general non-linear and multi-modal, requiring an experienced analyst to weed out sub-optimal designs in pursuit of the global optimum. While an experienced analyst presented with a vehicle similar to one they have already worked on can likely produce optimal performance figures in a timely manner, as soon as the "experienced" or "similar" adjectives are invalid the process can become lengthy. In addition, an experienced analyst working on a similar vehicle may go into the analysis with preconceived ideas about what the vehicle's trajectory should look like, which can result in sub-optimal performance being recorded. Thus, in any case but the ideal, either time or accuracy can be sacrificed. In the authors' previous work, a tool called multiPOST was created which captures the heuristics of a human analyst over the process of executing trajectory analysis with POST. However, without the instincts of a human in the loop, this method relied upon Monte Carlo simulation to find successful trajectories. Overall the method has mixed results, and in the context of optimizing multiple vehicles it is inefficient in comparison to the method presented POST's internal

  15. Method for semi-automated microscopy of filtration-enriched circulating tumor cells

    International Nuclear Information System (INIS)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R.; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-01-01

    Circulating tumor cell (CTC) filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However, filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), and leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK, ROS1, and ERG rearrangements were detected by filter-adapted FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45− cells, cytomorphological staining, then scanning and analysis of CD45− cell phenotypical and cytomorphological characteristics. CD45− cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm². The second assay sequentially combined fluorescent staining, automated selection of CD45− cells, FISH scanning on CD45− cells, then analysis of CD45− cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82%, 91%, and 95% of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size, and thickness of FISH signals. The semi-automated microscopy method reported here

  16. Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.

    Science.gov (United States)

    Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P

    2015-11-01

    We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental to maintain high system performance and prevent cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.

  17. Channel Coordination in Logistics Service Supply Chain considering Fairness

    Directory of Open Access Journals (Sweden)

    Ningning Wang

    2016-01-01

    Full Text Available Logistics service supply chain (LSSC) is a new type of service supply chain. This paper investigates the channel coordination issue in a two-echelon LSSC composed of one logistics service integrator (LSI) and one functional logistics service provider (FLSP) under fairness concerns. The models for a reservation price-only contract under disadvantageous inequality and advantageous inequality are established, respectively, in which the procurement cost, the potential shortage cost, and the operation cost are considered under stochastic market demand. Based on this model, the LSI's optimal reservation quantity can be determined. Furthermore, we analyze the impact of fairness concerns and the related costs on channel performance and channel coordination. The results are presented in four aspects: (1) channel coordination of the LSSC can be achieved under certain conditions when the LSI experiences advantageous inequality; (2) the spiteful behavior of the LSI leads to the reduction of the channel profit, and channel coordination cannot be achieved when the LSI suffers from disadvantageous inequality; (3) the LSI's reservation quantity and the channel profit are affected by the LSI's fairness concerns; (4) motivated by concerns of fairness, the LSI's reservation quantity is related not only to his procurement cost and shortage cost but also to the FLSP's operation cost.
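
    Without the fairness terms, a reservation price-only contract reduces to a newsvendor-style trade-off between reservation cost and shortage cost, which can be sketched numerically (the cost parameters and demand distribution below are illustrative and the paper's fairness model is omitted):

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative parameters (not from the paper): unit reservation cost,
# unit shortage cost, and stochastic demand for logistics capacity.
c_reserve, c_short = 2.0, 5.0
demand = rng.normal(100.0, 20.0, size=20000).clip(min=0.0)

def expected_cost(q):
    """Expected reservation cost plus expected shortage penalty."""
    shortage = np.maximum(demand - q, 0.0)
    return c_reserve * q + c_short * shortage.mean()

# Grid search for the LSI's cost-minimizing reservation quantity.
qs = np.arange(50, 200)
q_star = qs[np.argmin([expected_cost(q) for q in qs])]
```

Fairness concerns shift this optimum: a spiteful or inequality-averse LSI distorts q away from the channel-optimal quantity, which is the coordination failure the paper analyzes.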

  18. Lexical evolution rates derived from automated stability measures

    Science.gov (United States)

    Petroni, Filippo; Serva, Maurizio

    2010-03-01

    Phylogenetic trees can be reconstructed from the matrix which contains the distances between all pairs of languages in a family. Recently, we proposed a new method which uses normalized Levenshtein distances among words with the same meaning and averages over all the items of a given list. Decisions about the number of items in the input lists for language comparison have been debated since the beginning of glottochronology. The point is that words associated with some of the meanings have a rapid lexical evolution. Therefore, a large vocabulary comparison is only apparently more accurate than a smaller one, since many of the words do not carry any useful information. In principle, one should find the optimal length of the input lists, studying the stability of the different items. In this paper we tackle the problem with an automated methodology based only on our normalized Levenshtein distance. With this approach, the program of an automated reconstruction of language relationships is completed.
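
    The distance measure described above might be sketched as follows (the three-word lists are toy examples, not a real comparison list): the Levenshtein edit distance is normalized by the longer word's length and averaged over aligned meanings.

```python
# Plain dynamic-programming Levenshtein distance (two-row variant).
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[-1] + 1,                 # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def normalized(a, b):
    """Edit distance normalized by the longer word's length."""
    return levenshtein(a, b) / max(len(a), len(b))

def language_distance(list_a, list_b):
    """Average normalized distance over aligned same-meaning word lists."""
    return sum(normalized(a, b) for a, b in zip(list_a, list_b)) / len(list_a)

# Tiny toy lists (English vs. German), not a real Swadesh-style list.
english = ["hand", "water", "fish"]
german = ["hand", "wasser", "fisch"]
d = language_distance(english, german)
```

A matrix of such pairwise language distances is what feeds the tree reconstruction, and the stability analysis asks which meanings contribute signal rather than noise to this average.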

  19. Reorganization of finger coordination patterns through motor exploration in individuals after stroke.

    Science.gov (United States)

    Ranganathan, Rajiv

    2017-09-11

    Impairment of hand and finger function after stroke is common and affects the ability to perform activities of daily living. Even though many of these coordination deficits such as finger individuation have been well characterized, it is critical to understand how stroke survivors learn to explore and reorganize their finger coordination patterns for optimizing rehabilitation. In this study, I examine the use of a body-machine interface to assess how participants explore their movement repertoire, and how this changes with continued practice. Ten participants with chronic stroke wore a data glove and the finger joint angles were mapped on to the position of a cursor on a screen. The task of the participants was to move the cursor back and forth between two specified targets on a screen. Critically, the map between the finger movements and cursor motion was altered so that participants sometimes had to generate coordination patterns that required finger individuation. There were two phases to the experiment - an initial assessment phase on day 1, followed by a learning phase (days 2-5) where participants trained to reorganize their coordination patterns. Participants showed difficulty in performing tasks which had maps that required finger individuation, and the degree to which they explored their movement repertoire was directly related to clinical tests of hand function. However, over four sessions of practice, participants were able to learn to reorganize their finger movement coordination pattern and improve their performance. Moreover, training also resulted in improvements in movement repertoire outside of the context of the specific task during free exploration. Stroke survivors show deficits in movement repertoire in their paretic hand, but facilitating movement exploration during training can increase the movement repertoire. This suggests that exploration may be an important element of rehabilitation to regain optimal function.
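
    The kind of map manipulation described above can be sketched with a linear body-machine interface (the matrices are invented for illustration, not the study's glove calibration): under an "individuation" map, a whole-hand grasp synergy produces no cursor motion, forcing independent finger movements.

```python
import numpy as np

# "Easy" map: cursor moves when fingers flex together (exploits the natural
# grasp synergy). "Hard" map: cursor moves only through differences between
# fingers, i.e. it demands individuation. Both matrices are invented.
W_easy = np.array([[0.25, 0.25, 0.25, 0.25],
                   [0.25, 0.25, -0.25, -0.25]])
W_hard = np.array([[1.0, -1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, -1.0]])

grasp = np.array([1.0, 1.0, 1.0, 1.0])         # whole-hand flexion synergy
individuated = np.array([1.0, 0.0, 0.0, 0.0])  # index finger alone

cursor_hard_grasp = W_hard @ grasp          # grasp is in the map's null space
cursor_hard_indiv = W_hard @ individuated   # individuation does move the cursor
```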

  20. Performance of an automated electronic acute lung injury screening system in intensive care unit patients.

    Science.gov (United States)

    Koenig, Helen C; Finkel, Barbara B; Khalsa, Satjeet S; Lanken, Paul N; Prasad, Meeta; Urbani, Richard; Fuchs, Barry D

    2011-01-01

    Lung-protective ventilation reduces mortality in patients with acute lung injury, but underrecognition of acute lung injury has limited its use. We recently validated an automated electronic acute lung injury surveillance system in patients with major trauma in a single intensive care unit. In this study, we assessed the system's performance as a prospective acute lung injury screening tool in a diverse population of intensive care unit patients. Patients were screened prospectively for acute lung injury over 21 wks both by the automated system and by an experienced research coordinator who manually screened subjects for enrollment in Acute Respiratory Distress Syndrome Clinical Trials Network (ARDSNet) trials. Performance of the automated system was assessed by comparing its results with the manual screening process. Discordant results were adjudicated blindly by two physician reviewers. In addition, a sensitivity analysis using a range of assumptions was conducted to better estimate the system's performance. Setting: the Hospital of the University of Pennsylvania, an academic medical center and ARDSNet center (1994-2006). Patients: intubated patients in medical and surgical intensive care units. Interventions: none. Of 1270 patients screened, 84 were identified with acute lung injury (incidence of 6.6%). The automated screening system had a sensitivity of 97.6% (95% confidence interval, 96.8-98.4%) and a specificity of 97.6% (95% confidence interval, 96.8-98.4%). The manual screening algorithm had a sensitivity of 57.1% (95% confidence interval, 54.5-59.8%) and a specificity of 99.7% (95% confidence interval, 99.4-100%). Sensitivity analysis demonstrated a range for the automated system's sensitivity of 75.0-97.6% under varying assumptions. Under all assumptions, the automated system demonstrated higher sensitivity than and comparable specificity to the manual screening method. An automated electronic system identified patients with acute lung injury with high sensitivity and specificity in diverse
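
    The reported performance figures follow from standard 2x2 screening arithmetic, sketched here with a normal-approximation confidence interval (the raw counts are hypothetical, chosen only to land near the reported 97.6%; the abstract does not give the underlying table):

```python
import math

def rate_with_ci(hits, total, z=1.96):
    """Proportion with a normal-approximation 95% confidence interval."""
    p = hits / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p, (p - half, p + half)

# Hypothetical counts: 82 of 84 ALI patients flagged (sensitivity) and
# 1157 of the 1186 non-ALI patients correctly not flagged (specificity).
sensitivity, sens_ci = rate_with_ci(82, 84)
specificity, spec_ci = rate_with_ci(1157, 1186)
```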