WorldWideScience

Sample records for automated optimal coordination

  1. Optimization of strong and weak coordinates

    NARCIS (Netherlands)

    Swart, M.; Bickelhaupt, F.M.

    2006-01-01

    We present a new scheme for the geometry optimization of equilibrium and transition state structures that can be used for both strong and weak coordinates. We use a screening function that depends on atom-pair distances to differentiate strong coordinates from weak coordinates. This differentiation

  2. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors. However, the excessive introduction of automation has degraded operator performance due to the side effects of automation, which are referred to as Out-of-the-Loop (OOTL) problems, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method of optimizing automation is proposed in this paper. To develop the optimization method, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of Situation Awareness Recovery (SAR); the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  3. Cooperative Optimal Coordination for Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Tao; Wu, Di; Ren, Wei; Wang, Hong; Hong, Yiguang; Johansson, Karl

    2017-12-12

    In this paper, we consider the optimal coordination problem for distributed energy resources (DERs) including distributed generators and energy storage devices. We propose an algorithm based on the push-sum and gradient method to optimally coordinate storage devices and distributed generators in a distributed manner. In the proposed algorithm, each DER only maintains a set of variables and updates them through information exchange with a few neighbors over a time-varying directed communication network. We show that the proposed distributed algorithm solves the optimal DER coordination problem if the time-varying directed communication network is uniformly jointly strongly connected, which is a mild condition on the connectivity of communication topologies. The proposed distributed algorithm is illustrated and validated by numerical simulations.
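
    The push-sum averaging underlying the proposed algorithm can be sketched in a few lines. The sketch below uses a hypothetical 4-node directed ring, with values standing in for per-DER variables; it shows only the consensus-averaging part, not the paper's interleaved gradient step:

```python
# Push-sum average consensus on a directed graph: each node's ratio x/w
# converges to the network-wide average of the initial values, using only
# out-neighbor communication (the column-stochastic mixing the paper needs
# for time-varying directed networks). 4-node ring and values are invented.

def push_sum(values, out_neighbors, iters=200):
    n = len(values)
    x = list(values)          # running numerators
    w = [1.0] * n             # running weights
    for _ in range(iters):
        nx, nw = [0.0] * n, [0.0] * n
        for i in range(n):
            targets = out_neighbors[i] + [i]   # send to out-neighbors and self
            share_x = x[i] / len(targets)
            share_w = w[i] / len(targets)
            for j in targets:
                nx[j] += share_x
                nw[j] += share_w
        x, w = nx, nw
    return [xi / wi for xi, wi in zip(x, w)]   # per-node average estimates

ring = {0: [1], 1: [2], 2: [3], 3: [0]}        # strongly connected directed ring
est = push_sum([4.0, 8.0, 1.0, 3.0], ring)
# every node's estimate approaches the true average 4.0
```

    In the full algorithm each node would also take a gradient step on its local cost between averaging rounds; the convergence guarantee quoted in the abstract (uniform joint strong connectivity) applies to that combined update.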

  4. Optimal Coordination of Automatic Line Switches for Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jyh-Cherng Gu

    2012-04-01

    For the Taiwan Power Company (Taipower), the margins of coordination times between the lateral circuit breakers (LCB) of underground 4-way automatic line switches and the protection equipment of high voltage customers are often too small. This can lead to sympathetic tripping by the feeder circuit breaker (FCB) of the distribution feeder and create difficulties in protection coordination between upstream and downstream protection equipment, in identification of faults, and in restoration operations. In order to solve the problem, it is necessary to reexamine the protection coordination between LCBs and high voltage customers’ protection equipment, and between LCBs and FCBs, in order to bring forth new proposals for settings and operations. This paper applies linear programming to optimize the coordination of protection devices, and proposes new time-current curves (TCCs) for the overcurrent (CO) and low-energy overcurrent (LCO) relays used in normally open distribution systems by performing simulations in the Electrical Transient Analyzer Program (ETAP) environment. The simulation results show that the new TCCs solve the coordination problems among high voltage customer, lateral, feeder, bus-interconnection, and distribution transformer protection equipment. The new proposals also satisfy Taipower’s requirements on protection coordination of the distribution feeder automation system (DFAS). Finally, the authors believe that the system configuration, operation experience, and relevant criteria mentioned in this paper may serve as valuable references for other companies or utilities when building a DFAS of their own.
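
    The optimization the paper performs with linear programming can be illustrated by a toy grid search over relay time-dial settings: minimize total operating time while keeping a coordination time interval (CTI) between a downstream lateral relay and its upstream feeder backup. The IEC standard-inverse curve is a standard TCC form; the pickup currents, fault current, and 0.3 s CTI below are invented for illustration:

```python
# Toy protection-coordination problem: choose time-dial settings (TDS) for a
# lateral and a feeder relay so both clear a fault quickly while the feeder
# waits at least CTI seconds longer than the lateral (selectivity).

def op_time(tds, fault_i, pickup_i):
    # IEC standard-inverse time-current characteristic
    return tds * 0.14 / ((fault_i / pickup_i) ** 0.02 - 1.0)

FAULT_I = 4000.0                                 # assumed fault current, A
PICKUP = {"lateral": 400.0, "feeder": 800.0}     # assumed pickup currents, A
CTI = 0.3                                        # required margin, s

best = None
steps = [round(0.05 * k, 2) for k in range(1, 41)]   # TDS grid 0.05 .. 2.0
for tds_lat in steps:
    for tds_fdr in steps:
        t_lat = op_time(tds_lat, FAULT_I, PICKUP["lateral"])
        t_fdr = op_time(tds_fdr, FAULT_I, PICKUP["feeder"])
        if t_fdr >= t_lat + CTI:                 # feeder backs up the lateral
            total = t_lat + t_fdr
            if best is None or total < best[0]:
                best = (total, tds_lat, tds_fdr)

total, tds_lat, tds_fdr = best
# smallest grid settings respecting the CTI: the lateral trips first and the
# feeder waits at least 0.3 s longer
```

    A real study would solve this as an LP over many fault locations and device pairs at once, which is exactly where linear programming pays off over enumeration.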

  5. Automated quantitative analysis of coordinated locomotor behaviour in rats.

    Science.gov (United States)

    Tanger, H J; Vanwersch, R A; Wolthuis, O L

    1984-03-01

    Disturbances of motor coordination are usually difficult to quantify. Therefore, a method was developed for the automated quantitative analysis of the movements of the dyed paws of stepping rats, registered by a colour TV camera. The signals from the TV-video system were converted by an electronic interface into voltages proportional to the X- and Y-coordinates of the paws, from which a desktop computer calculated the movements of these paws in time and distance. Application 1 analysed the steps of a rat walking in a hollow rotating wheel. The results showed low variability of the walking pattern; the method was insensitive to low doses of alcohol but was suitable for quantifying overt (e.g. neurotoxic) locomotor disturbances or recovery therefrom. In application 2, hurdles were placed in a similar hollow wheel and the rats were trained to step from the top of one hurdle to another. Physostigmine-induced disturbances of this acquired complex motor task could be detected at doses far below those that cause overt symptoms.
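
    The core computation, recovering distance and speed in time from per-frame paw coordinates, is simple enough to sketch. The positions and frame rate below are invented stand-ins for the digitized TV signal:

```python
# Minimal version of the distance computation behind the locomotion analysis:
# given per-frame (x, y) positions of one dyed paw (hypothetical data, cm),
# recover total path length and mean speed.

from math import hypot

def path_stats(track, frame_dt):
    """track: list of (x, y) paw positions sampled every frame_dt seconds."""
    dists = [hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(track, track[1:])]
    total = sum(dists)
    duration = frame_dt * (len(track) - 1)
    return total, total / duration     # path length, mean speed

track = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]      # two 5 cm displacements
length, speed = path_stats(track, frame_dt=0.04)  # 25 fps video
# length == 10.0 cm over 0.08 s, i.e. a mean speed of 125 cm/s
```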

  6. Automated Cloud Observation for Ground Telescope Optimization

    Science.gov (United States)

    Lane, B.; Jeffries, M. W., Jr.; Therien, W.; Nguyen, H.

    As the number of man-made objects placed in space each year increases with advancements in the commercial, academic, and industrial sectors, the number of objects that must be detected, tracked, and characterized continues to grow at an exponential rate. Commercial companies, such as ExoAnalytic Solutions, have deployed ground-based sensors to maintain track custody of these objects. The ExoAnalytic Global Telescope Network (EGTN) collects over 10 million unique observations of such objects per month (as of September 2017). Currently, the EGTN does not optimally collect data on nights with significant cloud levels. However, a majority of these nights prove to be partially cloudy, leaving clear portions of the sky for EGTN sensors to observe. It is therefore useful for a telescope to exploit these clear areas to continue resident space object (RSO) observation. By dynamically updating the tasking as cloud positions vary, the number of observations could increase dramatically due to increased persistence, cadence, and revisit. This paper discusses the algorithms recently implemented within the EGTN, including their motivation, need, and general design. How automated image processing and various edge detection methods, including Canny, Sobel, and Marching Squares, applied to real-time large-FOV images of the sky can enhance the tasking and scheduling of a ground-based telescope is discussed in Section 2. Implementations of these algorithms on single, and eventually multiple, telescopes are explored. Results of applying these algorithms to the EGTN in real time, and a comparison to non-optimized EGTN tasking, are presented in Section 3. Finally, in Section 4 we explore future work in applying these methods throughout the EGTN as well as to other optical telescopes.
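
    Of the edge detectors named above, Sobel is the simplest to sketch: a gradient-magnitude pass whose response peaks on cloud boundaries, so thresholding it separates clear sky from cloud edges. The tiny frame below is an invented stand-in for a large-FOV sky image (a real pipeline would use an image library, not nested loops):

```python
# Sobel gradient magnitude on a small grayscale "sky" frame: dark sky (0) on
# the left, bright cloud (1) on the right. The response is nonzero only at
# the columns flanking the brightness step, i.e. the cloud boundary.

KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel

def sobel_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(KX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(KY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

frame = [[0, 0, 0, 1, 1, 1]] * 4     # vertical cloud edge mid-frame
edges = sobel_magnitude(frame)
# interior row: zero in flat regions, peaks at the 0 -> 1 boundary columns
```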

  7. Distributed optimal coordination for distributed energy resources in power systems

    DEFF Research Database (Denmark)

    Wu, Di; Yang, Tao; Stoorvogel, A.

    2017-01-01

    Driven by smart grid technologies, distributed energy resources (DERs) have been rapidly developing in recent years for improving reliability and efficiency of distribution systems. Emerging DERs require effective and efficient coordination in order to reap their potential benefits. In this paper, we consider an optimal DER coordination problem over multiple time periods subject to constraints at both system and device levels. Fully distributed algorithms are proposed to dynamically and automatically coordinate distributed generators with multiple/single storages. With the proposed algorithms ...

  8. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  9. Optimizing a Drone Network to Deliver Automated External Defibrillators.

    Science.gov (United States)

    Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y

    2017-06-20

    Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest, but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an out-of-hospital cardiac arrest for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. We applied our model to 53 702 out-of-hospital cardiac arrests that occurred in the 8 regions of the Toronto Regional RescuNET between January 1, 2006, and December 31, 2014. Our primary analysis quantified the drone network size required to deliver an AED 1, 2, or 3 minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as a large coordinated region. The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by 3 minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an out-of-hospital cardiac arrest event. © 2017 American Heart Association, Inc.
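
    The paper's model combines optimization and queuing; as a rough illustration of the coverage side only, the base-placement question can be approximated with a greedy maximum-coverage heuristic. All coordinates, the drone speed, and the response budget below are invented:

```python
# Greedy base placement: pick candidate sites so the largest share of
# historical cardiac-arrest locations is reachable within a response budget.
# This is a heuristic sketch, not the paper's integrated optimization/queuing
# model (which also sizes the drone fleet per base).

from math import hypot

def greedy_bases(candidates, incidents, speed, budget_min, n_bases):
    """Pick n_bases sites maximizing incidents reachable within budget_min."""
    radius = speed * budget_min          # max flight distance
    covers = {c: {i for i, p in enumerate(incidents)
                  if hypot(p[0] - c[0], p[1] - c[1]) <= radius}
              for c in candidates}
    chosen, covered = [], set()
    for _ in range(n_bases):
        best = max(covers, key=lambda c: len(covers[c] - covered))
        chosen.append(best)
        covered |= covers[best]
    return chosen, len(covered) / len(incidents)

incidents = [(0, 0), (1, 1), (10, 10), (10, 11), (50, 50)]   # km
sites = [(0, 0), (10, 10), (50, 50)]
bases, frac = greedy_bases(sites, incidents, speed=1.0, budget_min=3.0,
                           n_bases=2)
# two bases cover the two incident clusters; the remote arrest stays uncovered
```

    The abstract's secondary finding, that one coordinated network needs far fewer bases than region-by-region planning, corresponds to running such a placement over the pooled incident set rather than per region.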

  10. Automated Robust Maneuver Design and Optimization

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is seeking improvements to the current technologies related to Position, Navigation and Timing. In particular, it is desired to automate precise maneuver...

  11. Kinematically optimal robot placement for minimum time coordinated motion

    Energy Technology Data Exchange (ETDEWEB)

    Feddema, J.T.

    1995-10-01

    This paper describes an algorithm for determining the optimal placement of a robotic manipulator within a workcell for minimum time coordinated motion. The algorithm uses a simple principle of coordinated motion to estimate the time of a joint interpolated motion. Specifically, the coordinated motion profile is limited by the slowest axis. Two and six degree of freedom (DOF) examples are presented. In experimental tests on a FANUC S-800 arm, the optimal placement of the robot can improve cycle time of a robotic operation by as much as 25%. In high volume processes where the robot motion is currently the limiting factor, this increased throughput can result in substantial cost savings.
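
    The abstract's key estimate, that a joint-interpolated move takes as long as its slowest axis, is easy to reproduce for a planar 2-DOF arm. The link lengths, joint speed limits, task waypoints, and candidate base positions below are all invented:

```python
# Slowest-axis estimate of coordinated joint-interpolated motion time, used
# to rank candidate robot base positions for a fixed waypoint task. A sketch
# of the principle only; the paper's search and the FANUC S-800 test are
# far more involved.

from math import atan2, acos

def ik_2dof(x, y, l1=1.0, l2=1.0):
    """Elbow-down inverse kinematics for a planar 2-link arm."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))                 # clamp for safety
    t2 = acos(c2)
    t1 = atan2(y, x) - atan2(l2 * (1 - c2 * c2) ** 0.5, l1 + l2 * c2)
    return t1, t2

def move_time(q_from, q_to, vmax=(1.0, 2.0)):
    # coordinated motion profile is limited by the slowest axis
    return max(abs(b - a) / v for a, b, v in zip(q_from, q_to, vmax))

def cycle_time(base, waypoints):
    qs = [ik_2dof(x - base[0], y - base[1]) for x, y in waypoints]
    return sum(move_time(a, b) for a, b in zip(qs, qs[1:]))

task = [(1.2, 0.4), (1.5, 0.1), (1.2, -0.4)]      # workcell waypoints
candidates = [(0.0, 0.0), (0.2, 0.0), (0.4, 0.0)] # candidate base positions
best = min(candidates, key=lambda b: cycle_time(b, task))
```

    Ranking placements by this estimate is what lets the algorithm trade base position against cycle time without simulating full trajectories.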

  12. Optimal caliper placement: manual vs automated methods.

    Science.gov (United States)

    Yazdi, B; Zanker, P; Wanger, P; Sonek, J; Pintoffl, K; Hoopmann, M; Kagan, K O

    2014-02-01

    To examine the inter- and intra-operator repeatability of manual placement of calipers in the assessment of basic biometric measurements and to compare the results to an automated caliper placement system. Stored ultrasound images of 95 normal fetuses between 19 and 25 weeks' gestation were used. Five operators (two experts, one resident and two students) were asked to measure the BPD, OFD and FL twice, both manually and automatically. For each operator, intra-operator repeatability of the manual and automated measurements was assessed by the within-operator standard deviation. For the assessment of inter-operator repeatability, the mean of the four manual measurements by the two experts was used as the gold standard. The relative bias of the manual measurements of the three non-expert operators and of the operator-independent automated measurement was compared with the gold-standard measurement using means and 95% confidence intervals. In 88.4% of the 95 cases, the automated measurement algorithm was able to obtain appropriate measurements of the BPD, OFD, AC and FL. Within-operator standard deviations of the manual measurements ranged between 0.15 and 1.56, irrespective of the experience of the operator. Using the automated biometric measurement system, there was no difference between the measurements of each operator. As far as inter-operator repeatability is concerned, the difference between the manual measurements of the two students, the resident, and the gold standard was between -0.10 and 2.53 mm. The automated measurements tended to be closer to the gold standard, but the difference did not reach statistical significance. In about 90% of the cases, it was possible to obtain basic biometric measurements with an automated system. The use of automated measurements resulted in a significant improvement in intra-operator but not inter-operator repeatability, and measurements were not significantly closer to the gold standard of expert examiners.

  13. Automated design optimization and trade studies using STK scenarios

    Science.gov (United States)

    Rivera, Mark A.; Hill, Jennifer

    2005-05-01

    Optimizing a constellation of space, air, and ground assets is typically a man-in-the-loop intensive and iterative process. Designers must originate a few baseline concepts and then intuitively explore design variations within a large multi-dimensional trade space. To keep the search manageable, options and variations are often severely limited. There is a clear advantage to automate, in an intelligent and efficient manner, this search for optimal design solutions. Such automation would greatly open the analyzable options, increasing insight, improving solutions, and saving money. For this reason, Boeing has initiated a software application that automates STK® scenarios and intelligently searches for optimal solutions. Now in the post-prototype stages of development, "AVA" has provided extremely valuable solutions and insight into several space-based architectures and their proposed payloads. This paper will discuss the current state of the AVA tool, methods, applicability, and the potential for future growth.

  14. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

    This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterpriser networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state-of-the-art of managing firewalls systematically in both research and application domains. Chapters explore set-theory, managing firewall configuration globally and consistently, access control list with encryption, and authentication such as IPSec policies. The author

  15. Predictive Analytics for Coordinated Optimization in Distribution Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-13

    This talk will present NREL's work on developing predictive analytics that enables the optimal coordination of all the available resources in distribution systems to achieve the control objectives of system operators. Two projects will be presented. One focuses on developing short-term state forecasting-based optimal voltage regulation in distribution systems; and the other one focuses on actively engaging electricity consumers to benefit distribution system operations.

  16. Automated beam steering using optimal control

    Energy Technology Data Exchange (ETDEWEB)

    Allen, C. K. (Christopher K.)

    2004-01-01

    We present a steering algorithm which, with the aid of a model, allows the user to specify beam behavior throughout a beamline, rather than just at specified beam position monitor (BPM) locations. The model is used primarily to compute the values of the beam phase vectors from BPM measurements, and to define cost functions that describe the steering objectives. The steering problem is formulated as constrained optimization problem; however, by applying optimal control theory we can reduce it to an unconstrained optimization whose dimension is the number of control signals.
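
    The reduction described above, steering as an unconstrained optimization whose dimension equals the number of control signals, can be shown in miniature with a linear response model: BPM readings are b0 + R x, and the least-squares corrector strengths solve the normal equations (RᵀR) x = -Rᵀ b0. The response matrix and orbit offsets below are invented numbers, not a real beamline model:

```python
# Least-squares orbit steering for 2 correctors and any number of BPMs.
# Minimizing || b0 + R x ||^2 over the corrector vector x is unconstrained
# and 2-dimensional, regardless of how many BPM readings constrain it.

def steer(R, b0):
    """Return corrector strengths minimizing the residual orbit."""
    # normal equations (R^T R) x = -R^T b0, solved directly for 2 correctors
    a11 = sum(r[0] * r[0] for r in R)
    a12 = sum(r[0] * r[1] for r in R)
    a22 = sum(r[1] * r[1] for r in R)
    g1 = -sum(r[0] * b for r, b in zip(R, b0))
    g2 = -sum(r[1] * b for r, b in zip(R, b0))
    det = a11 * a22 - a12 * a12
    return ((a22 * g1 - a12 * g2) / det, (a11 * g2 - a12 * g1) / det)

R = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]   # BPM response to each corrector
b0 = [2.0, 3.0, 1.0]                       # uncorrected orbit at the BPMs
x = steer(R, b0)
residual = [b + r[0] * x[0] + r[1] * x[1] for r, b in zip(R, b0)]
# this particular system is exactly solvable: the residual orbit vanishes
```

    The model-based cost functions in the paper generalize this: instead of only BPM readings, the residual is evaluated on model-propagated phase vectors anywhere along the beamline.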

  17. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    Science.gov (United States)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Despite having mature designs, diagnostics are usually custom designed for each experiment. Much of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline a process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to produce a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, a baffle, and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  18. Controller Design Automation for Aeroservoelastic Design Optimization of Wind Turbines

    NARCIS (Netherlands)

    Ashuri, T.; Van Bussel, G.J.W.; Zaayer, M.B.; Van Kuik, G.A.M.

    2010-01-01

    The purpose of this paper is to integrate the controller design of wind turbines with structure and aerodynamic analysis and use the final product in the design optimization process (DOP) of wind turbines. To do that, the controller design is automated and integrated with an aeroelastic simulation

  19. Coordinated Optimal Operation Method of the Regional Energy Internet

    Directory of Open Access Journals (Sweden)

    Rishang Long

    2017-05-01

    The development of the energy internet has become one of the key ways to solve the energy crisis. This paper studies the system architecture, energy flow characteristics and coordinated optimization method of the regional energy internet. Considering the heat-to-electric ratio of a combined cooling, heating and power unit, energy storage life and real-time electricity price, a double-layer optimal scheduling model is proposed, which includes economic and environmental benefit in the upper layer and energy efficiency in the lower layer. A hybrid particle swarm optimizer and individual-variation ant colony optimization algorithm is used to balance computational efficiency and accuracy. Simulations of a test system demonstrate the energy savings, environmental benefits and economically optimal dispatching scheme.

  20. Vibrational self-consistent field theory using optimized curvilinear coordinates

    Science.gov (United States)

    Bulik, Ireneusz W.; Frisch, Michael J.; Vaccaro, Patrick H.

    2017-07-01

    A vibrational SCF model is presented in which the functions forming the single-mode functions in the product wavefunction are expressed in terms of internal coordinates and the coordinates used for each mode are optimized variationally. This model involves no approximations to the kinetic energy operator and does not require a Taylor-series expansion of the potential. The non-linear optimization of coordinates is found to give much better product wavefunctions than the limited variations considered in most previous applications of SCF methods to vibrational problems. The approach is tested using published potential energy surfaces for water, ammonia, and formaldehyde. Variational flexibility allowed in the current ansätze results in excellent zero-point energies expressed through single-product states and accurate fundamental transition frequencies realized by short configuration-interaction expansions. Fully variational optimization of single-product states for excited vibrational levels also is discussed. The highlighted methodology constitutes an excellent starting point for more sophisticated treatments, as the bulk characteristics of many-mode coupling are accounted for efficiently in terms of compact wavefunctions (as evident from the accurate prediction of transition frequencies).

  1. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.
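
    The central object in both the forward and reverse modes is the Pareto set: designs not dominated in any metric of interest. A minimal non-dominated filter (here minimizing two invented metrics, e.g. protein cost and response time) looks like this:

```python
# Non-dominated (Pareto) filter for minimization in all dimensions: a design
# is kept unless some other design is at least as good in every metric and
# different in at least one. The framework computes such sets via
# mixed-integer dynamic optimization; this sketch only filters given points.

def pareto_front(points):
    """Return the non-dominated subset of points (minimization)."""
    front = []
    for p in points:
        dominated = any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]  # (cost, time)
best = pareto_front(designs)
# (3.0, 4.0) is dominated by (2.0, 3.0); the other three are trade-offs
```

    Forward design reads the front left to right as the menu of best trade-offs; reverse design asks which circuit motifs recur among the front's members.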

  2. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
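
    The parallelism optimization mentioned above amounts to levelling the actor dependency graph: actors in the same level share no data dependencies and can run concurrently. A sketch with hypothetical actor names (not the Kepler package's real actors):

```python
# Kahn-style DAG levelling: group workflow actors into stages such that all
# actors in a stage are mutually independent and runnable in parallel.
# Edges point from producing actor to consuming actor.

def parallel_stages(edges, actors):
    indeg = {a: 0 for a in actors}
    for src, dst in edges:
        indeg[dst] += 1
    stages = []
    ready = sorted(a for a in actors if indeg[a] == 0)
    while ready:
        stages.append(ready)
        nxt = set()
        for a in ready:                      # "run" a stage, release successors
            for src, dst in edges:
                if src == a:
                    indeg[dst] -= 1
                    if indeg[dst] == 0:
                        nxt.add(dst)
        ready = sorted(nxt)
    return stages

actors = ["load", "clean_names", "clean_dates", "validate", "publish"]
edges = [("load", "clean_names"), ("load", "clean_dates"),
         ("clean_names", "validate"), ("clean_dates", "validate"),
         ("validate", "publish")]
stages = parallel_stages(edges, actors)
# the two independent cleaning actors land in the same stage
```

    The same graph is what a static analyzer inspects before execution, e.g. for cycles or actors whose output type does not match a downstream input.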

  3. OPTIMIZATION OF DUTY METEOROLOGIST WORK USING FORECASTER AUTOMATED WORKSTATION

    OpenAIRE

    Pylypovych, Hryhorii H.; Shevchenko, Viktor L.; Drovnin, Andrii S.; Musin, Rafil R.; Oliinyk, Oleksandr L.

    2014-01-01

    The priority directions of science and technology development in Ukraine for the period until 2020 include information and communication technologies; accordingly, optimization of the duty meteorologist's work should include a number of specific measures to minimize the human factor in the chain "observation – processing – forecasting – transfer – delivery of actual and prognostic meteorological information to consumers". Work in technical areas of activity is increasingly becoming automated, so meteorologists should be...

  4. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a simulation-based method using Autodesk TruPlan and TruFiber is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high success rate of manufacturing.

  5. Coordinated Optimization of Visual Cortical Maps (II) Numerical Studies

    Science.gov (United States)

    Reichl, Lars; Heide, Dominik; Löwel, Siegrid; Crowley, Justin C.; Kaschube, Matthias; Wolf, Fred

    2012-01-01

    In the juvenile brain, the synaptic architecture of the visual cortex remains in a state of flux for months after the natural onset of vision and the initial emergence of feature selectivity in visual cortical neurons. It is an attractive hypothesis that visual cortical architecture is shaped during this extended period of juvenile plasticity by the coordinated optimization of multiple visual cortical maps such as orientation preference (OP), ocular dominance (OD), spatial frequency, or direction preference. In part (I) of this study we introduced a class of analytically tractable coordinated optimization models and solved representative examples, in which a spatially complex organization of the OP map is induced by interactions between the maps. We found that these solutions near symmetry breaking threshold predict a highly ordered map layout. Here we examine the time course of the convergence towards attractor states and optima of these models. In particular, we determine the timescales on which map optimization takes place and how these timescales can be compared to those of visual cortical development and plasticity. We also assess whether our models exhibit biologically more realistic, spatially irregular solutions at a finite distance from threshold, when the spatial periodicities of the two maps are detuned and when considering more than 2 feature dimensions. We show that, although maps typically undergo substantial rearrangement, no other solutions than pinwheel crystals and stripes dominate in the emerging layouts. Pinwheel crystallization takes place on a rather short timescale and can also occur for detuned wavelengths of different maps. Our numerical results thus support the view that neither minimal energy states nor intermediate transient states of our coordinated optimization models successfully explain the architecture of the visual cortex. 
We discuss several alternative scenarios that may improve the agreement between model solutions and biological

  6. A mixed optimization method for automated design of fuselage structures.

    Science.gov (United States)

    Sobieszczanski, J.; Loendorf, D.

    1972-01-01

    A procedure for automating the design of transport aircraft fuselage structures has been developed and implemented in the form of an operational program. The structure is designed in two stages. First, an overall distribution of structural material is obtained by means of optimality criteria to meet strength and displacement constraints. Subsequently, the detailed design of selected rings and panels consisting of skin and stringers is performed by mathematical optimization accounting for a set of realistic design constraints. The practicality and computer efficiency of the procedure is demonstrated on cylindrical and area-ruled large transport fuselages.

  7. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  8. Weighted Constraint Satisfaction for Smart Home Automation and Optimization

    Directory of Open Access Journals (Sweden)

    Noel Nuo Wi Tay

    2016-01-01

    Full Text Available Automation of the smart home binds together services of hardware and software to provide support for its human inhabitants. The rise of web technologies offers applicable concepts and technologies for service composition that can be exploited for automated planning of the smart home, which can be further enhanced by implementation based on service-oriented architecture (SOA). SOA supports loose coupling and late binding of devices, enabling a more declarative approach in defining services and simplifying home configurations. One such declarative approach is to represent and solve automated planning through the constraint satisfaction problem (CSP), which has the advantage of handling larger domains of home states. But CSP uses hard constraints and thus cannot perform optimization or handle contradictory goals and partial goal fulfillment, which are practical issues smart environments will face if humans are involved. This paper extends the approach to the Weighted Constraint Satisfaction Problem (WCSP). Branch-and-bound depth-first search is used, where the lower bound is estimated by a bacterial memetic algorithm (BMA) on a relaxed version of the original optimization problem. Experiments with up to 16-step planning of home services demonstrate the applicability and practicality of the approach, with the inclusion of local search for trivial service combinations in BMA producing performance enhancements. This work also aims to lay the groundwork for further research in the field.
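The weighted-constraint-satisfaction idea in this record (branch-and-bound depth-first search over soft constraints) might be sketched as follows; the smart-home variables, cost functions, and weights are invented for illustration, and the simple cost-so-far lower bound stands in for the paper's BMA-estimated bound.

```python
def wcsp_branch_and_bound(domains, cost_fns):
    """Minimal weighted-CSP solver: depth-first branch and bound.

    domains:  list of lists, domains[i] = candidate values for variable i
    cost_fns: list of (scope, fn) where scope is a tuple of variable
              indices and fn maps their values to a non-negative cost.
    A full assignment's cost is the sum of all constraint costs; partial
    assignments are pruned using the cost accumulated so far as a lower bound.
    """
    best = {"cost": float("inf"), "assign": None}

    def partial_cost(assign):
        total = 0
        for scope, fn in cost_fns:
            if all(v in assign for v in scope):
                total += fn(*(assign[v] for v in scope))
        return total

    def search(i, assign):
        cost = partial_cost(assign)
        if cost >= best["cost"]:          # prune: bound already exceeded
            return
        if i == len(domains):
            best["cost"], best["assign"] = cost, dict(assign)
            return
        for value in domains[i]:
            assign[i] = value
            search(i + 1, assign)
            del assign[i]

    search(0, {})
    return best["cost"], best["assign"]

# Hypothetical smart-home toy: choose heater level and window state to
# trade off energy, waste, and discomfort (weights are made up).
domains = [[0, 1, 2], ["open", "closed"]]     # heater level, window
cost_fns = [
    ((0,), lambda h: h * 2),                  # energy: 2 per heater level
    ((0, 1), lambda h, w: 5 if (h > 0 and w == "open") else 0),  # waste
    ((0,), lambda h: 4 if h == 0 else 0),     # discomfort when heating off
]
print(wcsp_branch_and_bound(domains, cost_fns))
```

Unlike hard-constraint CSP, contradictory goals here simply trade off through their weights, which is the practical advantage the record emphasizes.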

  9. Implementation and optimization of automated dispensing cabinet technology.

    Science.gov (United States)

    McCarthy, Bryan C; Ferker, Michael

    2016-10-01

    A multifaceted automated dispensing cabinet (ADC) optimization initiative at a large hospital is described. The ADC optimization project, which was launched approximately six weeks after activation of ADCs in 30 patient care unit medication rooms of a newly established adult hospital, included (1) adjustment of par inventory levels (desired on-hand quantities of medications) and par reorder quantities to reduce the risk of ADC supply exhaustion and improve restocking efficiency, (2) expansion of ADC "common stock" (medications assigned to ADC inventories) to increase medication availability at the point of care, and (3) removal of some infrequently prescribed medications from ADCs to reduce the likelihood of product expiration. The purpose of the project was to address organizational concerns regarding widespread ADC medication stockouts, growing reliance on cart-fill medication delivery systems, and suboptimal medication order turnaround times. Leveraging of the ADC technology platform's reporting functionalities for enhanced inventory control yielded a number of benefits, including cost savings resulting from reduced pharmacy technician labor requirements (estimated at $2,728 annually), a substantial reduction in the overall weekly stockout percentage (from 3.2% before optimization to 0.5% eight months after optimization), an improvement in the average medication turnaround time, and estimated cost avoidance of $19,660 attributed to the reduced potential for product expiration. Efforts to optimize ADCs through par level optimization, expansion of common stock, and removal of infrequently used medications reduced pharmacy technician labor, decreased stockout percentages, generated opportunities for cost avoidance, and improved medication turnaround times. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  10. A Framework for Matching User Needs to an Optimal Level of Office Automation

    Science.gov (United States)

    1988-06-01

    This thesis introduces the concepts of determining an organization’s optimal office automation strategy by investigating seven characteristics...technology, environment, and employee skill. These seven characteristics form the input into an office automation framework which mathematically...determines which of three office automation strategies is best for a particular organization. These three strategy levels are called low level operational

  11. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    Science.gov (United States)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  12. IPO: a tool for automated optimization of XCMS parameters.

    Science.gov (United States)

    Libiseller, Gunnar; Dvorzak, Michaela; Kleb, Ulrike; Gander, Edgar; Eisenberg, Tobias; Madeo, Frank; Neumann, Steffen; Trausinger, Gert; Sinner, Frank; Pieber, Thomas; Magnes, Christoph

    2015-04-16

    Untargeted metabolomics generates a huge amount of data. Software packages for automated data processing are crucial to successfully process these data. A variety of such software packages exist, but the outcome of data processing strongly depends on algorithm parameter settings. If they are not carefully chosen, suboptimal parameter settings can easily lead to biased results. Therefore, parameter settings also require optimization. Several parameter optimization approaches have already been proposed, but a software package for parameter optimization which is free of intricate experimental labeling steps, fast and widely applicable is still missing. We implemented the software package IPO ('Isotopologue Parameter Optimization') which is fast and free of labeling steps, and applicable to data from different kinds of samples and data from different methods of liquid chromatography - high resolution mass spectrometry and data from different instruments. IPO optimizes XCMS peak picking parameters by using natural, stable (13)C isotopic peaks to calculate a peak picking score. Retention time correction is optimized by minimizing relative retention time differences within peak groups. Grouping parameters are optimized by maximizing the number of peak groups that show one peak from each injection of a pooled sample. The different parameter settings are achieved by design of experiments, and the resulting scores are evaluated using response surface models. IPO was tested on three different data sets, each consisting of a training set and test set. IPO resulted in an increase of reliable groups (146% - 361%), a decrease of non-reliable groups (3% - 8%) and a decrease of the retention time deviation to one third. IPO was successfully applied to data derived from liquid chromatography coupled to high resolution mass spectrometry from three studies with different sample types and different chromatographic methods and devices. 
We were also able to show the potential of IPO to
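The design-of-experiments flavour of IPO can be illustrated with a one-parameter response-surface sketch; the score function and candidate settings are hypothetical, and the real tool fits multi-parameter response surface models to XCMS peak-picking scores.

```python
import numpy as np

def optimize_parameter(score_fn, candidates):
    """One-parameter illustration of design-of-experiments optimization:
    evaluate the (expensive) score at a few design points, fit a quadratic
    response-surface model, and return the model's maximizer inside the
    tested range. IPO does this jointly over several XCMS parameters.
    """
    xs = np.asarray(candidates, dtype=float)
    ys = np.array([score_fn(x) for x in xs])
    a, b, c = np.polyfit(xs, ys, 2)          # model y ~ a*x^2 + b*x + c
    if a < 0:                                 # concave: interior optimum
        x_opt = -b / (2 * a)
        x_opt = min(max(x_opt, xs.min()), xs.max())
    else:                                     # fall back to best sample
        x_opt = xs[np.argmax(ys)]
    return x_opt

# Hypothetical peak-picking score that peaks at a ppm tolerance of 12.
score = lambda ppm: -(ppm - 12.0) ** 2 + 100.0
print(optimize_parameter(score, [5, 10, 15, 20, 25]))
```

In practice the score would come from rerunning peak picking on real data, which is exactly why keeping the number of design points small matters.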

  13. Automated quantitative analysis to assess motor function in different rat models of impaired coordination and ataxia.

    Science.gov (United States)

    Kyriakou, Elisavet I; van der Kieft, Jan G; de Heer, Raymond C; Spink, Andrew; Nguyen, Huu Phuc; Homberg, Judith R; van der Harst, Johanneke E

    2016-08-01

    An objective and automated method for assessing alterations in gait and motor coordination in different animal models is important for proper gait analysis. The CatWalk system has been used in pain research, ischemia, arthritis, spinal cord injury and some animal models for neurodegenerative diseases. Our goals were to obtain a comprehensive gait analysis of three different rat models and to identify which motor coordination parameters are affected and are the most suitable and sensitive to describe and detect ataxia, with a secondary focus on possible training effects. Both static and dynamic parameters showed significant differences in all three models: enriched housed rats show higher walking and swing speed and longer stride length, ethanol-induced ataxia affects mainly the hind part of the body, and the SCA17 rats show coordination disturbances. Coordination changes were revealed only in the case of the ethanol-induced ataxia and the SCA17 rat model. Although training affected some gait parameters, it did not obscure group differences when those were present. To our knowledge, a comparative gait assessment in rats with enriched housing conditions, ethanol-induced ataxia and SCA17 has not been presented before. There is no gold standard for the use of CatWalk. Depending on the specific effects expected, the protocol can be adjusted. By including all sessions in the analysis, any training effect should be detectable, and the development of the performance over the sessions can provide insight into effects attributed to intervention, treatment or injury. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Self-Organization and Self-Coordination in Welding Automation with Collaborating Teams of Industrial Robots

    Directory of Open Access Journals (Sweden)

    Günther Starke

    2016-11-01

    Full Text Available In welding automation, growing interest can be recognized in applying teams of industrial robots to perform manufacturing processes through collaboration. Although robot teamwork can increase profitability and cost-effectiveness in production, the programming of the robots is still a problem. It is extremely time consuming and requires special expertise in synchronizing the activities of the robots to avoid any collision. Therefore, a research project has been initiated to solve those problems. This paper will present strategies, concepts, and research results in applying the robot operating system (ROS) and ROS-based solutions to overcome existing technical deficits through the integration of self-organization capabilities, autonomous path planning, and self-coordination of the robots’ work. The new approach should contribute to improving the application of robot teamwork and collaboration in the manufacturing sector, with a higher level of flexibility and a reduced need for human intervention.

  15. Application of Advanced Particle Swarm Optimization Techniques to Wind-thermal Coordination

    DEFF Research Database (Denmark)

    Singh, Sri Niwas; Østergaard, Jacob; Yadagiri, J.

    2009-01-01

    wind-thermal coordination algorithm is necessary to determine the optimal proportion of wind and thermal generator capacity that can be integrated into the system. In this paper, four versions of Particle Swarm Optimization (PSO) techniques are proposed for solving wind-thermal coordination problem...

  16. Automated magnetic divertor design for optimal power exhaust

    International Nuclear Information System (INIS)

    Blommaert, Maarten

    2017-01-01

    The so-called divertor is the standard particle and power exhaust system of nuclear fusion tokamaks. In essence, the magnetic configuration hereby 'diverts' the plasma to a specific divertor structure. The design of this divertor is still a key issue to be resolved to evolve from experimental fusion tokamaks to commercial power plants. The focus of this dissertation is on one particular design requirement: avoiding excessive heat loads on the divertor structure. The divertor design process is assisted by plasma edge transport codes that simulate the plasma and neutral particle transport in the edge of the reactor. These codes are computationally extremely demanding, not least due to the complex collisional processes between plasma and neutrals that lead to strong radiation sinks and macroscopic heat convection near the vessel walls. One way of improving the heat exhaust is by modifying the magnetic confinement that governs the plasma flow. In this dissertation, automated design of the magnetic configuration is pursued using adjoint based optimization methods. A simple and fast perturbation model is used to compute the magnetic field in the vacuum vessel. A stable optimal design method of the nested type is then elaborated that strictly accounts for several nonlinear design constraints and code limitations. Using appropriate cost function definitions, the heat is spread more uniformly over the high-heat-load plasma-facing components in a practical design example. Furthermore, practical in-parts adjoint sensitivity calculations are presented that provide a way to an efficient optimization procedure. Results are elaborated for a fictitious JET (Joint European Torus) case. The heat load is strongly reduced by exploiting an expansion of the magnetic flux towards the solid divertor structure. Subsequently, shortcomings of the perturbation model for magnetic field calculations are discussed in comparison to a free boundary equilibrium (FBE) simulation.

  17. Automated magnetic divertor design for optimal power exhaust

    Energy Technology Data Exchange (ETDEWEB)

    Blommaert, Maarten

    2017-07-01

    The so-called divertor is the standard particle and power exhaust system of nuclear fusion tokamaks. In essence, the magnetic configuration hereby 'diverts' the plasma to a specific divertor structure. The design of this divertor is still a key issue to be resolved to evolve from experimental fusion tokamaks to commercial power plants. The focus of this dissertation is on one particular design requirement: avoiding excessive heat loads on the divertor structure. The divertor design process is assisted by plasma edge transport codes that simulate the plasma and neutral particle transport in the edge of the reactor. These codes are computationally extremely demanding, not least due to the complex collisional processes between plasma and neutrals that lead to strong radiation sinks and macroscopic heat convection near the vessel walls. One way of improving the heat exhaust is by modifying the magnetic confinement that governs the plasma flow. In this dissertation, automated design of the magnetic configuration is pursued using adjoint based optimization methods. A simple and fast perturbation model is used to compute the magnetic field in the vacuum vessel. A stable optimal design method of the nested type is then elaborated that strictly accounts for several nonlinear design constraints and code limitations. Using appropriate cost function definitions, the heat is spread more uniformly over the high-heat-load plasma-facing components in a practical design example. Furthermore, practical in-parts adjoint sensitivity calculations are presented that provide a way to an efficient optimization procedure. Results are elaborated for a fictitious JET (Joint European Torus) case. The heat load is strongly reduced by exploiting an expansion of the magnetic flux towards the solid divertor structure. Subsequently, shortcomings of the perturbation model for magnetic field calculations are discussed in comparison to a free boundary equilibrium (FBE) simulation.

  18. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

    Full Text Available This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy by minimizing the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy can clearly save procurement costs; that the strategy appears to be more cooperative when the quality requirement is less strict; and that the coordination costs have a strong effect on the coordinated procurement strategy.
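The simulated-annealing approach used in this record can be sketched for the supplier-selection flavour of the problem; the supplier prices, the coordination discount, and all annealing parameters below are illustrative assumptions, not those of the paper.

```python
import math
import random

def simulated_annealing(cost, n_subsidiaries, n_suppliers,
                        t0=10.0, cooling=0.95, steps=2000, seed=0):
    """Minimal simulated-annealing sketch: each subsidiary picks one
    supplier, and `cost` maps the full choice vector to a procurement cost.
    Worse moves are accepted with the Metropolis probability exp(-delta/T).
    """
    rng = random.Random(seed)
    state = [rng.randrange(n_suppliers) for _ in range(n_subsidiaries)]
    cur_cost = cost(state)
    best, best_cost, t = list(state), cur_cost, t0
    for _ in range(steps):
        i = rng.randrange(n_subsidiaries)            # perturb one choice
        old = state[i]
        state[i] = rng.randrange(n_suppliers)
        new_cost = cost(state)
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / t):
            cur_cost = new_cost                      # accept the move
            if cur_cost < best_cost:
                best, best_cost = list(state), cur_cost
        else:
            state[i] = old                           # reject and revert
        t *= cooling
    return best, best_cost

# Hypothetical data: prices[subsidiary][supplier], plus a small discount
# when all subsidiaries coordinate on a single supplier.
prices = [[4, 2, 7], [3, 5, 1], [6, 2, 2]]
def cost(choice):
    base = sum(prices[s][c] for s, c in enumerate(choice))
    discount = 1.0 if len(set(choice)) == 1 else 0.0
    return base - discount

best, c = simulated_annealing(cost, 3, 3)
print(best, c)
```

Here independent minimization happens to beat full coordination; changing the discount shifts the balance, which mirrors the paper's finding that coordination costs strongly shape the chosen strategy.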

  19. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun

    2014-01-01

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects like greater efficiency and fewer human errors, and a negative effect defined as out-of-the-loop (OOTL). Thus, before introducing automation in the nuclear field, the estimation of the positive and negative effects of automation on human operators should be conducted. In this paper, by focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the suggested cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximized proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. The method is expected to derive an appropriate inclusion proportion of the automation system that avoids the OOTL problem while having maximum efficacy at the same time.

  20. Segmentation optimization and stratified object-based analysis for semi-automated geomorphological mapping

    NARCIS (Netherlands)

    Anders, N.S.; Seijmonsbergen, A.C.; Bouten, W.

    2011-01-01

    Semi-automated geomorphological mapping techniques are gradually replacing classical techniques due to increasing availability of high-quality digital topographic data. In order to efficiently analyze such large amounts of data, there is a need for optimizing the processing of automated mapping

  1. Design of an optimal automation system : Finding a balance between a human's task engagement and exhaustion

    NARCIS (Netherlands)

    Klein, Michel; van Lambalgen, Rianne

    2011-01-01

    In demanding tasks, human performance can seriously degrade as a consequence of increased workload and limited resources. In such tasks it is very important to maintain an optimal performance quality; therefore, automation assistance is required. On the other hand, automation can also impose

  2. On the use of PGD for optimal control applied to automated fibre placement

    Science.gov (United States)

    Bur, N.; Joyot, P.

    2017-10-01

    Automated Fibre Placement (AFP) is an incipient manufacturing process for composite structures. Despite its conceptual simplicity, it involves many complexities related to the necessity of melting the thermoplastic at the tape-substrate interface, ensuring the consolidation that requires the diffusion of molecules, and controlling the build-up of residual stresses responsible for the residual deformations of the formed parts. The optimisation of the process and the determination of the process window cannot be achieved in a traditional way, since that requires a plethora of trial-and-error runs or numerical simulations, because many parameters are involved in the characterisation of the material and the process. Using reduced-order modelling such as the so-called Proper Generalised Decomposition (PGD) method allows the construction of multi-parametric solutions taking many parameters into account. This leads to virtual charts that can be explored on-line in real time in order to perform process optimisation or on-line simulation-based control. Thus, for a given set of parameters, determining the power leading to an optimal temperature becomes easy. However, instead of controlling the power knowing the temperature field by particularizing an abacus, we propose here an approach based on optimal control: we solve by PGD a dual problem derived from the heat equation and an optimality criterion. To circumvent numerical issues due to the ill-conditioned system, we propose an algorithm based on Uzawa's method. In this way, we are able to solve the dual problem, setting the desired state as an extra-coordinate in the PGD framework. In a single computation, we get both the temperature field and the heat flux required to reach a parametric optimal temperature on a given zone.

  3. STARS IDENTIFICATION AT THE ASTRONOMICAL COORDINATES DETERMINATION BY MEANS OF AN AUTOMATED ZENITH TELESCOPE

    Directory of Open Access Journals (Sweden)

    S. V. Gayvoronsky

    2015-01-01

    Full Text Available Scope of research. The paper deals with two approaches to star identification: an algorithm of similar triangles and an algorithm of interstellar angular distances. Method. Comparative analysis of the considered algorithms is performed using experimental data obtained by the prototype of a zenith telescope, as applied to the problem of coordinates determination by an automated zenith telescope. Main results. The analysis has revealed that the identification method based on interstellar angular distances provides star identification with higher reliability and several times faster than the algorithm of similar triangles. However, the algorithm of interstellar angular distances is sensitive to the lens focal length, so a combined star identification method is proposed. The idea of this method is to integrate the two above algorithms in order to calculate the lens focal length and then identify the stars directly. Practical significance. The combined method enables reliable identification of the stars visible in the field of view within a comparatively short processing time, whether or not the lens focal length is available.
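The interstellar-angular-distance idea can be sketched as pairwise matching of rotation-invariant angles between observed directions and a star catalog; the three-star catalog and the tolerance below are invented for illustration, and a real implementation would vote over many pairs and resolve ambiguities.

```python
import math
from itertools import combinations

def ang(u, v):
    """Angle in radians between two unit vectors."""
    d = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
    return math.acos(d)

def identify_by_angular_distance(observed, catalog, tol=1e-3):
    """Toy version of the interstellar-angular-distance method: for each
    observed pair of directions, find the catalog pair whose angular
    separation matches within `tol`. Angular separations are invariant
    under the unknown attitude rotation, which is what makes this work.
    """
    matches = []
    for (i, u), (j, v) in combinations(list(enumerate(observed)), 2):
        target = ang(u, v)
        for (a, p), (b, q) in combinations(list(enumerate(catalog)), 2):
            if abs(ang(p, q) - target) < tol:
                matches.append(((i, j), (a, b)))
    return matches

# Hypothetical catalog of three unit vectors with distinct pairwise angles,
# and two observed directions: catalog stars 0 and 1 rotated 90° about z.
catalog = [(1.0, 0.0, 0.0), (0.6, 0.8, 0.0), (0.8, 0.0, 0.6)]
observed = [(0.0, 1.0, 0.0), (-0.8, 0.6, 0.0)]
print(identify_by_angular_distance(observed, catalog))
```

Because the match is on angles only, it is insensitive to camera orientation, but the angular scale depends on the lens focal length, which is exactly the sensitivity the record points out.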

  4. Topological Optimization and Automated Construction for Lightweight Structures

    Data.gov (United States)

    National Aeronautics and Space Administration — The author proposes the development of an automated construction system for HEDS applications which will implement a game-changing material resource strategy...

  5. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    Science.gov (United States)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  6. Application of a Continuous Particle Swarm Optimization (CPSO) for the Optimal Coordination of Overcurrent Relays Considering a Penalty Method

    Directory of Open Access Journals (Sweden)

    Abdul Wadood

    2018-04-01

    Full Text Available In an electrical power system, the coordination of the overcurrent relays plays an important role in protecting the electrical system by providing primary as well as backup protection. To reduce power outages, the coordination between these relays should be kept at the optimum value to minimize the total operating time and ensure that the least damage occurs under fault conditions. It is also imperative to ensure that the relay settings do not create unintentional operation and consecutive sympathy trips. In a power system protection coordination problem, the objective function to be optimized is the sum of the total operating times of all main relays. In this paper, the coordination of overcurrent relays in a ring-fed distribution system is formulated as an optimization problem. Coordination is performed using the proposed continuous particle swarm optimization. In order to enhance and improve the quality of the solution, a local search algorithm (LSA) is embedded into the original particle swarm optimization (PSO) algorithm, and the constraints are incorporated into the fitness function via the penalty method. The results achieved with the continuous particle swarm optimization algorithm (CPSO) are compared with those of other evolutionary optimization algorithms (EA), and this comparison showed that the proposed scheme is competent in dealing with the relevant problems. Further analysis of the obtained results found that the continuous particle swarm approach provides the most globally optimal solution.
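A compact sketch of penalty-method PSO for relay coordination, in the spirit of this record: two relays with the IEC standard-inverse characteristic, time dial settings (TDS) as the decision variables, and the coordination time interval enforced through a quadratic penalty. The bounds, fault-current multiples, swarm parameters, and penalty weight are illustrative, not the paper's settings.

```python
import random

def op_time(tds, m):
    """IEC standard-inverse overcurrent characteristic (m = I_fault / pickup)."""
    return tds * 0.14 / (m ** 0.02 - 1)

def pso_relay_coordination(m_primary, m_backup, cti=0.3, n_particles=30,
                           iters=200, seed=1):
    """Continuous-PSO sketch for two-relay coordination: minimise the sum of
    operating times over both relays' TDS, with the backup margin
    (t_backup - t_primary >= CTI) enforced by a quadratic penalty."""
    rng = random.Random(seed)
    lo, hi = 0.05, 1.1                       # TDS bounds

    def fitness(x):
        t1 = op_time(x[0], m_primary)        # primary relay
        t2 = op_time(x[1], m_backup)         # backup relay
        violation = max(0.0, cti - (t2 - t1))
        return t1 + t2 + 1000.0 * violation ** 2

    pos = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [list(p) for p in pos]
    gbest = min(pbest, key=fitness)
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(2):
                r1, r2 = rng.random(), rng.random()
                vel[k][d] = (0.7 * vel[k][d]
                             + 1.5 * r1 * (pbest[k][d] - pos[k][d])
                             + 1.5 * r2 * (gbest[d] - pos[k][d]))
                pos[k][d] = min(hi, max(lo, pos[k][d] + vel[k][d]))
            if fitness(pos[k]) < fitness(pbest[k]):
                pbest[k] = list(pos[k])
        gbest = min(pbest, key=fitness)
    return gbest, fitness(gbest)

best_tds, best_f = pso_relay_coordination(5.0, 3.0)
print(best_tds, best_f)
```

The swarm drives the primary TDS toward its lower bound while the penalty keeps the backup relay roughly one coordination interval slower, which is the structure of the paper's objective.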

  7. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    In recent years MEMS saw a very rapid development. Although many advances have been reached, due to the multiphysics nature of MEMS, their design is still a difficult task carried on mainly by hand calculation. In order to help overcome such difficulties, attempts to automate MEMS design were carried out. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and the Process Level. The state of the art of automated MEMS design in each of these levels is investigated.

  8. A new hybrid optimization algorithm CRO-DE for optimal coordination of overcurrent relays in complex power systems

    Directory of Open Access Journals (Sweden)

    Mohamed Zellagui

    2017-09-01

    Full Text Available The paper presents a new hybrid global optimization algorithm based on Chemical Reaction based Optimization (CRO) and Differential Evolution (DE) for nonlinear constrained optimization problems. This approach is proposed for the optimal coordination and setting of directional overcurrent relays in complex power systems. In the protection coordination problem, the objective function to be minimized is the sum of the operating times of all main relays. The optimization problem is subject to a number of constraints which are mainly focused on the operation of the backup relay, which should operate if a primary relay fails to respond to a fault near it, the Time Dial Setting (TDS), the Plug Setting (PS) and the minimum operating time of a relay. The proposed hybrid global optimization algorithm aims to minimize the total operating time of each protection relay. Two systems, the IEEE 4-bus and IEEE 6-bus models, are used as case studies to check the efficiency of the optimization algorithm. Results are obtained and presented for the CRO, DE and hybrid CRO-DE algorithms. The obtained results for the studied cases are compared with those obtained using other optimization algorithms, namely Teaching Learning-Based Optimization (TLBO), the Chaotic Differential Evolution Algorithm (CDEA), the Modified Differential Evolution Algorithm (MDEA), and hybrid optimization algorithms (PSO-DE, IA-PSO, and BFOA-PSO). From analysing the obtained results, it has been concluded that the hybrid CRO-DE algorithm provides the most optimal solution with the best convergence rate.

  9. Hybrid optimal online-overnight charging coordination of plug-in electric vehicles in smart grid

    Science.gov (United States)

    Masoum, Mohammad A. S.; Nabavi, Seyed M. H.

    2016-10-01

    Optimal coordinated charging of plug-in electric vehicles (PEVs) in a smart grid (SG) can be beneficial for both consumers and utilities. This paper proposes a hybrid optimal online followed by overnight charging coordination of high- and low-priority PEVs using discrete particle swarm optimization (DPSO) that considers the benefits of both consumers and electric utilities. The objective functions are online minimization of total cost (associated with grid losses and energy generation) and overnight valley filling through minimization of the total load levels. The constraints include substation transformer loading, node voltage regulations and the requested final battery state of charge levels (SOC_req). The main challenge is the optimal selection of the overnight starting time (t_optimal-overnight,start) to guarantee charging of all vehicle batteries to the SOC_req levels before the requested plug-out times (t_req), which is done by simultaneously solving the online and overnight objective functions. The online-overnight PEV coordination approach is implemented on a 449-node SG; results are compared for uncoordinated and coordinated battery charging, as well as for a modified strategy using cost minimization for both online and overnight coordination. The impact of t_optimal-overnight,start on the performance of the proposed PEV coordination is investigated.
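The overnight valley-filling objective of this record can be approximated with a simple greedy allocation that always charges in the currently least-loaded hour; the load profile, energy demand, and rate cap are invented, and the paper itself uses DPSO with transformer and voltage constraints rather than this heuristic.

```python
import heapq

def valley_fill(base_load, ev_energy, max_rate):
    """Greedy valley-filling sketch for overnight charging coordination:
    repeatedly put the next unit of EV energy into the hour whose total
    load is currently lowest, capped by the per-hour charging rate.
    Always filling the lowest hour flattens the profile, which minimises
    the sum of squared hourly loads (the valley-filling objective).
    """
    load = list(base_load)
    room = [max_rate] * len(load)            # EV charging capacity left per hour
    heap = [(load[h], h) for h in range(len(load)) if room[h] > 0]
    heapq.heapify(heap)
    remaining = ev_energy
    step = 1.0                               # unit of energy per allocation
    while remaining > 0 and heap:
        cur, h = heapq.heappop(heap)
        if cur != load[h]:                   # stale heap entry, skip
            continue
        take = min(step, remaining, room[h])
        load[h] += take
        room[h] -= take
        remaining -= take
        if room[h] > 0:
            heapq.heappush(heap, (load[h], h))
    return load

# Hypothetical overnight profile (MW) over six hours, 6 MWh of EV demand,
# at most 3 MW of EV charging in any hour.
print(valley_fill([5, 3, 2, 2, 3, 5], 6.0, 3.0))
```

The night-time dip is filled level while the evening and morning peaks are untouched; the DPSO in the paper additionally decides when this overnight phase starts so all batteries still reach their requested state of charge.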

  10. Optimal Trajectory Planning and Coordinated Tracking Control Method of Tethered Space Robot Based on Velocity Impulse

    OpenAIRE

    Huang, Panfeng; Xu, Xiudong; Meng, Zhongjie

    2014-01-01

    The tethered space robot (TSR) is a new concept of space robot which consists of a robot platform, space tether and operation robot. This paper presents a multi-objective optimal trajectory planning and a coordinated tracking control scheme for TSR based on velocity impulse in the approaching phase. Both total velocity impulse and flight time are included in this optimization. The non-dominated sorting genetic algorithm is employed to obtain the optimal trajectory Pareto solution using the TS...

  11. Coordinated Target Tracking via a Hybrid Optimization Approach

    Directory of Open Access Journals (Sweden)

    Yin Wang

    2017-02-01

    Full Text Available Recent advances in computer science and electronics have greatly expanded the capabilities of unmanned aerial vehicles (UAV) in both defense and civil applications, such as moving ground object tracking. Due to the uncertainties of the application environments and objects’ motion, it is difficult to maintain the tracked object always within the sensor coverage area by using a single UAV. Hence, it is necessary to deploy a group of UAVs to improve the robustness of the tracking. This paper investigates the problem of tracking ground moving objects with a group of UAVs using gimbaled sensors under flight dynamic and collision-free constraints. The optimal cooperative tracking path planning problem is solved using an evolutionary optimization technique based on the framework of chemical reaction optimization (CRO). The efficiency of the proposed method was demonstrated through a series of comparative simulations. The results show that the cooperative tracking paths determined by the newly developed method allow for longer sensor coverage time under flight dynamic restrictions and safety conditions.

  12. Coordinated Target Tracking via a Hybrid Optimization Approach.

    Science.gov (United States)

    Wang, Yin; Cao, Yan

    2017-02-27

Recent advances in computer science and electronics have greatly expanded the capabilities of unmanned aerial vehicles (UAVs) in both defense and civil applications, such as moving ground object tracking. Due to uncertainties in the application environment and the objects' motion, it is difficult to keep the tracked object within the sensor coverage area at all times using a single UAV. Hence, it is necessary to deploy a group of UAVs to improve the robustness of the tracking. This paper investigates the problem of tracking ground moving objects with a group of UAVs using gimbaled sensors under flight-dynamic and collision-free constraints. The optimal cooperative tracking path planning problem is solved using an evolutionary optimization technique based on the framework of chemical reaction optimization (CRO). The efficiency of the proposed method was demonstrated through a series of comparative simulations. The results show that the cooperative tracking paths determined by the newly developed method allow for longer sensor coverage time under flight-dynamic restrictions and safety conditions.

  13. Optimal Trajectory Planning and Coordinated Tracking Control Method of Tethered Space Robot Based on Velocity Impulse

    Directory of Open Access Journals (Sweden)

    Panfeng Huang

    2014-09-01

Full Text Available The tethered space robot (TSR) is a new space-robot concept consisting of a robot platform, a space tether, and an operation robot. This paper presents a multi-objective optimal trajectory planning and a coordinated tracking control scheme for the TSR based on velocity impulse in the approaching phase. Both total velocity impulse and flight time are included in the optimization. The non-dominated sorting genetic algorithm is employed to obtain the optimal trajectory Pareto solution using the TSR dynamic model and the optimal trajectory planning model. The coordinated tracking control scheme utilizes the optimal velocity impulse. Furthermore, a PID controller is designed to compensate for distance measurement errors. The PID control force is optimized and distributed to the thrusters and the space tether using a simulated annealing algorithm. The attitude interferential torque of the space tether is compensated using a time-delay algorithm through reaction wheels. The simulation results show that the multi-objective optimal trajectory planning method can reveal the relationships among flight time, fuel consumption, planar view angle, and velocity impulse number, and can provide a series of optimal trajectories for a range of specific tasks. The coordinated control scheme can significantly save thruster fuel for tracking the optimal trajectory, restrain the attitude interferential torque produced by the space tether, and maintain the relative attitude stability of the operation robot.
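The non-dominated sorting step at the heart of this kind of multi-objective planner can be sketched generically (a minimal Python illustration for minimization; the objective pairs below are invented, not the paper's trajectory data):

```python
# Hypothetical sketch of non-dominated sorting as used by NSGA-type algorithms:
# objectives are minimized (e.g. total velocity impulse and flight time), and
# candidate solutions are ranked into successive Pareto fronts.

def dominates(a, b):
    """True if a dominates b: no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors into Pareto fronts; front 0 is the Pareto set."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Example: (velocity impulse, flight time) pairs for four candidate trajectories
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (1.5, 6.0)]
print(non_dominated_sort(pts))
```

Solutions in front 0 are the trade-off curve a mission designer would choose from; later fronts are dominated alternatives.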

  14. Routing Optimization of Intelligent Vehicle in Automated Warehouse

    Directory of Open Access Journals (Sweden)

    Yan-cong Zhou

    2014-01-01

Full Text Available Routing optimization is a key technology in intelligent warehouse logistics. In order to obtain an optimal route for a warehouse intelligent vehicle, routing optimization in a complex global dynamic environment is studied. A new evolutionary ant colony algorithm based on RFID and knowledge refinement is proposed. The new algorithm obtains environmental information in a timely manner through RFID technology and updates the environment map at the same time. It adopts elite-ant retention, fallback, and pheromone-limitation adjustment strategies. The current optimal route in the population space is further optimized based on experiential knowledge. The experimental results show that the new algorithm converges faster and can easily escape U-type or V-type obstacle traps. It can also find the global optimal route, or a near-optimal one, with higher probability in a complex dynamic environment. The new algorithm is proved feasible and effective by simulation results.

  15. Automated sizing of large structures by mixed optimization methods

    Science.gov (United States)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design, to determine an overall material distribution, with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.

  16. Optimal Energy Management of Multi-Microgrids with Sequentially Coordinated Operations

    Directory of Open Access Journals (Sweden)

    Nah-Oak Song

    2015-08-01

Full Text Available We propose an optimal electric energy management scheme for a cooperative multi-microgrid community with sequentially coordinated operations. The sequentially coordinated operations are suggested to distribute the computational burden and yet make optimal 24-hour energy management of multi-microgrids possible. The sequential operations are mathematically modeled to find the optimal operation conditions and illustrated with a physical interpretation of how optimal energy management is achieved in the cooperative multi-microgrid community. This global electric energy optimization of the cooperative community is realized by ancillary internal trading between the microgrids, which reduces the extra cost of unnecessary external trading by adjusting the electric energy production of the combined heat and power (CHP) generators and the amounts of both internal and external electric energy trading of the cooperative community. A simulation study is also conducted to validate the proposed mathematical energy management models.

  17. Optimizing centrifugation of coagulation samples in laboratory automation.

    Science.gov (United States)

    Suchsland, Juliane; Friedrich, Nele; Grotevendt, Anne; Kallner, Anders; Lüdemann, Jan; Nauck, Matthias; Petersmann, Astrid

    2014-08-01

High acceleration centrifugation conditions are used in laboratory automation systems to reduce the turnaround time (TAT) of clinical chemistry samples, but not of coagulation samples, which often requires separate sample flows. The CLSI guideline and manufacturers' recommendations for coagulation assays aim at reducing platelet counts. For measurement of prothrombin time (PT) and activated partial thromboplastin time (APTT), platelet counts (Plt) below 200×10⁹/L are recommended. Other coagulation assays may require even lower platelet counts, e.g., less than 10×10⁹/L. Unifying centrifugation conditions can facilitate the integration of coagulation samples into the overall workflow of a laboratory automation system. We evaluated centrifugation of coagulation samples using high acceleration conditions (5 min; 3280×g) in a single run and in two consecutive runs. Results of coagulation assays [PT, APTT, coagulation factor VIII (F. VIII) and protein S] and platelet counts were compared after the first and second centrifugation. Platelet counts below 200×10⁹/L were obtained in all samples after the first centrifugation, and counts below 10×10⁹/L were obtained in 73% of the samples after a second centrifugation. Passing-Bablok regression analyses showed equal performance of PT, APTT and F. VIII after the first and second centrifugation, whereas protein S measurements require a second centrifugation. Coagulation samples can thus be integrated into the workflow of a laboratory automation system using high acceleration centrifugation. A single centrifugation was sufficient for PT, APTT and F. VIII, whereas two successive centrifugations appear to be sufficient for protein S activity.

  18. An Automated Tool for Optimizing Waste Transportation Routing and Scheduling

    International Nuclear Information System (INIS)

    Berry, L.E.; Branch, R.D.; White, H.A.; Whitehead, H. D. Jr.; Becker, B.D.

    2006-01-01

An automated software tool has been developed and implemented to increase the efficiency and overall life-cycle productivity of site cleanup by scheduling vehicle and container movement between waste generators and disposal sites on the Department of Energy's Oak Ridge Reservation. The software tool identifies the best routes or accepts specifically requested routes and transit times, checks fleet availability, selects the most cost-effective route for each waste stream, and creates a transportation schedule in advance of waste movement. This tool was accepted by the customer and has been implemented. (authors)

  19. An Automated Analysis-Synthesis Package for Design Optimization ...

    African Journals Online (AJOL)

    90 standards is developed for the design optimization of framed structures - continuous beams, plane and space trusses and rigid frames, grids and composite truss-rigid frames. The package will enable the structural engineer to effectively and ...

  20. Optimal Multiuser Zero Forcing with Per-Antenna Power Constraints for Network MIMO Coordination

    Directory of Open Access Journals (Sweden)

    Kaviani Saeed

    2011-01-01

Full Text Available We consider multicell multiple-input multiple-output (MIMO) coordinated downlink transmission, also known as network MIMO, under per-antenna power constraints. We investigate a simple multiuser zero-forcing (ZF) linear precoding technique known as block diagonalization (BD) for network MIMO. The optimal form of BD with per-antenna power constraints is proposed. It involves a novel approach of optimizing the precoding matrices over the entire null space of the other users' transmissions. An iterative gradient descent method is derived by solving the dual of the throughput maximization problem, which finds the optimal precoding matrices globally and efficiently. Comprehensive simulations illustrate several advantages of network MIMO coordination when the optimal BD scheme is used. Its achievable throughput is compared with the capacity region obtained through the recently established duality concept under per-antenna power constraints.
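The null-space idea behind block diagonalization can be illustrated with a small NumPy sketch (random real channels, two users; power allocation and the per-antenna constraints that are the paper's contribution are omitted):

```python
import numpy as np

# Illustrative core step of block diagonalization: each user's precoder is
# restricted to the null space of the other users' channels, which forces
# inter-user interference to zero. Channel matrices here are random stand-ins.
rng = np.random.default_rng(0)
H1 = rng.standard_normal((2, 6))   # user 1 channel: 2 rx antennas, 6 tx antennas
H2 = rng.standard_normal((2, 6))   # user 2 channel

def null_space_basis(H):
    """Orthonormal basis of the right null space of H, via SVD."""
    _, s, Vt = np.linalg.svd(H)
    rank = int(np.sum(s > 1e-10))
    return Vt[rank:].conj().T      # columns span null(H)

V1 = null_space_basis(H2)          # user 1 transmits only in null(H2)
V2 = null_space_basis(H1)          # user 2 transmits only in null(H1)

# Interference each user causes at the other is numerically zero:
print(np.linalg.norm(H2 @ V1), np.linalg.norm(H1 @ V2))
```

The optimal BD scheme described in the record then optimizes the precoding matrices over these null spaces subject to per-antenna power limits, rather than fixing them to an orthonormal basis.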

  1. Coordinated trajectory planning of dual-arm space robot using constrained particle swarm optimization

    Science.gov (United States)

    Wang, Mingming; Luo, Jianjun; Yuan, Jianping; Walter, Ulrich

    2018-05-01

A multi-arm space robot is more effective than a single-arm one, especially when the target is tumbling. This paper investigates the application of a particle swarm optimization (PSO) strategy to coordinated trajectory planning of a dual-arm space robot in free-floating mode. In order to overcome the dynamic singularity issue, the direct kinematics equations are employed in conjunction with constrained PSO for coordinated trajectory planning of the dual-arm space robot. The joint trajectories are parametrized with Bézier curves to simplify the calculation. A constrained PSO scheme with adaptive inertia weight is implemented to find the optimal joint trajectories while specific objectives and imposed constraints are satisfied. The proposed method is not sensitive to the singularity issue due to the application of the forward kinematic equations. Simulation results are presented for coordinated trajectory planning of two kinematically redundant manipulators mounted on a free-floating spacecraft and demonstrate the effectiveness of the proposed method.
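A constrained PSO with a linearly decreasing ("adaptive") inertia weight can be sketched as below. This is a generic Python toy, not the paper's planner: the Bézier parametrization and robot dynamics are omitted, constraints are handled with a simple penalty term, and all parameter values are illustrative.

```python
import random

def pso_minimize(f, bounds, constraint=None, n=20, iters=100, seed=1):
    """Minimal constrained PSO sketch. `constraint(x) <= 0` means feasible;
    violations are penalized. Inertia weight decreases linearly over iterations."""
    rng = random.Random(seed)
    dim = len(bounds)

    def penalized(x):
        pen = 0.0 if constraint is None else 1e6 * max(0.0, constraint(x))
        return f(x) + pen

    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                     # personal best positions
    pbest = [penalized(x) for x in X]
    g = P[pbest.index(min(pbest))][:]         # global best position
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters             # adaptive inertia weight
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + 2.0 * rng.random() * (P[i][d] - X[i][d])
                           + 2.0 * rng.random() * (g[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
            val = penalized(X[i])
            if val < pbest[i]:
                pbest[i], P[i] = val, X[i][:]
                if val < penalized(g):
                    g = X[i][:]
    return g

# Toy use: minimize x^2 + y^2 subject to x + y >= 1 (optimum near (0.5, 0.5))
best = pso_minimize(lambda x: x[0] ** 2 + x[1] ** 2,
                    [(-2, 2), (-2, 2)],
                    constraint=lambda x: 1.0 - (x[0] + x[1]))
print(best)
```

In the paper's setting, each particle would encode Bézier control points of the joint trajectories, and the objective and constraints would come from the free-floating dynamics.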

  2. Automated IMRT planning with regional optimization using planning scripts.

    Science.gov (United States)

    Xhaferllari, Ilma; Wong, Eugene; Bzdusek, Karl; Lock, Michael; Chen, Jeff

    2013-01-07

Intensity-modulated radiation therapy (IMRT) has become a standard technique in radiation therapy for treating different types of cancers. Various class solutions have been developed to generate IMRT plans efficiently for simple cases (e.g., localized prostate, whole breast). However, for more complex cases (e.g., head and neck, pelvic nodes), it can be time-consuming for a planner to generate optimized IMRT plans. These cases generally have multiple target volumes and organs at risk, so generating an optimal plan often requires additional IMRT optimization structures such as dose-limiting ring structures, adjustments to the beam geometry, selection of inverse planning objectives and their associated weights, and additional IMRT objectives to reduce cold and hot spots in the dose distribution. These parameters are generally adjusted manually, in a repeated trial-and-error approach, during the optimization process. To improve IMRT planning efficiency in these more complex cases, an iterative method that automates some of these adjustments in a planning script is designed, implemented, and validated. In particular, regional optimization is implemented iteratively to reduce hot and cold spots: hot and cold spots are defined and automatically segmented, new objectives and their relative weights are introduced into the inverse planning, and the process repeats until termination criteria are met. The method has been applied to three clinical sites: prostate with pelvic nodes, head and neck, and anal canal cancers, and has been shown to reduce IMRT planning time significantly for clinical applications while improving plan quality. The IMRT planning scripts have been used for more than 500 clinical cases.

  3. A coordinated dispatch model for electricity and heat in a Microgrid via particle swarm optimization

    DEFF Research Database (Denmark)

    Xu, Lizhong; Yang, Guangya; Xu, Zhao

    2013-01-01

Particle swarm optimization (PSO) is employed to solve this model for the operation schedule that minimizes the total operational cost of the Microgrid by coordinating the CHP, electric heater, boiler and heat storage. The efficacy of the model and methodology is verified with different operation scenarios....

  4. Optimizing the response to surveillance alerts in automated surveillance systems.

    Science.gov (United States)

    Izadi, Masoumeh; Buckeridge, David L

    2011-02-28

Although much research effort has been directed toward refining algorithms for disease outbreak alerting, considerably less attention has been given to the response to alerts generated by statistical detection algorithms. Given the inherent inaccuracy of alerting, it is imperative to develop methods that help public health personnel identify optimal policies in response to alerts. This study evaluates the application of dynamic decision-making models to the problem of responding to outbreak detection, using anthrax surveillance as an example. Adaptive optimization through approximate dynamic programming is used to generate a policy for decision making following outbreak detection. We investigate theoretically the degree of noise the model can tolerate while keeping near-optimal behavior. We also evaluate the policy from our model empirically and compare it with current approaches in routine public health practice for investigating alerts. Timeliness of outbreak confirmation and the total costs associated with the decisions made are used as performance measures. Using our approach, on average, 80 per cent of outbreaks were confirmed prior to the fifth day post-attack, with considerably less cost compared to response strategies currently in use. Experimental results are also provided to illustrate the robustness of the adaptive optimization approach and to show the realization of the derived error bounds in practice. Copyright © 2011 John Wiley & Sons, Ltd.
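The dynamic-programming view of alert response can be illustrated with exact value iteration on a tiny, fully invented MDP (the paper uses approximate dynamic programming on a far richer state space; states, probabilities and costs below are purely illustrative):

```python
def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Generic value iteration. P[s][a] is a list of (prob, next_state) pairs,
    R[s][a] the immediate reward; returns optimal values and a greedy policy."""
    n = len(P)
    V = [0.0] * n
    while True:
        Q = [[R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
              for a in range(len(P[s]))] for s in range(n)]
        V_new = [max(q) for q in Q]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new, [q.index(max(q)) for q in Q]
        V = V_new

# Toy alert-response MDP: state 0 = "alert pending", state 1 = "resolved".
# Action 0 = wait (cost 1, resolves with prob 0.5); action 1 = investigate
# (cost 5, resolves immediately). Numbers are invented for illustration.
P = [[[(0.5, 0), (0.5, 1)], [(1.0, 1)]],
     [[(1.0, 1)]]]
R = [[-1.0, -5.0], [0.0]]
V, policy = value_iteration(P, R)
print(policy[0], round(V[0], 3))
```

With these invented numbers, waiting is cheaper in expectation than an immediate investigation; changing the cost of a missed outbreak flips the policy, which is exactly the trade-off the study formalizes.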

  5. Coordinated Optimization of Distributed Energy Resources and Smart Loads in Distribution Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Rui; Zhang, Yingchen

    2016-08-01

    Distributed energy resources (DERs) and smart loads have the potential to provide flexibility to the distribution system operation. A coordinated optimization approach is proposed in this paper to actively manage DERs and smart loads in distribution systems to achieve the optimal operation status. A three-phase unbalanced Optimal Power Flow (OPF) problem is developed to determine the output from DERs and smart loads with respect to the system operator's control objective. This paper focuses on coordinating PV systems and smart loads to improve the overall voltage profile in distribution systems. Simulations have been carried out in a 12-bus distribution feeder and results illustrate the superior control performance of the proposed approach.

  6. Study on Electricity Purchase Optimization in Coordination of Electricity and Carbon Trading

    Science.gov (United States)

    Liu, Dunnan; Meng, Yaru; Zhang, Shuo

    2017-07-01

With the establishment of a carbon emissions trading market in China, the power industry has become an important market participant. Power grid enterprises need to optimize their strategies in the new environment of coordinated electricity and carbon markets. First, the influence of coordinated electricity and carbon trading on the power purchase strategy of grid enterprises was analysed. Then a power purchase optimization model was presented, which takes the minimum cost of low-carbon, energy-saving and environmentally friendly power purchase as its objective, with power generation capacity, installed capacity and pollutant emissions as constraints. Finally, a provincial power grid was taken as an example to analyse the model, and the optimal order of power purchase was obtained, which provides a new idea for the low-carbon development of power grid enterprises.

  7. Automated Design and Optimization of Pebble-bed Reactor Cores

    International Nuclear Information System (INIS)

    Gougar, Hans D.; Ougouag, Abderrafi M.; Terry, William K.

    2010-01-01

    We present a conceptual design approach for high-temperature gas-cooled reactors using recirculating pebble-bed cores. The design approach employs PEBBED, a reactor physics code specifically designed to solve for and analyze the asymptotic burnup state of pebble-bed reactors, in conjunction with a genetic algorithm to obtain a core that maximizes a fitness value that is a function of user-specified parameters. The uniqueness of the asymptotic core state and the small number of independent parameters that define it suggest that core geometry and fuel cycle can be efficiently optimized toward a specified objective. PEBBED exploits a novel representation of the distribution of pebbles that enables efficient coupling of the burnup and neutron diffusion solvers. With this method, even complex pebble recirculation schemes can be expressed in terms of a few parameters that are amenable to modern optimization techniques. With PEBBED, the user chooses the type and range of core physics parameters that represent the design space. A set of traits, each with acceptable and preferred values expressed by a simple fitness function, is used to evaluate the candidate reactor cores. The stochastic search algorithm automatically drives the generation of core parameters toward the optimal core as defined by the user. The optimized design can then be modeled and analyzed in greater detail using higher resolution and more computationally demanding tools to confirm the desired characteristics. For this study, the design of pebble-bed high temperature reactor concepts subjected to demanding physical constraints demonstrated the efficacy of the PEBBED algorithm.
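The fitness-driven stochastic search described above can be sketched as a generic real-coded genetic algorithm. This is an illustrative Python toy, not PEBBED's algorithm: the quadratic fitness stands in for the user-defined trait functions, and all operator choices (averaging crossover, Gaussian mutation) are assumptions.

```python
import random

def genetic_search(fitness, bounds, pop=30, gens=60, mut=0.1, seed=7):
    """Toy genetic algorithm: rank selection, averaging crossover,
    bounded Gaussian mutation. Higher fitness is better."""
    rng = random.Random(seed)
    dim = len(bounds)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        elite = P[: pop // 2]                 # keep the fitter half
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            for d in range(dim):                          # mutation
                if rng.random() < mut:
                    lo, hi = bounds[d]
                    child[d] = min(max(child[d] + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
            children.append(child)
        P = elite + children
    return max(P, key=fitness)

# Toy "core fitness": a single peak at (0.3, 0.7) in a unit design space
fit = lambda x: -((x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2)
best = genetic_search(fit, [(0, 1), (0, 1)])
print([round(v, 2) for v in best])
```

In the record's setting, the genome would encode core geometry and recirculation parameters, and the fitness would combine the trait functions with their acceptable and preferred ranges.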

  8. Automation for pattern library creation and in-design optimization

    Science.gov (United States)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.

  9. Automated inverse optimization facilitates lower doses to normal tissue in pancreatic stereotactic body radiotherapy.

    Science.gov (United States)

    Mihaylov, Ivaylo B; Mellon, Eric A; Yechieli, Raphael; Portelance, Lorraine

    2018-01-01

Inverse planning is an iterative trial-and-error process. This work introduces a fully automated inverse optimization approach in which the treatment plan is closely tailored to the unique patient anatomy. The auto-optimization is applied to pancreatic stereotactic body radiotherapy (SBRT). The automation is based on stepwise reduction of dose-volume histograms (DVHs). Five uniformly spaced points, from 1% to 70% of the organ-at-risk (OAR) volumes, are used. Doses to those DVH points are iteratively decreased through multiple optimization runs. With each optimization run the doses to the OARs decrease, while the dose heterogeneity over the target increases. The iterative process is terminated when a pre-specified dose heterogeneity over the target is reached. Twelve pancreatic cases were retrospectively studied. Doses to the target and maximum doses to the duodenum, bowel, stomach, and spinal cord were evaluated. In addition, mean doses to the liver and kidneys were tallied. The auto-optimized plans were compared to the actual treatment plans, which are based on national protocols. The prescription dose to 95% of the planning target volume (PTV) is the same for the treatment and the auto-optimized plans. The average differences in maximum doses to the duodenum, bowel, stomach, and spinal cord are -4.6 Gy, -1.8 Gy, -1.6 Gy, and -2.4 Gy, respectively, where the negative sign indicates lower doses with the auto-optimization. The average differences in the mean doses to the liver and kidneys are -0.6 Gy and -1.1 to -1.5 Gy, respectively. Automated inverse optimization holds great potential for personalizing and tailoring radiotherapy to particular patient anatomies. It can be utilized for normal tissue sparing or for isotoxic dose escalation.
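The stepwise termination logic can be sketched with a toy model. Everything below is hypothetical: a single scalar "sparing" parameter stands in for the DVH point reductions, and the linear OAR-dose and heterogeneity trade-off is invented, not the clinical planning system.

```python
def stepwise_reduce(heterogeneity, h_max, step=0.05, max_runs=100):
    """Keep tightening the sparing parameter (one 'optimization run' per step)
    while the target-dose heterogeneity stays within the preset bound h_max."""
    runs = 0
    while runs < max_runs and heterogeneity((runs + 1) * step) <= h_max:
        runs += 1
    return runs * step

# Toy trade-off: more sparing (larger x) lowers OAR dose but raises target
# dose heterogeneity. Both curves are hypothetical.
oar_dose = lambda x: 40.0 * (1.0 - x)        # Gy
heterogeneity = lambda x: 5.0 + 20.0 * x     # percent
x = stepwise_reduce(heterogeneity, h_max=14.5)
print(round(oar_dose(x), 1), round(heterogeneity(x), 1))
```

The loop mirrors the record's scheme: each pass reduces OAR dose objectives, and iteration stops just before the pre-specified target heterogeneity bound would be exceeded.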

  10. Coordinated Optimization of Visual Cortical Maps (I) Symmetry-based Analysis

    Science.gov (United States)

    Reichl, Lars; Heide, Dominik; Löwel, Siegrid; Crowley, Justin C.; Kaschube, Matthias; Wolf, Fred

    2012-01-01

    In the primary visual cortex of primates and carnivores, functional architecture can be characterized by maps of various stimulus features such as orientation preference (OP), ocular dominance (OD), and spatial frequency. It is a long-standing question in theoretical neuroscience whether the observed maps should be interpreted as optima of a specific energy functional that summarizes the design principles of cortical functional architecture. A rigorous evaluation of this optimization hypothesis is particularly demanded by recent evidence that the functional architecture of orientation columns precisely follows species invariant quantitative laws. Because it would be desirable to infer the form of such an optimization principle from the biological data, the optimization approach to explain cortical functional architecture raises the following questions: i) What are the genuine ground states of candidate energy functionals and how can they be calculated with precision and rigor? ii) How do differences in candidate optimization principles impact on the predicted map structure and conversely what can be learned about a hypothetical underlying optimization principle from observations on map structure? iii) Is there a way to analyze the coordinated organization of cortical maps predicted by optimization principles in general? To answer these questions we developed a general dynamical systems approach to the combined optimization of visual cortical maps of OP and another scalar feature such as OD or spatial frequency preference. From basic symmetry assumptions we obtain a comprehensive phenomenological classification of possible inter-map coupling energies and examine representative examples. We show that each individual coupling energy leads to a different class of OP solutions with different correlations among the maps such that inferences about the optimization principle from map layout appear viable. We systematically assess whether quantitative laws resembling

  11. Fast automated airborne electromagnetic data interpretation using parallelized particle swarm optimization

    Science.gov (United States)

    Desmarais, Jacques K.; Spiteri, Raymond J.

    2017-12-01

    A parallelized implementation of the particle swarm optimization algorithm is developed. We use the optimization procedure to speed up a previously published algorithm for airborne electromagnetic data interpretation. This algorithm is the only parametrized automated procedure for extracting the three-dimensionally varying geometrical parameters of conductors embedded in a resistive environment, such as igneous and metamorphic terranes. When compared to the original algorithm, the new optimization procedure is faster by two orders of magnitude (factor of 100). Synthetic model tests show that for the chosen system architecture and objective function, the particle swarm optimization approach depends very weakly on the rate of communication of the processors. Optimal wall-clock times are obtained using three processors. The increased performance means that the algorithm can now easily be used for fast routine interpretation of airborne electromagnetic surveys consisting of several anomalies, as is displayed by a test on MEGATEM field data collected at the Chibougamau site, Québec.

  12. Dynamic Coordinated Shifting Control of Automated Mechanical Transmissions without a Clutch in a Plug-In Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Xinlei Liu

    2012-08-01

Full Text Available On the basis of the shifting process of automated mechanical transmissions (AMTs) in traditional hybrid electric vehicles (HEVs), and exploiting the fast response of electric machines, a dynamic model of the hybrid electric AMT vehicle powertrain is built, the dynamic characteristics of each phase of the shifting process are analyzed, and a control strategy is proposed in which the torque and speed of the engine and electric machine are coordinatively controlled to achieve AMT shifting for a plug-in hybrid electric vehicle (PHEV) without a clutch. In the shifting process, the engine and electric machine are well controlled, and the shift jerk and the power interruption and restoration time are reduced. Simulation and real-car test results show that the proposed control strategy can efficiently improve the shift quality of PHEVs equipped with AMTs.

  13. Energy Coordinative Optimization of Wind-Storage-Load Microgrids Based on Short-Term Prediction

    Directory of Open Access Journals (Sweden)

    Changbin Hu

    2015-02-01

Full Text Available According to the topological structure of wind-storage-load complementary microgrids, this paper proposes a method for energy coordinative optimization that focuses on improving the economic benefits of microgrids within a prediction framework. First, external-characteristic mathematical models of the distributed generation (DG) units, including wind turbines and storage batteries, are established according to the actual constraints. Meanwhile, using the minimum consumption cost from the external grid as the objective function, a grey prediction model with residual modification is introduced to output the predicted wind turbine power and load at specific periods. Second, based on the basic framework of receding horizon optimization, an intelligent genetic algorithm (GA) is applied to find the optimum solution over the prediction horizon for the complex non-linear coordination control model of microgrids. The optimum results of the GA are compared with the receding solution of mixed integer linear programming (MILP). The obtained results show that the method is a viable approach for energy coordinative optimization of microgrid systems with reasonable energy flow and scheduling. The effectiveness and feasibility of the proposed method are verified by examples.
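The grey prediction step can be illustrated with a minimal GM(1,1) forecaster, the standard grey model on which such schemes are built (a generic sketch: the paper's residual-modification step is omitted, and the sample series below is invented):

```python
def gm11_forecast(x0, steps=1):
    """Minimal GM(1,1) grey forecasting sketch (no residual modification):
    fit the whitened equation dx1/dt + a*x1 = b on the accumulated series,
    then forecast by inverse accumulation."""
    import math
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                # accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
    # Least squares for [a, b] in x0[k] = -a*z[k-1] + b (2x2 normal equations)
    m = n - 1
    szz = sum(v * v for v in z)
    sz = sum(z)
    sy = sum(x0[1:])
    szy = sum(v * y for v, y in zip(z, x0[1:]))
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Time response function and inverse accumulation
    x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Invented demand series growing roughly 10% per period
print(gm11_forecast([1.0, 1.1, 1.21, 1.331, 1.4641]))
```

GM(1,1) needs only a handful of recent samples, which is why grey models are popular for short-term wind and load prediction inside receding-horizon schedulers.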

  14. Optimization of axial enrichment distribution for BWR fuels using scoping libraries and block coordinate descent method

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Hsiung, E-mail: wstong@iner.gov.tw; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2017-03-15

Highlights: • An optimization method for axial enrichment distribution in a BWR fuel was developed. • Block coordinate descent method is employed to search for optimal solution. • Scoping libraries are used to reduce computational effort. • Optimization search space consists of enrichment difference parameters. • Capability of the method to find optimal solution is demonstrated. - Abstract: An optimization method has been developed to search for the optimal axial enrichment distribution in a fuel assembly for a boiling water reactor core. The optimization method features: (1) employing the block coordinate descent method to find the optimal solution in the space of enrichment difference parameters, (2) using scoping libraries to reduce the amount of CASMO-4 calculation, and (3) integrating a core critical constraint into the objective function that is used to quantify the quality of an axial enrichment design. The objective function consists of the weighted sum of core parameters such as shutdown margin and critical power ratio. The core parameters are evaluated using SIMULATE-3, and the cross-section data required for the SIMULATE-3 calculation are generated using CASMO-4 and scoping libraries. The application of the method to a 4-segment fuel design (with the highest allowable segment enrichment relaxed to 5%) demonstrated that the method can obtain an axial enrichment design with improved thermal limit ratios and objective function value while satisfying the core design constraints and the core critical requirement through the use of the objective function. The use of scoping libraries effectively reduced the number of CASMO-4 calculations, from 85 to 24, in the 4-segment optimization case. An exhaustive search was performed to examine the capability of the method in finding the optimal solution for a 4-segment fuel design. The results show that the method found a solution very close to the optimum obtained by the exhaustive search. The number of
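The block coordinate descent idea, optimizing one block of parameters at a time over a discrete grid while the others are held fixed, can be sketched generically (a Python toy with an invented quadratic objective, not coupled to CASMO-4 or SIMULATE-3):

```python
# Illustrative block coordinate descent: cycle over blocks of variables,
# minimizing the objective over one block while the others stay fixed.
# Here each "block" is a single coordinate searched over a discrete grid,
# loosely mirroring a search over enrichment-difference parameters.

def block_coordinate_descent(f, x0, grids, sweeps=10):
    x = list(x0)
    for _ in range(sweeps):
        improved = False
        for i, grid in enumerate(grids):        # one block at a time
            best_v, best_val = x[i], f(x)
            for v in grid:                      # exhaustive search on this block
                x[i] = v
                val = f(x)
                if val < best_val:
                    best_v, best_val = v, val
                    improved = True
            x[i] = best_v
        if not improved:                        # converged: no block improved
            break
    return x

# Toy objective with its minimum at (1.0, 2.0)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
grid = [i / 10 for i in range(0, 31)]           # 0.0, 0.1, ..., 3.0
print(block_coordinate_descent(f, [0.0, 0.0], [grid, grid]))
```

Because each block search only re-evaluates the objective along one block, an expensive evaluation (here a lambda; in the record, a SIMULATE-3 run backed by scoping libraries) is called far fewer times than a full grid search over all parameters at once.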

  15. Automated array-CGH optimized for archival formalin-fixed, paraffin-embedded tumor material

    Directory of Open Access Journals (Sweden)

    Nederlof Petra M

    2007-03-01

Full Text Available Abstract Background Array comparative genomic hybridization (aCGH) is a rapidly evolving technology that still lacks complete standardization. Yet, it is of great importance to obtain robust and reproducible data to enable meaningful comparisons across multiple hybridizations. Special difficulties arise when aCGH is performed on archival formalin-fixed, paraffin-embedded (FFPE) tissue due to its variable DNA quality. Recently, we have developed an effective DNA quality test that predicts the suitability of archival samples for BAC aCGH. Methods In this report, we first used DNA from a cancer cell line (SKBR3) to optimize the aCGH protocol for automated hybridization, and subsequently optimized and validated the procedure for FFPE breast cancer samples. We aimed for the highest throughput, accuracy, and reproducibility applicable to FFPE samples, which can also be important for future diagnostic use. Results Our protocol of automated aCGH on unamplified archival FFPE ULS-labeled DNA showed very similar results compared with published data and our previous manual hybridization method. Conclusion This report combines automated aCGH on unamplified archival FFPE DNA with non-enzymatic ULS labeling, and describes an optimized protocol for this combination resulting in improved quality and reproducibility.

  16. Optimal criteria for microscopic review of urinalysis following use of automated urine analyzer.

    Science.gov (United States)

    Khejonnit, Varanya; Pratumvinit, Busadee; Reesukumal, Kanit; Meepanya, Suriya; Pattanavin, Chanutchaya; Wongkrajang, Preechaya

    2015-01-15

    The Sysmex UX-2000 is a new, fully automated, integrated urine analyzer that analyzes all physical and chemical characteristics of urine, as well as urine sediments, on a single platform. Because sediment analysis by fluorescent flow cytometry has a limited ability to classify some formed elements present in urine (e.g., casts), laboratories should develop criteria for manual microscopic examination to follow the automated analysis. A total of 399 urine samples were collected from the routine workload. All samples were analyzed on the automated analyzer, and the results were then compared to those of the manual microscopic method to establish optimal criteria. Another set of 599 samples was then used to validate the optimized criteria. The efficiency of the criteria and the review rate were calculated, and the false-positive and false-negative cases were enumerated and clarified. We established 11 rules related to the parameters categorized by the UX-2000, including cells, casts, crystals, organisms, sperm, and flags. After optimizing the rules, the review rate was 54.1% and the false-negative rate was 2.8%. The combination of the UX-2000 and the manual microscopic method obtains the best results. The UX-2000 improves efficiency by reducing the time and labor associated with the specimen analysis process. Copyright © 2014 Elsevier B.V. All rights reserved.
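    Reflex-review criteria of the kind described above are typically implemented as a list of boolean rules over the analyzer's result fields. A minimal sketch follows; the thresholds and field names are purely illustrative assumptions, not the paper's actual 11 rules.

```python
# Hypothetical reflex-testing rules deciding whether an automated
# urinalysis result should be reviewed by manual microscopy.
# Thresholds and field names are illustrative only; the paper's
# actual 11 rules are derived from its own 399-sample comparison.

REVIEW_RULES = [
    ("rbc_high",     lambda r: r.get("rbc_per_ul", 0) > 25),
    ("wbc_high",     lambda r: r.get("wbc_per_ul", 0) > 30),
    ("casts_flag",   lambda r: r.get("cast_flag", False)),
    ("crystal_flag", lambda r: r.get("crystal_flag", False)),
    ("analyzer_err", lambda r: "review" in r.get("flags", [])),
]

def needs_microscopy(result):
    """Return the list of rule names triggered by an analyzer result."""
    return [name for name, rule in REVIEW_RULES if rule(result)]

sample = {"rbc_per_ul": 40, "wbc_per_ul": 5, "flags": ["review"]}
print(needs_microscopy(sample))  # → ['rbc_high', 'analyzer_err']
```

    Tuning such rules is then a matter of adjusting thresholds to trade the review rate against the false-negative rate, as the paper does.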

  17. Design optimization of single mixed refrigerant LNG process using a hybrid modified coordinate descent algorithm

    Science.gov (United States)

    Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong

    2018-01-01

    Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to thermodynamic irreversibilities that deteriorate the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm was proposed for the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed algorithm provided improved results compared to existing methodologies in finding the optimal condition of the complex mixed refrigerant natural gas liquefaction process. By applying it, the SMR process can be designed with a specific compression power of 0.2555 kW, equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be applied to any mixed-refrigerant-based liquefaction process in the natural gas industry.

  18. An Integrative Behavioral Health Care Model Using Automated SBIRT and Care Coordination in Community Health Care.

    Science.gov (United States)

    Dwinnells, Ronald; Misik, Lauren

    2017-10-01

    Efficient and effective integration of behavioral health programs in a community health care practice emphasizes patient-centered medical home principles to improve quality of care. A prospective, 3-period, interrupted time series study was used to explore which of 3 different integrative behavioral health care screening and management processes were the most efficient and effective in prompting behavioral health screening, identification, interventions, and referrals in a community health practice. A total of 99.5% ( P behavioral health screenings; brief intervention rates nearly doubled to 83% ( P behavioral health care coordination.

  19. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized

  20. Coordination Between Unmanned Aerial and Ground Vehicles: A Taxonomy and Optimization Perspective.

    Science.gov (United States)

    Chen, Jie; Zhang, Xing; Xin, Bin; Fang, Hao

    2016-04-01

    The coordination between unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) is an active research topic whose great application value has attracted wide attention. This paper outlines the motivations for studying the cooperative control of UAVs and UGVs, and attempts to make a comprehensive investigation and analysis of recent research in this field. First, a taxonomy for the classification of existing unmanned aerial and ground vehicle systems (UAGVSs) is proposed, and a generalized optimization framework is developed that allows the decision-making problems for different types of UAGVSs to be described in a unified way. By following the proposed taxonomy, we show how different types of UAGVSs can be built to realize the goal of a common task, that is, target tracking, and how optimization problems can be formulated for a UAGVS to perform specific tasks. This paper presents an optimization perspective to model and analyze different types of UAGVSs, and serves as a guide and reference for developing UAGVSs.

  1. Optimal allocation of fault current limiters for sustaining overcurrent relays coordination in a power system with distributed generation

    Directory of Open Access Journals (Sweden)

    A. Elmitwally

    2015-12-01

    Full Text Available This paper addresses the problem of overcurrent relay (OCR) coordination in the presence of distributed generators (DGs). OCRs are optimally set to work in a coordinated manner to isolate faults with minimal impact on customers. The penetration of DGs into the power system changes the fault current levels seen by the OCRs, which can deteriorate their coordinated operation. The operating time difference between backup and main relays can fall below the standard limit, or the backup OCR can even incorrectly operate before the main OCR. Resetting the OCRs is tedious, especially in large systems, and cannot by itself restore the original coordinated operation in the presence of DGs. The paper investigates the optimal utilization of fault current limiters (FCLs) to maintain the coordinated operation of directional OCRs without any need for OCR resetting, irrespective of DG status. The goal is to maintain OCR coordination at the minimum cost of the prospective FCLs. Hence, the FCL location and sizing problem is formulated as a constrained multi-objective optimization problem. Multi-objective particle swarm optimization is adopted to determine the optimal locations and sizes of the FCLs. The proposed algorithm is applied to meshed and radial power systems with different DG arrangements and different types of FCLs. Moreover, the OCR coordination problem is studied when the system includes both directional and non-directional OCRs. A comparative analysis of the results is provided.
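    To illustrate the particle swarm mechanics referenced above, here is a deliberately simplified single-variable, weighted-sum sketch: one FCL "size" trading off an assumed cost term against an assumed coordination penalty. The paper itself uses a true multi-objective PSO over both locations and sizes; nothing below is from the paper.

```python
import random

# Simplified weighted-sum PSO locating one FCL size z that trades off a
# (hypothetical) FCL cost against a (hypothetical) relay-coordination
# penalty.  This one-variable sketch only illustrates the PSO update.

def cost(z):
    fcl_cost = z                     # bigger impedance, bigger cost
    coord_penalty = (4.0 - z) ** 2   # assume 4 ohm restores coordination
    return 0.1 * fcl_cost + coord_penalty

def pso(n=20, iters=100, lo=0.0, hi=10.0, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                      # personal bests
    gbest = min(xs, key=cost)          # global best
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i]
            if cost(xs[i]) < cost(gbest):
                gbest = xs[i]
    return gbest

best = pso()
print(round(best, 2))  # near the analytic optimum z = 4 - 0.05 = 3.95
```

    The real problem replaces `cost` with fault-current simulations per candidate FCL placement and keeps a Pareto archive instead of a single scalarized best.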

  2. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change their activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should therefore be considered at the same time to determine the appropriate level of automation. The conventional automation rate concept, however, is limited in that it does not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested to overcome this problem.

  3. Optimal Control of Wind Farms for Coordinated TSO-DSO Reactive Power Management

    Directory of Open Access Journals (Sweden)

    David Sebastian Stock

    2018-01-01

    Full Text Available The growing importance of renewable generation connected to distribution grids requires increased coordination between transmission system operators (TSOs) and distribution system operators (DSOs) for reactive power management. This work proposes a practical and effective interaction method based on sequential optimizations to evaluate the reactive flexibility potential of distribution networks and to dispatch them along with traditional synchronous generators, keeping the information exchange to a minimum. A modular optimal power flow (OPF) tool featuring multi-objective optimization is developed for this purpose. The proposed method is evaluated on a model of a real German 110 kV grid with 1.6 GW of installed wind power capacity and a reduced-order model of the surrounding transmission system. Simulations show the benefit of involving wind farms in reactive power support, reducing losses at both the distribution and transmission levels. Different types of setpoints are investigated, showing that it is feasible for the DSO to also fulfill individual voltage and reactive power targets over multiple connection points. Finally, some suggestions are presented to achieve a fair coordination combining both TSO and DSO requirements.
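    A toy version of the DSO-side subproblem can make the loss-reduction argument concrete: choose a wind farm's reactive setpoint to minimize feeder losses within its capability limit. The single-line, constant-voltage model and every number below are assumptions, a crude stand-in for the paper's full multi-objective OPF.

```python
# Toy DSO subproblem: choose a wind farm's reactive setpoint q to
# minimize feeder losses, within the farm's capability limit.
# Single line, constant voltage; all values are assumed.

P_LOAD, Q_LOAD = 5.0, 2.5     # MW, MVAr drawn at the far bus (assumed)
R, V = 0.02, 1.0              # p.u. line resistance and voltage
Q_MAX = 1.5                   # farm reactive capability (assumed)

def losses(q_wind):
    q_net = Q_LOAD - q_wind   # reactive power still imported
    return R * (P_LOAD**2 + q_net**2) / V**2

def ternary_search(f, lo, hi, tol=1e-6):
    """Minimize a unimodal function on [lo, hi]."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

q_opt = ternary_search(losses, 0.0, Q_MAX)
print(round(q_opt, 3))  # capability-limited: sticks at Q_MAX = 1.5
```

    Because the assumed load draws more reactive power than the farm can supply, the optimum sits on the capability limit, mirroring how local compensation reduces imported reactive flow and hence losses.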

  4. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as its estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decrease in working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the proportion of automation among all work processes or facilities. This proportion is straightforward to express, and a higher automation rate has conventionally been expected to indicate a greater enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method: one advantage is a reduction in the number of tasks, and another is a reduction in human cognitive task loads. The system and cognitive automation rates were proposed as quantitative measures by taking advantage of these benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
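    The information-theoretic flavor of the quantification above can be illustrated with a very crude proxy: the Shannon entropy of an operator's task stream as a stand-in for required cognitive task load. This is only inspired by the cited information-theoretic approach; it is not Conant's actual decomposition, and the task lists are invented.

```python
import math
from collections import Counter

# Shannon entropy of an operator's task stream, used here as a crude
# proxy for required cognitive task load.  Only inspired by the
# information-theoretic model cited in the abstract; not Conant's
# actual method.  Task sequences are invented for illustration.

def entropy_bits(tasks):
    counts = Counter(tasks)
    n = len(tasks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

manual    = ["read", "judge", "act", "verify", "judge", "act"]
automated = ["verify", "verify", "verify", "verify", "judge", "verify"]

# Fewer distinct, more repetitive tasks => lower entropy => (proxy)
# lower cognitive load after cognitive automation is introduced.
print(entropy_bits(manual) > entropy_bits(automated))  # → True
```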

  5. Electrical defibrillation optimization: An automated, iterative parallel finite-element approach

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, S.A.; Shadid, J.N. [Sandia National Lab., Albuquerque, NM (United States); Ng, K.T. [New Mexico State Univ., Las Cruces, NM (United States); Nadeem, A. [Univ. of Pittsburgh, PA (United States)

    1997-04-01

    To date, optimization of electrode systems for electrical defibrillation has been limited to hand-selected electrode configurations. In this paper we present an automated approach which combines detailed, three-dimensional (3-D) finite-element torso models with optimization techniques to provide a flexible analysis and design tool for electrical defibrillation optimization. Specifically, a parallel direct search (PDS) optimization technique is used with a representative objective function to find an electrode configuration that satisfies a postulated defibrillation criterion with a minimum amount of power and a low possibility of myocardium damage. For adequate representation of the thoracic inhomogeneities, 3-D finite-element torso models are used in the objective function computations. The CPU-intensive finite-element calculations required for the objective function evaluation have been implemented on a message-passing parallel computer in order to complete the optimization calculations in a timely manner. To illustrate the optimization procedure, it has been applied to a representative electrode configuration for transmyocardial defibrillation, namely the subcutaneous patch-right ventricular catheter (SP-RVC) system. Sensitivity of the optimal solutions to various tissue conductivities has been studied. 39 refs., 9 figs., 2 tabs.

  6. SIFT optimization and automation for matching images from multiple temporal sources

    Science.gov (United States)

    Castillo-Carrión, Sebastián; Guerrero-Ginel, José-Emilio

    2017-05-01

    Scale Invariant Feature Transform (SIFT) was applied to extract tie-points from multiple-source images. Although SIFT is reported to perform reliably under widely different radiometric and geometric conditions, using the default input parameters resulted in too few points being found. We found that the best solution was to focus on large features, as these are more robust and less prone to scene changes over time; this constitutes a first approach to automating processes in mapping applications such as geometric correction, orthophoto creation, and 3D model generation. The optimization of five key SIFT parameters is proposed as a way of increasing the number of correct matches. The performance of SIFT is explored across different images and parameter values, finding optimal values that are then corroborated using separate validation imagery. The results show that the optimization model improves the performance of SIFT in correlating multitemporal images captured from different sources.
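    A parameter search of the kind described above usually reduces to a grid-search harness around a scoring callback. In the sketch below the scorer is a hypothetical stand-in; in a real pipeline it would run SIFT (e.g. via OpenCV) with the given parameters on an image pair and count geometrically verified matches. Parameter names and the response surface are assumptions.

```python
from itertools import product

# Grid-search harness in the spirit of the paper's five-parameter SIFT
# tuning.  `count_correct_matches` is a stand-in scorer; the peak at
# (0.02, 10, 4) is invented for illustration.

PARAM_GRID = {
    "contrast_threshold": [0.02, 0.04, 0.08],
    "edge_threshold":     [5, 10, 20],
    "n_octave_layers":    [3, 4, 5],
}

def count_correct_matches(params):
    # Hypothetical response surface peaking at (0.02, 10, 4).
    score = 100
    score -= 500 * abs(params["contrast_threshold"] - 0.02)
    score -= 2 * abs(params["edge_threshold"] - 10)
    score -= 10 * abs(params["n_octave_layers"] - 4)
    return score

def tune(grid, scorer):
    """Exhaustively evaluate the grid, returning the best setting."""
    keys = list(grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        s = scorer(params)
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score

best, score = tune(PARAM_GRID, count_correct_matches)
print(best)
```

    Swapping the scorer for a real SIFT match counter, and validating the winning setting on held-out imagery, reproduces the shape of the workflow the abstract describes.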

  7. A novel optimal coordinated control strategy for the updated robot system for single port surgery.

    Science.gov (United States)

    Bai, Weibang; Cao, Qixin; Leng, Chuntao; Cao, Yang; Fujie, Masakatsu G; Pan, Tiewen

    2017-09-01

    Research into robotic systems for single port surgery (SPS) has become widespread around the world in recent years. A new robot arm system for SPS was developed, but its positioning platform and other hardware components were not efficient. Special features of the developed surgical robot system make safe and efficient teleoperation difficult. A robot arm is combined and used as the new positioning platform, and remote center motion is realized by a new method using active motion control. A new mapping strategy based on kinematics computation is developed, together with a novel optimal coordinated control strategy based on real-time approach to a defined anthropopathic criterion configuration, which refers to the natural ease state of human arms and especially the configuration of a boxer's habitual preparation posture. The hardware components, control architecture, control system, and mapping strategy of the robotic system have been updated. A novel optimal coordinated control strategy is proposed and tested. The new robot system can be more dexterous, intelligent, convenient, and safer for preoperative positioning and intraoperative adjustment. The mapping strategy achieves good following and representation for the slave manipulator arms, and the proposed control strategy enables them to complete tasks with higher maneuverability and a lower possibility of self-interference, while remaining singularity-free during teleoperation. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Optimization and coordination of South-to-North Water Diversion supply chain with strategic customer behavior

    Directory of Open Access Journals (Sweden)

    Zhi-song Chen

    2012-12-01

    Full Text Available The South-to-North Water Diversion (SNWD) Project is a significant engineering project meant to solve water shortage problems in North China. Faced with market operations management of the water diversion system, this study defined the supply chain system for the SNWD Project considering the actual project conditions, built a decentralized decision model and a centralized decision model with strategic customer behavior (SCB) using a floating pricing mechanism (FPM), and constructed a coordination mechanism via a revenue-sharing contract. The results suggest the following: (1) owing to water shortage supplements and the excess water sale policy provided by the FPM, the optimal ordering quantity of water resources is less than that without the FPM, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without the FPM; (2) wholesale pricing and supplementary wholesale pricing with SCB are higher than those without SCB, and the optimal profits of the whole supply chain, supplier, and external distributor are higher than they would be without SCB; and (3) considering SCB and introducing the FPM help increase the optimal profits of the whole supply chain, supplier, and external distributor, and improve the efficiency of water resource usage.

  9. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Gurcan, Metin N.; Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir; Petrick, Nicholas

    2001-01-01

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, steepest descent (SD), simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization: the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area A_z under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal numbers of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzmann schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost
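    The simulated annealing search over a small discrete architecture grid can be sketched compactly. The two-parameter grid and the surrogate cost below are invented stand-ins (the real cost would be an A_z from trained networks), but the Boltzmann acceptance rule is the standard mechanism the abstract refers to.

```python
import math
import random

# Simulated annealing over a discrete grid of CNN "architectures",
# mirroring the paper's 432-point search space in spirit.  The cost
# function is a made-up surrogate, not an A_z from trained networks.

NODE_GROUPS = list(range(2, 14))      # hypothetical choices
KERNEL_SIZES = [3, 5, 7, 9, 11, 13]

def cost(arch):
    groups, kernel = arch
    # Surrogate with a single minimum at (8 groups, kernel 7).
    return (groups - 8) ** 2 + (kernel - 7) ** 2

def neighbor(arch, rng):
    groups, kernel = arch
    if rng.random() < 0.5:
        groups = rng.choice(NODE_GROUPS)
    else:
        kernel = rng.choice(KERNEL_SIZES)
    return (groups, kernel)

def anneal(t0=10.0, cooling=0.95, steps=500, seed=0):
    rng = random.Random(seed)
    arch = (NODE_GROUPS[0], KERNEL_SIZES[0])
    best = arch
    t = t0
    for _ in range(steps):
        cand = neighbor(arch, rng)
        delta = cost(cand) - cost(arch)
        # Boltzmann acceptance: always take improvements, sometimes
        # accept uphill moves to escape local minima.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            arch = cand
        if cost(arch) < cost(best):
            best = arch
        t *= cooling
    return best

print(anneal())  # → (8, 7)
```

    On a surrogate this smooth any method succeeds; the paper's point is how the methods differ on a rugged cost surface with local minima.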

  10. Modular high power diode lasers with flexible 3D multiplexing arrangement optimized for automated manufacturing

    Science.gov (United States)

    Könning, Tobias; Bayer, Andreas; Plappert, Nora; Faßbender, Wilhelm; Dürsch, Sascha; Küster, Matthias; Hubrich, Ralf; Wolf, Paul; Köhler, Bernd; Biesenbach, Jens

    2018-02-01

    A novel 3-dimensional arrangement of mirrors is used to re-arrange beams from 1-D and 2-D high power diode laser arrays. The approach allows for a variety of stacking geometries, depending on individual requirements. While basic building blocks, including collimating optics, always remain the same, most adaptations can be realized by simple rearrangement of a few optical components. Because the alignment processes are fully automated, the required changes can be realized in software by changing coordinates, rather than requiring customized mechanical components. This approach minimizes development costs due to its flexibility, while reducing overall product cost by using similar building blocks for a variety of products and utilizing a high grade of automation. The modules can be operated with industrial-grade water, lowering overall system and maintenance costs. Stackable macro coolers are used as the smallest building block of the system. Each cooler can hold up to five diode laser bars. Micro-optical components, collimating the beam, are mounted directly to the cooler. All optical assembly steps are fully automated. Initially, the beams from all laser bars propagate in the same direction. Key to the concept is an arrangement of deflectors, which re-arrange the beams into a 2-D array of the desired shape with a high fill factor. Standard multiplexing techniques such as polarization or wavelength multiplexing have been implemented as well. A variety of fiber-coupled modules, ranging from a few hundred watts of optical output power to multiple kilowatts, as well as customized laser spot geometries such as uniform line sources, have been realized.

  11. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-08

    We present a simple protocol that allows fully automated discovery of elementary chemical reaction steps using double- and single-ended transition-state optimization algorithms in cooperation: the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of importance in combustion and atmospheric chemistry is investigated. The proposed algorithm allowed us to detect, without any human intervention, not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown" reaction pathways that involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  12. Global optimal hybrid geometric active contour for automated lung segmentation on CT images.

    Science.gov (United States)

    Zhang, Weihang; Wang, Xue; Zhang, Pengbo; Chen, Junfeng

    2017-12-01

    Lung segmentation on thoracic CT images plays an important role in early detection, diagnosis and 3D visualization of lung cancer. The segmentation accuracy, stability, and efficiency of serial CT scans have a significant impact on the performance of computer-aided detection. This paper proposes a global optimal hybrid geometric active contour model for automated lung segmentation on CT images. Firstly, the combination of global region and edge information leads to high segmentation accuracy in lung regions with weak boundaries or narrow bands. Secondly, due to the global optimality of energy functional, the proposed model is robust to the initial position of level set function and requires fewer iterations. Thus, the stability and efficiency of lung segmentation on serial CT slices can be greatly improved by taking advantage of the information between adjacent slices. In addition, to achieve the whole process of automated segmentation for lung cancer, two assistant algorithms based on prior shape and anatomical knowledge are proposed. The algorithms not only automatically separate the left and right lungs, but also include juxta-pleural tumors into the segmentation result. The proposed method was quantitatively validated on subjects from the publicly available LIDC-IDRI and our own data sets. Exhaustive experimental results demonstrate the superiority and competency of our method, especially compared with the typical edge-based geometric active contour model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Coordination Between the Sexes Constrains the Optimization of Reproductive Timing in Honey Bee Colonies.

    Science.gov (United States)

    Lemanski, Natalie J; Fefferman, Nina H

    2017-06-01

    Honey bees are an excellent model system for examining how trade-offs shape reproductive timing in organisms with seasonal environments. Honey bee colonies reproduce in two ways: by producing swarms comprising a queen and thousands of workers, or by producing males (drones). There is an energetic trade-off between producing workers, which contribute to colony growth, and drones, which contribute only to reproduction. The timing of drone production therefore determines both the drones' likelihood of mating and when colonies reach sufficient size to swarm. Using a linear programming model, we ask when a colony should produce drones and swarms to maximize reproductive success. We find the optimal behavior for each colony is to produce all drones prior to swarming, an impossible solution on a population scale because queens and drones would never co-occur. Reproductive timing is therefore not solely determined by energetic trade-offs but by the game-theoretic problem of coordinating the production of reproductives among colonies.
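    The worker/drone trade-off can be made concrete with a tiny discrete allocation toy. All growth and fitness numbers below are invented; the paper's actual model is a linear program over a seasonal mating window, not this enumeration.

```python
from itertools import product

# Tiny discrete version of the colony's allocation problem: each week,
# brood is split between workers (colony growth) and drones (direct
# reproduction).  All numbers are invented for illustration.

WEEKS = 4
GROWTH = 0.6          # colony growth per worker-unit of brood (assumed)

def fitness(plan):
    """plan[i] = fraction of week-i brood devoted to drones."""
    size, drones = 1.0, 0.0
    for f in plan:
        brood = size                      # brood output scales with size
        drones += f * brood
        size += GROWTH * (1 - f) * brood
    swarm_value = size                    # a bigger colony swarms better
    return drones + swarm_value

plans = product([0.0, 0.5, 1.0], repeat=WEEKS)
best = max(plans, key=fitness)
print(best, round(fitness(best), 3))  # → (0.0, 0.0, 0.0, 1.0) 8.192
```

    Even in this toy, the single-colony optimum is "grow first, then switch entirely to drones just before the end", echoing the paper's result that the individually optimal schedule cannot hold population-wide, since queens and drones would then never co-occur.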

  14. MAS-based Distributed Coordinated Control and Optimization in Microgrid and Microgrid Clusters: A Comprehensive Overview

    DEFF Research Database (Denmark)

    Han, Yang; Zhang, Ke; Hong, Li

    2018-01-01

    the power and energy, stabilize voltage and frequency, achieve economic and coordinated operation among the MGs and MGCs. However, the complex and diverse combinations of distributed generations in multi-agent system increase the complexity of system control and operation. In order to design the optimized...... the consensus is a vital problem in the complex dynamical systems, the distributed MAS-based consensus protocols are systematically reviewed. On the other hand, the communication delay issue, which is inevitable no matter in the low- or high-bandwidth communication networks, is crucial to maintain stability...... of the MGs and MGCs with fixed and random delays. Various control strategies to compensate the effect of communication delays have been reviewed, such as the neural network-based predictive control, the weighted average predictive control, the gain scheduling scheme and synchronization schemes based...
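    The consensus protocols this survey reviews are typically first-order discrete-time updates of the form x_i(k+1) = x_i(k) + eps * sum_j a_ij (x_j(k) - x_i(k)). A minimal sketch on an arbitrary four-agent ring (the graph, gains, and initial values are assumptions, not from any cited microgrid):

```python
# First-order discrete-time consensus update on a small communication
# graph.  The ring topology, step size, and initial values are
# arbitrary illustrations, not from any cited microgrid system.

EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]   # ring of 4 agents
EPS = 0.2                                   # step size; needs EPS < 1/d_max = 0.5 here

def consensus_step(x):
    nxt = x[:]
    for i, j in EDGES:                      # undirected links
        nxt[i] += EPS * (x[j] - x[i])
        nxt[j] += EPS * (x[i] - x[j])
    return nxt

x = [1.0, 5.0, 3.0, 7.0]                    # e.g. local frequency estimates
for _ in range(200):
    x = consensus_step(x)

print([round(v, 4) for v in x])  # all agents converge to the average, 4.0
```

    Because the update matrix is doubly stochastic, the agents converge to the average of their initial values; the communication delays discussed in the abstract perturb exactly this kind of iteration.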

  15. An automated analysis workflow for optimization of force-field parameters using neutron scattering data

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, Vickie E.; Borreguero, Jose M. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Bhowmik, Debsindhu [Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Ganesh, Panchapakesan; Sumpter, Bobby G. [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Proffen, Thomas E. [Neutron Data Analysis & Visualization Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Goswami, Monojoy, E-mail: goswamim@ornl.gov [Center for Nanophase Material Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States); Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831 (United States)

    2017-07-01

    Graphical abstract: - Highlights: • An automated workflow to optimize force-field parameters. • Used the workflow to optimize force-field parameters for a system containing nanodiamonds and tRNA. • The mechanism relies on molecular dynamics simulation and neutron scattering experimental data. • The workflow can be generalized to other experimental and simulation techniques. - Abstract: Large-scale simulations and data analysis are often required to explain neutron scattering experiments and to establish a connection between the fundamental physics at the nanoscale and the data probed by neutrons. However, to perform simulations under experimental conditions it is critical to use correct force-field (FF) parameters, which are unfortunately not available for most complex experimental systems. In this work, we have developed a workflow optimization technique to provide optimized FF parameters by comparing molecular dynamics (MD) to neutron scattering data. We describe the workflow in detail using an example system consisting of tRNA and hydrophilic nanodiamonds in a deuterated water (D{sub 2}O) environment. Quasi-elastic neutron scattering (QENS) data show a faster motion of the tRNA in the presence of nanodiamonds than without them. To compare the QENS and MD results quantitatively, a proper choice of FF parameters is necessary. We use an efficient workflow to optimize the FF parameters between the hydrophilic nanodiamond and water by comparing to the QENS data. Our results show that we can obtain accurate FF parameters by using this technique. The workflow can be generalized to other types of neutron data for FF optimization, such as vibrational spectroscopy and spin echo.
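    The core of such a workflow is a calibration loop: adjust a force-field parameter until a simulated observable matches its experimental counterpart. In the sketch below the "simulation" is a placeholder analytic model standing in for an MD run plus a computed QENS spectrum; the target rate and parameter bounds are assumptions.

```python
# Skeleton of the fit loop: adjust one force-field parameter until a
# simulated observable matches its experimental counterpart.  The
# "simulation" here is a placeholder analytic model, standing in for a
# real MD run plus a computed QENS spectrum; all numbers are assumed.

TARGET_RATE = 0.8    # hypothetical relaxation rate from QENS

def simulated_rate(epsilon):
    # Placeholder: a stronger nanodiamond-water interaction slows the
    # observed dynamics.  Monotone in epsilon, so bisection applies.
    return 1.0 / (1.0 + epsilon)

def calibrate(lo=0.0, hi=5.0, tol=1e-9):
    """Bisect on epsilon until the simulated rate matches the target."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if simulated_rate(mid) > TARGET_RATE:
            lo = mid          # dynamics still too fast: strengthen
        else:
            hi = mid
    return (lo + hi) / 2

eps = calibrate()
print(round(eps, 4))  # → 0.25
```

    In the real workflow each `simulated_rate` evaluation is an expensive MD simulation, which is why an efficient, automated outer loop matters.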

  16. A Bi-Level Particle Swarm Optimization Algorithm for Solving Unit Commitment Problems with Wind-EVs Coordinated Dispatch

    Science.gov (United States)

    Song, Lei; Zhang, Bo

    2017-07-01

    Nowadays, the grid faces many more challenges caused by wind power and the increasing penetration of electric vehicles (EVs). Based on the potential of coordinated dispatch, a model of wind-EVs coordinated dispatch was developed. Then, a bi-level particle swarm optimization algorithm for solving the model was proposed in this paper. The application of this algorithm to a 10-unit test system showed that coordinated dispatch can benefit the power system in the following aspects: (1) reducing operating costs; (2) improving the utilization of wind power; (3) stabilizing the peak-valley difference.
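The particle swarm optimization at the core of such an algorithm can be illustrated with a minimal single-level sketch (the paper's bi-level variant nests one such search inside another). The sphere function used below is only a placeholder objective, not the unit-commitment cost:

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer for a scalar function f over box
    bounds [(lo, hi), ...]. A sketch, not the paper's bi-level variant."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_val = [f(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < f(gbest):
                    gbest = pos[i][:]
    return gbest

# Placeholder objective: minimize a 2-D sphere function
best = pso(lambda x: x[0] ** 2 + x[1] ** 2, [(-5, 5), (-5, 5)])
```

In a bi-level scheme, the upper-level swarm would search unit on/off schedules while a lower-level swarm dispatches the committed units and EVs for each candidate schedule.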

  17. Simple Automated NGS Library Construction Using Optimized NEBNext(R) Reagents and a Caliper Sciclone

    Science.gov (United States)

    Dimalanta, E.; Stewart, F.; Barry, A.; Meek, I.; Apone, L.; Liu, P.; Munafo, D.; Davis, T.; Sumner, Christine

    2012-01-01

    While next generation sequencing technologies are continually evolving to increase data output, sequence-ready library preparation significantly lags behind in scale. The multi-step scheme of library construction and gel-based size selection limits the number of samples that can be processed manually without introducing handling errors. Moreover, processing multiple samples is extremely time consuming. Our objective here was to address these issues by developing an automated library construction process for NGS platforms. Specifically, we optimized a library construction workflow utilizing NEBNext® reagents in conjunction with the Sciclone NGS liquid handling workstation. In addition, specific reagent configuration designs were tested for ease-of-use. Key considerations in the design of the reagent kits included the elimination of manual pipetting steps in setting up the instrument, reagent storage compatibility, the premixing of components for the various enzymatic steps, and the reduction of reagent dead-volume. As a result of this work, we have developed a cost-effective automated process that is scalable from 8 to 96 samples with minimal hands-on time.

  18. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the development of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With the aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents: PnP Process Monitoring and Control Architecture; Real-Time Configuration Techniques for PnP Process Monitoring; Real-Time Configuration Techniques for PnP Performance Optimization; Benchmark Study and Real-Time Implementation. Target Groups: Researchers and students of Automation and Control Engineering; Practitioners in the area of Industrial and Production Engineering. The Author: Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  19. Automated optimization and construction of chemometric models based on highly variable raw chromatographic data.

    Science.gov (United States)

    Sinkov, Nikolai A; Johnston, Brandon M; Sandercock, P Mark L; Harynuk, James J

    2011-07-04

    Direct chemometric interpretation of raw chromatographic data (as opposed to integrated peak tables) has been shown to be advantageous in many circumstances. However, this approach presents two significant challenges: data alignment and feature selection. In order to interpret the data, the time axes must be precisely aligned so that the signal from each analyte is recorded at the same coordinates in the data matrix for each and every analyzed sample. Several alignment approaches exist in the literature, and they work well when the samples being aligned are reasonably similar. In cases where the background matrix for a series of samples to be modeled is highly variable, the performance of these approaches suffers. Considering the challenge of feature selection: when the raw data are used, each signal at each time is viewed as an individual, independent variable; with the data rates of modern chromatographic systems, this generates hundreds of thousands of candidate variables, or tens of millions of candidate variables if multivariate detectors such as mass spectrometers are utilized. Consequently, an automated approach to identify and select appropriate variables for inclusion in a model is desirable. In this research we present an alignment approach that relies on a series of deuterated alkanes which act as retention anchors for an alignment signal, and couple this with an automated feature selection routine based on our novel cluster resolution metric for the construction of a chemometric model. The model system that we use to demonstrate these approaches is a series of simulated arson debris samples analyzed by passive headspace extraction, GC-MS, and interpreted using partial least squares discriminant analysis (PLS-DA). Copyright © 2011 Elsevier B.V. All rights reserved.
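The retention-anchor idea can be illustrated with a piecewise-linear time mapping: each sample's observed anchor times (the deuterated alkanes) are matched to reference anchor times, and every other retention time is interpolated between them. A minimal sketch under that assumption (the paper's actual alignment signal is more elaborate):

```python
def align_time(t, observed_anchors, reference_anchors):
    """Map a retention time t from a sample's time axis onto the reference
    axis by piecewise-linear interpolation between anchor compounds.
    Times outside the anchor range are linearly extrapolated."""
    pairs = sorted(zip(observed_anchors, reference_anchors))
    obs = [p[0] for p in pairs]
    ref = [p[1] for p in pairs]
    if t <= obs[0]:
        i = 0                       # extrapolate from the first segment
    elif t >= obs[-1]:
        i = len(obs) - 2            # extrapolate from the last segment
    else:
        i = next(k for k in range(len(obs) - 1) if obs[k] <= t <= obs[k + 1])
    frac = (t - obs[i]) / (obs[i + 1] - obs[i])
    return ref[i] + frac * (ref[i + 1] - ref[i])
```

For example, if a sample's anchors elute at 1.0, 2.0, and 4.0 min where the reference expects 1.0, 2.0, and 3.0 min, a peak at 3.0 min maps to 2.5 min on the reference axis.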

  20. Microseismic event location using global optimization algorithms: An integrated and automated workflow

    Science.gov (United States)

    Lagos, Soledad R.; Velis, Danilo R.

    2018-02-01

    We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the different steps that lead to the location of microseismic events from raw 3C data. First, we carry out the automatic detection, denoising, and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for usual 2D and 3D scenarios of hydraulic fracturing processes. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
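The baseline grid search over candidate source positions can be sketched as follows, assuming a homogeneous-velocity toy model and using arrival-time differences so the unknown origin time cancels. The real workflow replaces the exhaustive scan with VFSA or PSO over the same misfit, which is where the speed-up comes from:

```python
import math

def travel_time(src, rec, v=2.0):
    """P-wave travel time in a homogeneous 2-D medium (toy model)."""
    return math.hypot(src[0] - rec[0], src[1] - rec[1]) / v

def locate_grid_search(receivers, picks, xs, ys, v=2.0):
    """Classical grid search: scan candidate (x, y) sources and keep the one
    minimizing the misfit between predicted and picked arrival times.
    Times are taken relative to the first receiver so the unknown event
    origin time cancels out of the misfit."""
    best, best_err = None, float("inf")
    for x in xs:
        for y in ys:
            pred = [travel_time((x, y), r, v) for r in receivers]
            err = sum((p - pred[0] - (t - picks[0])) ** 2
                      for p, t in zip(pred, picks))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Synthetic example: four surface receivers, true source at (3, 4)
receivers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
picks = [travel_time((3.0, 4.0), r) for r in receivers]
grid = [i * 0.5 for i in range(21)]          # 0.0 .. 10.0 in 0.5 steps
loc = locate_grid_search(receivers, picks, grid, grid)
```

VFSA or PSO would evaluate the same `err` misfit but visit far fewer candidate points than the full `len(xs) * len(ys)` scan.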

  1. SBROME: a scalable optimization and module matching framework for automated biosystems design.

    Science.gov (United States)

    Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias

    2013-05-17

    The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.

  2. Automated procedure for selection of optimal refueling policies for light water reactors

    International Nuclear Information System (INIS)

    Lin, B.I.; Zolotar, B.; Weisman, J.

    1979-01-01

    An automated procedure determining a minimum cost refueling policy has been developed for light water reactors. The procedure is an extension of the equilibrium core approach previously devised for pressurized water reactors (PWRs). Use of 1 1/2-group theory has improved the accuracy of the nuclear model and eliminated tedious fitting of albedos. A simple heuristic algorithm for locating a good starting policy has materially reduced PWR computing time. Inclusion of void effects and use of the Haling principle for axial flux calculations extended the nuclear model to boiling water reactors (BWRs). A good initial estimate of the refueling policy is obtained by recognizing that a nearly uniform distribution of reactivity provides low-power peaking. The initial estimate is improved upon by interchanging groups of four assemblies and is subsequently refined by interchanging individual assemblies. The method yields very favorable results, is simpler than previously proposed BWR fuel optimization schemes, and retains power cost as the objective function
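The interchange refinement step can be illustrated with a greedy pairwise-swap local search. The `peaking` objective below is a toy stand-in (worst adjacent-pair reactivity sum in a 1-D "core"), not the paper's nuclear model:

```python
def improve_by_interchange(layout, objective):
    """Greedy pairwise interchange: keep swapping two assemblies whenever the
    swap lowers the objective, until no improving swap remains. A toy
    analogue of the paper's assembly-interchange refinement."""
    layout = list(layout)
    current = objective(layout)
    improved = True
    while improved:
        improved = False
        for i in range(len(layout)):
            for j in range(i + 1, len(layout)):
                layout[i], layout[j] = layout[j], layout[i]   # trial swap
                trial = objective(layout)
                if trial < current:
                    current, improved = trial, True           # keep the swap
                else:
                    layout[i], layout[j] = layout[j], layout[i]  # revert
    return layout, current

def peaking(core):
    """Toy power-peaking proxy: worst adjacent-pair reactivity sum
    (two high-reactivity neighbors produce a local power peak)."""
    return max(core[k] + core[k + 1] for k in range(len(core) - 1))

initial = [1.3, 1.2, 1.1, 1.0, 0.9, 0.8]      # assemblies sorted by reactivity
arranged, peak = improve_by_interchange(initial, peaking)
```

The search interleaves high- and low-reactivity assemblies, mirroring the paper's observation that a nearly uniform reactivity distribution yields low power peaking.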

  3. Optimizing the balance between task automation and human manual control in simulated submarine track management.

    Science.gov (United States)

    Chen, Stephanie I; Visser, Troy A W; Huf, Samuel; Loft, Shayne

    2017-09-01

    Automation can improve operator performance and reduce workload, but can also degrade operator situation awareness (SA) and the ability to regain manual control. In 3 experiments, we examined the extent to which automation could be designed to benefit performance while ensuring that individuals maintained SA and could regain manual control. Participants completed a simulated submarine track management task under varying task load. The automation was designed to facilitate information acquisition and analysis, but did not make task decisions. Relative to a condition with no automation, the continuous use of automation improved performance and reduced subjective workload, but degraded SA. Automation that was engaged and disengaged by participants as required (adaptable automation) moderately improved performance and reduced workload relative to no automation, but degraded SA. Automation engaged and disengaged based on task load (adaptive automation) provided no benefit to performance or workload, and degraded SA relative to no automation. Automation never led to significant return-to-manual deficits. However, all types of automation led to degraded performance on a nonautomated task that shared information processing requirements with automated tasks. Given these outcomes, further research is urgently required to establish how to design automation to maximize performance while keeping operators cognitively engaged. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    Full Text Available Abstract The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, the electrical power engineers in many regions of the world implement manual methods to measure power consumption for further assessment of voltage violations. Such a process has proved to be time-consuming, costly, and inaccurate. Also, demand response is a grid-management technique in which retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would occur when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. As a result, the DPL program considers all the IEEE 30-bus internal network data and then executes a load flow simulation to add the new load to the first bus in the network. The developed model then simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under voltage and the loading of elements across the grid.
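The bus-sweep logic such a DPL script implements can be sketched as a plain loop: simulate the new load at each candidate bus bar and report any per-unit voltage violations. `toy_load_flow` below is a hypothetical stand-in for the DIgSILENT load flow, not its API:

```python
def check_new_load(candidate_buses, load_flow, v_min=0.95, v_max=1.05):
    """Sweep candidate placements for a new load: run a load flow for each
    candidate bus and collect per-unit voltage violations across the grid.
    load_flow(bus) is a stand-in returning {bus_id: voltage_pu}."""
    reports = {}
    for b in candidate_buses:
        voltages = load_flow(b)
        reports[b] = [(bus, v) for bus, v in sorted(voltages.items())
                      if not v_min <= v <= v_max]
    return reports

def toy_load_flow(new_load_bus):
    """Illustrative 3-bus network response: placing the load at bus 2
    depresses that bus's voltage below the lower limit."""
    base = {1: 1.00, 2: 0.98, 3: 1.01}
    if new_load_bus == 2:
        base[2] = 0.93   # violation below the 0.95 p.u. limit
    return base

report = check_new_load([1, 2, 3], toy_load_flow)
```

An empty violation list for a candidate bus marks it as a feasible location for the new load; in the paper this check is run against the full IEEE 30-bus model.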

  5. Optimization of Control Points Number at Coordinate Measurements based on the Monte-Carlo Method

    Science.gov (United States)

    Korolev, A. A.; Kochetkov, A. V.; Zakharov, O. V.

    2018-01-01

    Improving the quality of products causes an increase in the requirements for the accuracy of the dimensions and shape of the surfaces of workpieces. This, in turn, raises the requirements for the accuracy and productivity of workpiece measurement. The use of coordinate measuring machines is currently the most effective measuring approach for solving such problems. The article proposes a method for optimizing the number of control points using Monte Carlo simulation. Based on the measurement of a small sample from batches of workpieces, statistical modeling is performed, which allows one to obtain interval estimates of the measurement error. This approach is demonstrated with example applications for flatness, cylindricity, and sphericity. Four options of uniform and uneven arrangement of control points are considered and compared. It is revealed that as the number of control points decreases, the arithmetic mean decreases, the standard deviation of the measurement error increases, and the probability of a measurement α-error increases. In general, it has been established that the number of control points can be reduced severalfold while maintaining the required measurement accuracy.
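The Monte Carlo idea can be illustrated with a toy flatness model: sample n control points from an assumed surface-deviation distribution plus probe noise, and observe how the estimated flatness error behaves as n changes. All distributions and constants below are illustrative assumptions, not the paper's model:

```python
import random
import statistics

def simulate_flatness_error(n_points, sigma=0.01, n_trials=2000,
                            true_flatness=0.05):
    """Monte-Carlo estimate of how sampling n control points affects a
    flatness measurement. Toy model: surface deviation uniform in
    [0, true_flatness] plus Gaussian probe noise; measured flatness is
    the range (max - min) of the sampled points. Returns the mean and
    standard deviation of the measurement error over n_trials."""
    errors = []
    for _ in range(n_trials):
        pts = [random.uniform(0, true_flatness) + random.gauss(0, sigma)
               for _ in range(n_points)]
        errors.append((max(pts) - min(pts)) - true_flatness)
    return statistics.mean(errors), statistics.stdev(errors)
```

With few control points the sampled range undershoots the true flatness (the extremes are rarely hit), so the mean error is systematically lower than with many points, echoing the paper's finding that the statistics shift as the point count decreases.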

  6. Optimal coordinated scheduling of combined heat and power fuel cell, wind, and photovoltaic units in micro grids considering uncertainties

    International Nuclear Information System (INIS)

    Bornapour, Mosayeb; Hooshmand, Rahmat-Allah; Khodabakhshian, Amin; Parastegari, Moein

    2016-01-01

    In this paper, a stochastic model is proposed for the coordinated scheduling of combined heat and power units in a micro grid considering wind turbine and photovoltaic units. Uncertainties in the electricity market price, wind speed, and solar radiation are considered using a scenario-based method. In this method, scenarios are generated using a roulette wheel mechanism based on the probability distribution functions of the input random variables. Using this method, the probabilistic characteristics of the problem are captured and the problem is converted to a deterministic one. The nature of the objective function and the coordinated scheduling of combined heat and power, wind turbine, and photovoltaic units make this problem a mixed-integer nonlinear one; therefore, a modified particle swarm optimization algorithm is employed to solve it. The mentioned uncertainties lead to an increase in profit. Moreover, the optimal coordinated scheduling of renewable energy resources and thermal units in micro grids increases the total profit. In order to evaluate the performance of the proposed method, it is executed on a modified 33-bus distribution system operating as a micro grid. - Highlights: • A stochastic model is proposed for coordinated scheduling of renewable energy sources. • The effect of combined heat and power is considered. • Maximizing the profit of the micro grid is the objective function. • Considering the uncertainties of the problem leads to increased profit. • Optimal scheduling of renewable energy sources and thermal units increases profit.
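The roulette-wheel mechanism for scenario generation can be sketched directly from the cumulative probabilities of discrete scenarios. A minimal sketch: the paper draws from full probability distribution functions of price, wind speed, and radiation, whereas here the scenarios and probabilities are illustrative:

```python
import random

def roulette_wheel(scenarios, probs, n_draws):
    """Roulette-wheel selection: draw scenarios with frequency proportional
    to their probabilities, by spinning a pointer over the cumulative
    probability 'wheel'."""
    cum, total = [], 0.0
    for p in probs:
        total += p
        cum.append(total)          # cumulative distribution
    draws = []
    for _ in range(n_draws):
        r = random.uniform(0, total)
        for s, c in zip(scenarios, cum):
            if r <= c:             # first sector the pointer lands in
                draws.append(s)
                break
    return draws

# Illustrative wind-speed scenarios with assumed probabilities
samples = roulette_wheel(['low', 'mid', 'high'], [0.2, 0.5, 0.3], 1000)
```

Each drawn scenario is then scheduled deterministically, and the profits are weighted back by the scenario probabilities.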

  7. Automated selection of the optimal cardiac phase for single-beat coronary CT angiography reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Stassi, D.; Ma, H.; Schmidt, T. G., E-mail: taly.gilat-schmidt@marquette.edu [Department of Biomedical Engineering, Marquette University, Milwaukee, Wisconsin 53201 (United States); Dutta, S.; Soderman, A.; Pazzani, D.; Gros, E.; Okerlund, D. [GE Healthcare, Waukesha, Wisconsin 53188 (United States)

    2016-01-15

    Purpose: Reconstructing a low-motion cardiac phase is expected to improve coronary artery visualization in coronary computed tomography angiography (CCTA) exams. This study developed an automated algorithm for selecting the optimal cardiac phase for CCTA reconstruction. The algorithm uses prospectively gated, single-beat, multiphase data made possible by wide cone-beam imaging. The proposed algorithm differs from previous approaches because the optimal phase is identified based on vessel image quality (IQ) directly, compared to previous approaches that included motion estimation and interphase processing. Because there is no processing of interphase information, the algorithm can be applied to any sampling of image phases, making it suited for prospectively gated studies where only a subset of phases are available. Methods: An automated algorithm was developed to select the optimal phase based on quantitative IQ metrics. For each reconstructed slice at each reconstructed phase, an image quality metric was calculated based on measures of circularity and edge strength of through-plane vessels. The image quality metric was aggregated across slices, while a metric of vessel-location consistency was used to ignore slices that did not contain through-plane vessels. The algorithm performance was evaluated using two observer studies. Fourteen single-beat cardiac CT exams (Revolution CT, GE Healthcare, Chalfont St. Giles, UK) reconstructed at 2% intervals were evaluated for best systolic (1), diastolic (6), or systolic and diastolic phases (7) by three readers and the algorithm. Pairwise inter-reader and reader-algorithm agreement was evaluated using the mean absolute difference (MAD) and concordance correlation coefficient (CCC) between the reader and algorithm-selected phases. A reader-consensus best phase was determined and compared to the algorithm selected phase. In cases where the algorithm and consensus best phases differed by more than 2%, IQ was scored by three

  8. Multiscale Collaborative Optimization of Processing Parameters for Carbon Fiber/Epoxy Laminates Fabricated by High-Speed Automated Fiber Placement

    Directory of Open Access Journals (Sweden)

    Zhenyu Han

    2016-01-01

    Full Text Available Processing optimization is an important means of efficiently inhibiting manufacturing defects. However, processing optimization based on experiments or macroscopic theories in high-speed automated fiber placement (AFP) suffers from some restrictions, because the multiscale effects of laying tows and their manufacturing defects cannot be considered. In this paper, processing parameters, including compaction force, laying speed, and preheating temperature, are optimized by multiscale collaborative optimization in the AFP process. Firstly, a rational model between cracks and strain energy is revealed so that the formation possibility of cracks can be assessed using strain energy or its density. Following that, an antisequential hierarchical multiscale collaborative optimization method is presented to resolve the multiscale effects of structure and mechanical properties for laying tows and cracks in the high-speed automated fiber placement process. According to the above method, and taking carbon fiber/epoxy tow as an example, the multiscale mechanical properties of laying tows under different processing parameters are investigated through simulation, including recoverable strain energy (ALLSE) at the macroscale, strain energy density (SED) at the mesoscale, and interface absorbability and matrix fluidity at the microscale. Finally, the response surface method (RSM) is used to optimize the processing parameters. Two groups of processing parameters with higher desirability are obtained, achieving the purpose of multiscale collaborative optimization.

  9. Patient Dose Optimization in Fluoroscopically Guided Interventional Procedures. Final Report of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2010-01-01

    In recent years, many surgical procedures have increasingly been replaced by interventional procedures that guide catheters into the arteries under X ray fluoroscopic guidance to perform a variety of operations such as ballooning, embolization, implantation of stents etc. The radiation exposure to patients and staff in such procedures is much higher than in simple radiographic examinations like X ray of chest or abdomen, such that radiation induced skin injuries to patients and eye lens opacities among workers have been reported in the 1990s and after. Interventional procedures have grown both in frequency and importance during the last decade. This Coordinated Research Project (CRP) and TECDOC were developed within the International Atomic Energy Agency's (IAEA) framework of statutory responsibility to provide for the worldwide application of the standards for the protection of people against exposure to ionizing radiation. The CRP took place between 2003 and 2005 in six countries, with a view to optimizing the radiation protection of patients undergoing interventional procedures. The Fundamental Safety Principles and the International Basic Safety Standards for Protection against Ionizing Radiation (BSS) issued by the IAEA and co-sponsored by the Food and Agriculture Organization of the United Nations (FAO), the International Labour Organization (ILO), the World Health Organization (WHO), the Pan American Health Organization (PAHO) and the Nuclear Energy Agency (NEA), among others, require the radiation protection of patients undergoing medical exposures through justification of the procedures involved and through optimization. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients encourages the reduction of patient doses. To facilitate this, it has issued specific advice on the application of the BSS in the field of radiology in Safety Reports Series No. 39 and the three volumes on Radiation

  10. Optimizing human-system interface automation design based on a skill-rule-knowledge framework

    International Nuclear Information System (INIS)

    Lin, Chiuhsiang Joe; Yenn, T.-C.; Yang, C.-W.

    2010-01-01

    This study considers the technological change that has occurred in complex systems within the past 30 years, and the role of human operators in controlling and interacting with complex systems following that change. Modernization of instrumentation and control systems and components leads to a new issue of human-automation interaction, in which human operational performance must be considered in automated systems. Human-automation interaction can differ in its types and levels. A system design issue is usually posed: given these technical capabilities, which system functions should be automated, and to what extent? A good automation design can be achieved by making an appropriate human-automation function allocation. To our knowledge, only a few studies have been published on how to achieve appropriate automation design with a systematic procedure. Further, there is a surprising lack of information on examining and validating the influence of levels of automation (LOAs) on instrumentation and control systems in the advanced control room (ACR). The study presented in this paper proposes a systematic framework to help make appropriate decisions on the type of automation (TOA) and LOA, based on a 'Skill-Rule-Knowledge' (SRK) model. The evaluation results show that the use of either automatic or semiautomatic mode alone is insufficient to prevent human errors. To prevent the occurrence of human errors and ensure safety in the ACR, the proposed framework can be valuable for decision making in human-automation allocation.

  11. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems
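The sensitivity-derivative comparison can be illustrated on a scalar toy problem: for a system k(p)·u = f, implicit differentiation gives du/dp = -(dk/dp)·u/k, reusing the already-solved state instead of re-solving at perturbed parameters as finite differencing does. The quadratic k(p) below is an arbitrary illustrative choice, not the BEM system:

```python
def solve(k, f):
    """Stand-in for the BEM solve K u = f (scalar toy version)."""
    return f / k

def sensitivity_implicit(p, f):
    """Implicit differentiation of k(p) u = f with k(p) = 2 + p**2:
    du/dp = -(dk/dp) * u / k, reusing the already-solved state u."""
    k, dk = 2 + p ** 2, 2 * p
    u = solve(k, f)
    return -dk * u / k

def sensitivity_fd(p, f, h=1e-6):
    """Explicit central finite differencing: two extra solves per
    parameter, and accuracy limited by the step size h."""
    return (solve(2 + (p + h) ** 2, f) - solve(2 + (p - h) ** 2, f)) / (2 * h)
```

Both agree to within the finite-difference truncation error, but the implicit route avoids the extra solves — the source of the factor-of-three to factor-of-twenty savings cited above when each "solve" is a full 2-D or 3-D thermal analysis.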

  12. Chemical Reactor Automation as a way to Optimize a Laboratory Scale Polymerization Process

    Science.gov (United States)

    Cruz-Campa, Jose L.; Saenz de Buruaga, Isabel; Lopez, Raymundo

    2004-10-01

    The automation of the registration and control of the variables involved in a chemical reactor improves the reaction process by making it faster, optimized, and free of the influence of human error. The objective of this work is to register and control the variables involved (temperatures, reactant fluxes, weights, etc.) in an emulsion polymerization reaction. The programs and control algorithms were developed in the language G in LabVIEW®. The designed software is able to send and receive RS232-encoded data between the devices (pumps, temperature sensors, mixer, balances, and so on) and a personal computer. The transduction from digital information to movement or measurement actions of the devices is done by electronic components included in the devices. Once the programs were written and validated, emulsion polymerization reactions were carried out to validate the system. Moreover, advanced heat-estimation algorithms were implemented in order to determine the heat released by the reaction and to estimate and control chemical variables in-line. All the information obtained from the reaction is stored on the PC and is then available and ready to use in any commercial data processing software. This system is now being used in a research center in order to run emulsion polymerizations under efficient and controlled conditions with reproducible results. The experience obtained from this project may be used in the implementation of chemical estimation algorithms at pilot plant or industrial scale.

  13. An optimization design proposal of automated guided vehicles for mixed type transportation in hospital environments.

    Science.gov (United States)

    González, Domingo; Romero, Luis; Espinosa, María Del Mar; Domínguez, Manuel

    2017-01-01

    The aim of this paper is to present an optimization proposal for the design of the automated guided vehicles used in hospital logistics, and to analyze the impact of its implementation in a real environment. The proposal is based on the design of those elements that would allow the vehicles to deliver an extra cart by the towing method; the intention is thus to improve the productivity and performance of the current vehicles by using a combined-cart transportation method. The study has been developed following concurrent engineering premises from three different viewpoints. First, the sequence of operations is described; second, a design proposal for the equipment is developed; finally, the impact of the proposal is analyzed using real data from the Hospital Universitario Rio Hortega in Valladolid (Spain). In this particular case, implementing the analyzed proposal in the hospital can reduce the current time of use by over 35%. This result may allow new tasks to be added to the vehicles, and accordingly, both a new kind of vehicle and a specific module can be developed in order to achieve better performance.

  14. Optimal affinity ranking for automated virtual screening validated in prospective D3R grand challenges

    Science.gov (United States)

    Wingert, Bentley M.; Oerlemans, Rick; Camacho, Carlos J.

    2018-01-01

    The goal of virtual screening is to generate a substantially reduced and enriched subset of compounds from a large virtual chemistry space. Critical in these efforts are methods to properly rank the binding affinity of compounds. Prospective evaluations of ranking strategies in the D3R grand challenges show that for targets with deep pockets the best correlations (Spearman ρ 0.5) were obtained by our submissions that docked compounds to the holo-receptors with the most chemically similar ligand. On the other hand, for targets with open pockets, using multiple receptor structures is not a good strategy. Instead, docking to a single optimal receptor led to the best correlations (Spearman ρ 0.5), and overall performed better than any other method. Yet, choosing a suboptimal receptor for cross-docking can significantly undermine the affinity rankings. Our submissions that evaluated the free energy of congeneric compounds were also among the best in the community experiment. Error bars of around 1 kcal/mol are still too large to significantly improve the overall rankings. Collectively, our top-of-the-line predictions show that automated virtual screening with rigid receptors performs better than flexible docking and other more complex methods.
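The Spearman ρ used to score these rankings is simply the Pearson correlation of the rank-transformed affinities. A small self-contained implementation for illustration, with average ranks for ties:

```python
def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks of
    x and y. Tied values receive their average rank."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend j over the run of values tied with v[order[i]]
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1          # average 1-based rank of the tie run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone relation between predicted and experimental affinities gives ρ = 1 regardless of the scale of the scores, which is why Spearman ρ is the natural metric for ranking-oriented challenges like D3R.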

  15. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    Science.gov (United States)

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  16. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources

    Science.gov (United States)

    Marenco, Luis N.; Wang, Rixin; Bandrowski, Anita E.; Grethe, Jeffrey S.; Shepherd, Gordon M.; Miller, Perry L.

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF’s data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO’s current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation. PMID:25018728

  17. Fine-tuning optimal porous coordination polymers using functional alkyl groups for CH4 purification

    NARCIS (Netherlands)

    Cheng, F.; Li, Q.; Duan, J.; Hosono, N.; Noro, S.-I.; Krishna, R.; Lyu, H.; Kusaka, S.; Jin, W.; Kitagawa, S.

    2017-01-01

    Nano-porous coordination polymers (nano-PCPs), as a new class of crystalline material, have become a lucrative topic in coordination chemistry due to the facile tunability of their functional pore environments. However, elucidating the pathways for the rational design and preparation of nano-PCPs

  18. Vibrational quasi-degenerate perturbation theory with optimized coordinates: applications to ethylene and trans-1,3-butadiene.

    Science.gov (United States)

    Yagi, Kiyoshi; Otaki, Hiroki

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O-H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P space configuration (p) and the complementary Q space configuration (q) based on a difference in their quantum numbers (λ_pq = ∑_s |p_s − q_s|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix from others that may remain unchanged based on the magnitude of harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields the coordinates similar in character to the conventional ones such that the final vibrational energy is affected very little while gaining an order of magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and coupled-cluster with singles
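The configuration screening described above keeps only Q-space configurations that are close, in total quantum-number distance λ_pq = ∑_s |p_s − q_s|, to some P-space configuration. A tiny sketch of that criterion (pure Python; the example configurations and cutoff are invented for illustration):

```python
def lam(p, q):
    """Quantum-number distance lambda_pq = sum_s |p_s - q_s| between configurations."""
    return sum(abs(ps - qs) for ps, qs in zip(p, q))

def screen(p_space, q_space, lam_max):
    """Keep Q-space configurations coupled to at least one P-space one
    within the cutoff; all others are screened out."""
    return [q for q in q_space if any(lam(p, q) <= lam_max for p in p_space)]

# Hypothetical 3-mode configurations (tuples of vibrational quantum numbers)
p_space = [(1, 0, 0), (0, 1, 0)]            # degenerate-space configurations
q_space = [(2, 0, 0), (1, 1, 1), (3, 2, 0)]  # complementary-space candidates
kept = screen(p_space, q_space, lam_max=2)
```

Distant configurations such as (3, 2, 0), with λ = 4 to both P-space members, are dropped before any matrix element is evaluated.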

  19. Vibrational quasi-degenerate perturbation theory with optimized coordinates: Applications to ethylene and trans-1,3-butadiene

    Energy Technology Data Exchange (ETDEWEB)

    Yagi, Kiyoshi, E-mail: kiyoshi.yagi@riken.jp; Otaki, Hiroki [Theoretical Molecular Science Laboratory, RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan)

    2014-02-28

    A perturbative extension to optimized coordinate vibrational self-consistent field (oc-VSCF) is proposed based on the quasi-degenerate perturbation theory (QDPT). A scheme to construct the degenerate space (P space) is developed, which incorporates degenerate configurations and alleviates the divergence of perturbative expansion due to localized coordinates in oc-VSCF (e.g., local O–H stretching modes of water). An efficient configuration selection scheme is also implemented, which screens out the Hamiltonian matrix element between the P space configuration (p) and the complementary Q space configuration (q) based on a difference in their quantum numbers (λ_pq = ∑_s |p_s − q_s|). It is demonstrated that the second-order vibrational QDPT based on optimized coordinates (oc-VQDPT2) smoothly converges with respect to the order of the mode coupling, and outperforms the conventional one based on normal coordinates. Furthermore, an improved, fast algorithm is developed for optimizing the coordinates. First, the minimization of the VSCF energy is conducted in a restricted parameter space, in which only a portion of pairs of coordinates is selectively transformed. A rational index is devised for this purpose, which identifies the important coordinate pairs to mix from others that may remain unchanged based on the magnitude of harmonic coupling induced by the transformation. Second, a cubic force field (CFF) is employed in place of a quartic force field, which bypasses intensive procedures that arise due to the presence of the fourth-order force constants. It is found that oc-VSCF based on CFF together with the pair selection scheme yields the coordinates similar in character to the conventional ones such that the final vibrational energy is affected very little while gaining an order of magnitude acceleration. The proposed method is applied to ethylene and trans-1,3-butadiene. An accurate, multi-resolution potential, which combines the MP2 and

  20. Distributed Learning, Extremum Seeking, and Model-Free Optimization for the Resilient Coordination of Multi-Agent Adversarial Groups

    Science.gov (United States)

    2016-09-07

    use of a class of receding-horizon type of algorithms to overcome the effect of a type of uncoordinated attackers on a multi-vehicle-operator group...science, which accounts for both the aspects of resilience under adversaries, and learning via extremum seeking, and distributed optimization techniques...receding-horizon control and distributed parameter learning for the robust coordination of multi-agent systems. A study of the tradeoffs in costs

  1. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    Directory of Open Access Journals (Sweden)

    Wenz Frederik

    2009-09-01

    Full Text Available Abstract. Background: Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. Methods: The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Results: Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. Conclusion: The
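To give a flavor of the fuzzy-inference machinery these records describe, here is a minimal Mamdani-style sketch (pure Python): two "if dose-violation is X then adjust constraint by Y" rules with triangular membership functions and weighted-average defuzzification. The membership functions, rules, and numbers are all hypothetical, not those of the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(violation):
    """Two fuzzy rules: low violation -> small adjustment, high -> large."""
    mu_low  = tri(violation, -0.1, 0.0, 0.5)   # degree 'violation is low'
    mu_high = tri(violation,  0.3, 1.0, 1.5)   # degree 'violation is high'
    adj_small, adj_large = 0.05, 0.30           # crisp rule consequents
    num = mu_low * adj_small + mu_high * adj_large
    den = mu_low + mu_high
    return num / den if den else 0.0            # weighted-average defuzzification

adjustment = infer(0.8)   # a large violation fires only the 'high' rule
```

An ANFIS-style learner would fit the membership-function parameters (here the triangle corners and consequents) to recorded planner behavior instead of hand-coding them.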

  2. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning.

    Science.gov (United States)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-09-25

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 +/- 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way to automatically perform

  3. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-01-01

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way

  4. Optimal Coordinated Management of a Plug-In Electric Vehicle Charging Station under a Flexible Penalty Contract for Voltage Security

    Directory of Open Access Journals (Sweden)

    Jip Kim

    2016-07-01

    Full Text Available The increasing penetration of plug-in electric vehicles (PEVs) may cause a low-voltage problem in the distribution network. In particular, the introduction of charging stations where multiple PEVs are simultaneously charged at the same bus can aggravate the low-voltage problem. Unlike a distribution network operator (DNO), who has the overall responsibility for stable and reliable network operation, a charging station operator (CSO) may schedule PEV charging without consideration for the resulting severe voltage drop. Therefore, there is a need for the DNO to impose a coordination measure to induce the CSO to adjust its charging schedule to help mitigate the voltage problem. Although the current time-of-use (TOU) tariff is an indirect coordination measure that can motivate the CSO to shift its charging demand to off-peak time by imposing a high rate at the peak time, it is limited by its rigidity in that the network voltage condition cannot be flexibly reflected in the tariff. Therefore, a flexible penalty contract (FPC) for voltage security to be used as a direct coordination measure is proposed. In addition, the optimal coordinated management is formulated. Using the Pacific Gas and Electric Company (PG&E) 69-bus test distribution network, the effectiveness of the coordination was verified by comparison with the current TOU tariff.
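The TOU-driven behavior the paper contrasts with its penalty contract can be sketched in a few lines (pure Python; the tariff and demand numbers are invented): a cost-minimizing CSO under a TOU tariff simply fills the cheapest hours first, regardless of the resulting voltage condition.

```python
def tou_schedule(prices, energy_needed, max_per_hour):
    """Greedy cost-minimal charging plan: fill the cheapest hours first.

    prices: tariff per hour ($/kWh); energy_needed: total demand (kWh);
    max_per_hour: station charging limit (kWh/h). Returns kWh per hour."""
    schedule = [0.0] * len(prices)
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if energy_needed <= 0:
            break
        schedule[hour] = min(max_per_hour, energy_needed)
        energy_needed -= schedule[hour]
    return schedule

# Hypothetical 6-hour horizon with a peak tariff in hours 2-3
prices = [0.08, 0.08, 0.20, 0.20, 0.10, 0.08]
plan = tou_schedule(prices, energy_needed=120.0, max_per_hour=50.0)
```

The charging load piles up in the cheap hours, which is exactly the rigidity the FPC is meant to fix: the tariff never sees the bus voltage.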

  5. An optimized routing algorithm for the automated assembly of standard multimode ribbon fibers in a full-mesh optical backplane

    Science.gov (United States)

    Basile, Vito; Guadagno, Gianluca; Ferrario, Maddalena; Fassi, Irene

    2018-03-01

    In this paper a parametric, modular and scalable algorithm allowing a fully automated assembly of a backplane fiber-optic interconnection circuit is presented. This approach guarantees the optimization of the optical fiber routing inside the backplane with respect to specific criteria (i.e., bending power losses), addressing both transmission performance and overall cost issues. Graph theory has been exploited to simplify the complexity of the N×N full-mesh backplane interconnection topology, first into N independent sub-circuits and then, recursively, into a limited number of loops that are easier to generate. Afterwards, the proposed algorithm selects a set of geometrical and architectural parameters whose optimization identifies the optimal fiber-optic routing for each sub-circuit of the backplane. The topological and numerical information provided by the algorithm is then exploited to control a robot which performs the automated assembly of the backplane sub-circuits. The proposed routing algorithm can be extended to any array architecture and number of connections thanks to its modularity and scalability. Finally, the algorithm has been exploited for the automated assembly of an 8×8 optical backplane realized with standard multimode (MM) 12-fiber ribbons.
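The graph-theoretic reduction of a full mesh into loops can be illustrated with the classical result that the complete graph on an odd number of nodes decomposes into edge-disjoint Hamiltonian cycles. This is only a sketch under the assumption that a Walecki-style construction captures the kind of loop decomposition meant (pure Python; not the paper's algorithm):

```python
def mesh_loops(n):
    """Partition the links of a full mesh on n nodes (n odd, n = 2m+1)
    into m edge-disjoint loops via Walecki's zigzag construction."""
    assert n % 2 == 1 and n >= 3
    m = (n - 1) // 2
    loops = []
    for i in range(m):
        # zigzag i, i+1, i-1, i+2, i-2, ... over the 2m 'rim' nodes
        zigzag = []
        for k in range(2 * m):
            off = (k + 1) // 2
            zigzag.append((i + off) % (2 * m) if k % 2 == 1 else (i - off) % (2 * m))
        loops.append([n - 1] + zigzag)  # 'hub' node n-1 closes each loop
    return loops

loops5 = mesh_loops(5)   # two loops jointly covering all 10 links of a 5-node mesh
```

Each loop visits every node once, so a routing tool can generate one loop at a time and still cover the whole mesh without duplicating any link.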

  6. Radius of Care in Secondary Schools in the Midwest: Are Automated External Defibrillators Sufficiently Accessible to Enable Optimal Patient Care?

    Science.gov (United States)

    Osterman, Michael; Claiborne, Tina; Liberi, Victor

    2018-04-25

      Sudden cardiac arrest is the leading cause of death among young athletes. According to the American Heart Association, an automated external defibrillator (AED) should be available within a 1- to 1.5-minute brisk walk from the patient for the highest chance of survival. Secondary school personnel have reported a lack of understanding about the proper number and placement of AEDs for optimal patient care.   To determine whether fixed AEDs were located within a 1- to 1.5-minute timeframe from any location on secondary school property (ie, radius of care).   Cross-sectional study.   Public and private secondary schools in northwest Ohio and southeast Michigan.   Thirty schools (24 public, 6 private) volunteered.   Global positioning system coordinates were used to survey the entire school properties and determine AED locations. From each AED location, the radius of care was calculated for 3 retrieval speeds: walking, jogging, and driving a utility vehicle. Data were analyzed to expose any property area that fell outside the radius of care.   Public schools (37.1% ± 11.0%) possessed more property outside the radius of care than did private schools (23.8% ± 8.0%; F(1,28) = 8.35, P = .01). After accounting for retrieval speed, we still observed differences between school types when personnel would need to walk or jog to retrieve an AED (F(1.48,41.35) = 4.99, P = .02). The percentages of school property outside the radius of care for public and private schools were 72.6% and 56.3%, respectively, when walking and 34.4% and 12.2%, respectively, when jogging. Only 4.2% of the public and none of the private schools had property outside the radius of care when driving a utility vehicle.   Schools should strategically place AEDs to decrease the percentage of property area outside the radius of care. In some cases, placement in a centralized location that is publicly accessible may be more important than the overall number of AEDs on site.
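The "radius of care" reduces to a speed-times-time calculation followed by a coverage check over surveyed property points. A minimal sketch (pure Python; the retrieval speed, time window, property grid, and AED location are illustrative assumptions, not the study's measured values):

```python
def radius_of_care(speed_m_s, window_s):
    """Maximum distance from an AED still reachable within the response window."""
    return speed_m_s * window_s

def fraction_outside(points, aeds, radius):
    """Share of surveyed property points farther than `radius` from every AED."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    outside = sum(1 for p in points if all(dist(p, a) > radius for a in aeds))
    return outside / len(points)

# Illustrative numbers: brisk walk ~1.4 m/s over a 90 s window
r_walk = radius_of_care(1.4, 90.0)                                  # 126 m
grid = [(x, y) for x in range(0, 301, 50) for y in range(0, 301, 50)]  # 300x300 m lot
share = fraction_outside(grid, aeds=[(150, 150)], radius=r_walk)
```

Even with a centrally placed AED, a large share of a 300 m square lot falls outside a 126 m walking radius, which mirrors the study's finding that walking coverage is poor.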

  7. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.
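The non-unique Delaunay triangles mentioned above arise when candidate points are cocircular, so the empty-circumcircle test that defines a Delaunay triangulation becomes a tie. A sketch of that determinant test illustrating the degeneracy (pure Python; this is textbook computational geometry, not SimCCS code):

```python
def in_circumcircle(a, b, c, d):
    """Incircle determinant for a counter-clockwise triangle (a, b, c):
    > 0 means d lies inside the circumcircle, < 0 outside, and 0 means the
    four points are cocircular, i.e. the Delaunay triangulation is non-unique."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    return (
        (ax * ax + ay * ay) * (bx * cy - by * cx)
        - (bx * bx + by * by) * (ax * cy - ay * cx)
        + (cx * cx + cy * cy) * (ax * by - ay * bx)
    )

# The four corners of a square are cocircular: the test returns exactly 0,
# so either diagonal yields a valid Delaunay triangulation.
tie = in_circumcircle((0, 0), (1, 0), (1, 1), (0, 1))
```

When the determinant is 0, a pipeline-network builder has to break the tie explicitly, e.g. by admitting the extra candidate triangles, as the project above does.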

  8. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and the widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT
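The patch-based label propagation idea (compare an image patch with atlas patches and let similar patches vote on the label) can be caricatured in 1-D. This sketch uses similarity-weighted nonlocal voting rather than the paper's sparse coding, and the toy signals, kernel, and bandwidth are invented:

```python
import math

def fuse_label(image, pos, atlases, radius=1, h=0.5):
    """Label position `pos` of `image` by similarity-weighted voting over
    atlas patches. atlases: list of (intensities, labels) pairs."""
    def patch(sig, i):
        # patch of width 2*radius+1, clamped at the signal boundaries
        return [sig[max(0, min(len(sig) - 1, i + o))] for o in range(-radius, radius + 1)]
    target = patch(image, pos)
    votes = {}
    for intens, labels in atlases:
        for i in range(len(intens)):
            ssd = sum((a - b) ** 2 for a, b in zip(target, patch(intens, i)))
            w = math.exp(-ssd / (h * h))          # similar patches vote strongly
            votes[labels[i]] = votes.get(labels[i], 0.0) + w
    return max(votes, key=votes.get)

# Toy step edge; both atlases label the dark side 0 and the bright side 1
atlas1 = ([0.0, 0.1, 0.9, 1.0], [0, 0, 1, 1])
atlas2 = ([0.1, 0.0, 1.0, 0.9], [0, 0, 1, 1])
label = fuse_label([0.05, 0.05, 0.95, 0.95], 2, [atlas1, atlas2])
```

A sparse-representation variant would replace the exponential weights with coefficients from a sparsity-constrained reconstruction of the target patch, but the voting structure is the same.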

  9. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    International Nuclear Information System (INIS)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang; Chen, Ken Chung; Shen, Steve G. F.; Yan, Jin; Lee, Philip K. M.; Chow, Ben; Liu, Nancy X.; Xia, James J.; Shen, Dinggang

    2014-01-01

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and the widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  10. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    International Nuclear Information System (INIS)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-01-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
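The finite-field (FF) treatment named above extracts (hyper-)polarizabilities by numerical differentiation of the energy with respect to an applied field. A textbook central-difference sketch, using a synthetic energy expansion with known coefficients rather than ELG output (the numerical-differentiation limits noted in the abstract show up here as the usual step-size/round-off trade-off):

```python
def alpha_ff(E, f):
    """Polarizability from E(F) = E0 - mu*F - a*F^2/2 - b*F^3/6 - ...
    via the 3-point central difference."""
    return -(E(f) - 2.0 * E(0.0) + E(-f)) / f**2

def beta_ff(E, f):
    """First hyperpolarizability via a 4-point central difference."""
    return -(E(2 * f) - 2.0 * E(f) + 2.0 * E(-f) - E(-2 * f)) / (2.0 * f**3)

# Synthetic field-dependent energy with known dipole, alpha, and beta
mu, a, b = 0.5, 2.0, 6.0
E = lambda F: 1.0 - mu * F - a * F**2 / 2.0 - b * F**3 / 6.0

alpha_num = alpha_ff(E, 1e-3)   # recovers a
beta_num = beta_ff(E, 1e-2)     # recovers b
```

For higher orders such as γ the stencils grow and the cancellation worsens, which is exactly why the abstract flags the numerical limitation of the FF method for γ.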

  11. Civil Engineering and Building Service Topographic Permanent Landmarks Network. Spatial Coordinate Optimization

    Directory of Open Access Journals (Sweden)

    Lepadatu Daniel

    2016-06-01

    Full Text Available Sustainable development is a modern concept of adapting conditions to achieve objectives that respond simultaneously to at least three major requirements: economic, social and environmental. Sustainable development cannot be achieved without a change in people's mentality and without communities able to use resources rationally and efficiently. For the efficient practical application of the surveying and topography discipline, the students conceived and created a network of local permanent topographic landmarks required for reporting the rectangular coordinates of their applications. In order to obtain more accurate values of these coordinates, we carried out several types of measurements, which are presented in detail in this work.
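Combining several types of measurements of the same coordinate into a single, more accurate value is classically done with inverse-variance weighting. A sketch (pure Python; the measurement values and standard deviations below are invented for illustration):

```python
def weighted_coordinate(measurements):
    """Inverse-variance weighted mean of (value, sigma) measurement pairs.

    Returns the combined coordinate and its standard deviation."""
    weights = [1.0 / (s * s) for _, s in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, (1.0 / total) ** 0.5

# e.g. the easting of one landmark measured by three methods (metres, 1-sigma)
easting, sigma = weighted_coordinate([(1250.04, 0.02), (1250.10, 0.04), (1250.06, 0.02)])
```

The combined uncertainty is smaller than that of any single measurement, which is the point of repeating the survey with several techniques.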

  12. Optimization and validation of automated hippocampal subfield segmentation across the lifespan.

    Science.gov (United States)

    Bender, Andrew R; Keresztes, Attila; Bodammer, Nils C; Shing, Yee Lee; Werkle-Bergner, Markus; Daugherty, Ana M; Yu, Qijing; Kühn, Simone; Lindenberger, Ulman; Raz, Naftali

    2018-02-01

    Automated segmentation of hippocampal (HC) subfields from magnetic resonance imaging (MRI) is gaining popularity, but automated procedures that afford high speed and reproducibility have yet to be extensively validated against the standard, manual morphometry. We evaluated the concurrent validity of an automated method for hippocampal subfields segmentation (automated segmentation of hippocampal subfields, ASHS; Yushkevich et al.) using a customized atlas of the HC body, with manual morphometry as a standard. We built a series of customized atlases comprising the entorhinal cortex (ERC) and subfields of the HC body from manually segmented images, and evaluated the correspondence of automated segmentations with manual morphometry. In samples with age ranges of 6-24 and 62-79 years, 20 participants each, we obtained validity coefficients (intraclass correlations, ICC) and spatial overlap measures (Dice similarity coefficient) that varied substantially across subfields. Anterior and posterior HC body evidenced the greatest discrepancies between automated and manual segmentations. Adding anterior and posterior slices for atlas creation and truncating automated output to the ranges manually defined by multiple neuroanatomical landmarks substantially improved the validity of automated segmentation, yielding ICC above 0.90 for all subfields and alleviating systematic bias. We cross-validated the developed atlas on an independent sample of 30 healthy adults (age 31-84) and obtained good to excellent agreement: ICC(2) = 0.70-0.92. Thus, with described customization steps implemented by experts trained in MRI neuroanatomy, ASHS shows excellent concurrent validity, and can become a promising method for studying age-related changes in HC subfield volumes. © 2017 Wiley Periodicals, Inc.
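The Dice similarity coefficient used above to quantify spatial overlap between automated and manual segmentations is simple to state: 2|A∩B| / (|A| + |B|) over the two voxel sets. A sketch with toy masks (pure Python; the voxel coordinates are invented):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient 2|A n B| / (|A| + |B|) of two voxel sets."""
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy subfield masks: manual tracing vs. automated output, as voxel coordinates
manual    = {(0, 0), (0, 1), (1, 0), (1, 1)}
automated = {(0, 1), (1, 0), (1, 1), (2, 1)}
overlap = dice(manual, automated)
```

Here three of four voxels agree in each mask, giving a Dice of 0.75; values above roughly 0.7 are conventionally read as good overlap for small structures.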

  13. A heterarchic hybrid coordination strategy for congestion management and market optimization using the DREAM framework

    NARCIS (Netherlands)

    Kamphuis, I.G.; Wijbenga, J.P.; Veen, J.S. van der

    2016-01-01

    Software agent-based strategies using micro-economic theory like PowerMatcher[1] have been utilized to coordinate demand and supply matching for electricity. Virtual power plants (VPPs) using these strategies have been tested in living lab environments on a scale of up to hundreds of households. So

  14. Digital Piracy: An Assessment of Consumer Piracy Risk and Optimal Supply Chain Coordination Strategies

    Science.gov (United States)

    Jeong, Bong-Keun

    2010-01-01

    Digital piracy and the emergence of new distribution channels have changed the dynamics of supply chain coordination and created many interesting problems. There has been increased attention to understanding the phenomenon of consumer piracy behavior and its impact on supply chain profitability. The purpose of this dissertation is to better…

  15. Research on ISFLA-Based Optimal Control Strategy for the Coordinated Charging of EV Battery Swap Station

    Directory of Open Access Journals (Sweden)

    Xueliang Huang

    2013-01-01

    As an important component of the smart grid, electric vehicles (EVs) could be a good measure against energy shortages and environmental pollution. A main way of supplying energy to EVs is to swap batteries at a swap station. Based on the characteristics of an EV battery swap station, a coordinated charging optimal control strategy is investigated to smooth the load fluctuation. The shuffled frog leaping algorithm (SFLA) is an optimization method inspired by the memetic evolution of a group of frogs seeking food. An improved shuffled frog leaping algorithm (ISFLA), with a reflecting method to handle the boundary constraint, is proposed to obtain the solution of the optimal control strategy for coordinated charging. Based on the daily load of a certain area, numerical simulations, including a comparison of particle swarm optimization (PSO) and ISFLA, are carried out, and the results show that the presented ISFLA can effectively lower the peak-valley difference and smooth the load profile, with a faster convergence rate and higher convergence precision.
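
    The reflecting method for boundary constraints can be sketched generically: when a frog's updated position (here a scalar such as a charging power) leaves its feasible interval, it is mirrored back inside rather than clipped. An illustrative sketch, not the authors' code; the bounds and step rule are made up:

```python
import random

def reflect(x, lo, hi):
    """Fold a scalar back into [lo, hi] by mirroring at the violated bound."""
    if hi <= lo:
        raise ValueError("empty interval")
    while x < lo or x > hi:
        x = lo + (lo - x) if x < lo else hi - (x - hi)
    return x

def frog_leap(worst, best, lo, hi, rand=random.random):
    """One SFLA-style memetic step: the worst frog jumps toward the best frog;
    any bound violation is repaired by reflection instead of clipping."""
    new = worst + rand() * (best - worst)
    return reflect(new, lo, hi)

print(reflect(12.0, 0.0, 10.0))  # 8.0 (12 mirrored at the upper bound 10)
```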

  16. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
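
    Why false positives matter for naive occupancy estimates can be demonstrated with a small simulation: sites with false-positive detections inflate the fraction of "detected" sites above the true occupancy. All rates below are invented for illustration; this is not the paper's hierarchical model:

```python
import random

def naive_occupancy(psi, p_det, p_false, n_sites=10000, n_visits=5, seed=1):
    """Fraction of sites with >= 1 detection, under true occupancy psi,
    per-visit detection probability p_det at occupied sites, and
    per-visit false-positive probability p_false at every site."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sites):
        occupied = rng.random() < psi
        detected = any(
            rng.random() < (p_det if occupied else 0.0) or rng.random() < p_false
            for _ in range(n_visits)
        )
        hits += detected
    return hits / n_sites

est = naive_occupancy(psi=0.4, p_det=0.5, p_false=0.1)
print(est)  # noticeably above the true occupancy of 0.4
```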

  17. Tracking Systems for Orientation of Solar Panels and Optimization of their Positioning Using Three-coordinate Platforms

    Directory of Open Access Journals (Sweden)

    Chalbash O.H.

    2017-12-01

    Two-coordinate platforms equipped with orientation systems are 40-45% more effective than stationary installations. However, there are other factors affecting the efficiency of solar installations, in particular the shading of panel surfaces when panels located in rows cast shadows on each other. This negatively affects the efficiency of photovoltaic installations. Previous experience in the design of photovoltaic systems shows that neither stationary platforms nor two-coordinate installations completely eliminate energy losses due to shadow formation. The only way to mitigate this negative impact is to increase the distance between the panels; even then, the density ratio (the ratio of the panel area to the area of the land) does not exceed 0.2. Our goal is to develop kinematic schemes and software control systems for three-coordinate platforms that can avoid shadow formation on panels placed in constrained spaces. The result of our work is a numerical method that solves the optimization problem for controlling the motion of a set of platforms, and a rational kinematic scheme for three-coordinate platforms. This problem is especially relevant for solar photovoltaic systems located on space stations. In space, the change in temperature between shaded and shadow-free sections of panels is enormous; due to temperature stress, the panels get destroyed and require expensive repairs. Three-coordinate tracking can reduce the surface occupied by solar panels by about 3 times compared to the currently used solutions and increase the module placement density from 0.2 to 0.6.

  18. Optimal Coordinated EV Charging with Reactive Power Support in Constrained Distribution Grids

    Energy Technology Data Exchange (ETDEWEB)

    Paudyal, Sumit; Ceylan, Oğuzhan; Bhattarai, Bishnu P.; Myers, Kurt S.

    2017-07-01

    Electric vehicle (EV) charging/discharging can take place in any P-Q quadrant, which means EVs could supply reactive power to the grid while charging the battery. In controlled charging schemes, the distribution system operator (DSO) coordinates the charging of EV fleets to ensure the grid's operating constraints are not violated. In effect, this means the DSO sets upper bounds on power limits for EV charging. In this work, we demonstrate that if EVs inject reactive power into the grid while charging, the DSO can issue higher upper bounds on the active power limits for the EVs under the same set of grid constraints. We demonstrate the concept on a 33-node test feeder with 1,500 EVs. Case studies show that in constrained distribution grids with coordinated charging, the average cost of EV charging can be reduced if charging takes place in the fourth P-Q quadrant rather than at unity power factor.
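
    The intuition can be illustrated with a linearized two-bus feeder model: for a fixed allowable voltage drop, reactive injection (Q < 0) raises the admissible active-power draw. The drop approximation and all numbers below are illustrative, not taken from the paper:

```python
def max_active_power(drop_limit_v, r_ohm, x_ohm, v_nom, q_var):
    """Largest active-power draw P (W) keeping the linearized feeder voltage
    drop (R*P + X*Q)/V_nom within drop_limit_v volts.
    q_var < 0 models reactive-power injection by the EV charger."""
    return (drop_limit_v * v_nom - x_ohm * q_var) / r_ohm

# illustrative feeder: R = X = 0.05 ohm, 400 V nominal, 4 V drop allowed
p_unity = max_active_power(4.0, 0.05, 0.05, 400.0, 0.0)         # unity power factor
p_fourth = max_active_power(4.0, 0.05, 0.05, 400.0, -10_000.0)  # injecting 10 kvar
print(p_unity, p_fourth)  # the fourth-quadrant case admits a larger P bound
```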

  19. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    Science.gov (United States)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
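
    The Whittaker smoother used for the baseline solves a penalized least-squares problem, (I + λ D₂ᵀD₂) z = y with D₂ the second-difference operator. A minimal dense sketch of the textbook form (the paper uses a modified variant, which is not reproduced here):

```python
import numpy as np

def whittaker_smooth(y, lam=100.0):
    """Textbook Whittaker smoother: solve (I + lam * D2'D2) z = y,
    i.e. minimize ||y - z||^2 + lam * ||second differences of z||^2."""
    y = np.asarray(y, dtype=float)
    n = y.size
    d2 = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * d2.T @ d2, y)

# a noisy ramp is pulled toward a smooth curve as lam grows
rng = np.random.default_rng(0)
y = np.linspace(0.0, 1.0, 50) + 0.05 * rng.standard_normal(50)
z = whittaker_smooth(y, lam=1000.0)
```

    A sparse-matrix formulation would be preferred for full-resolution spectra; the dense version keeps the structure of the estimator visible.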

  20. Application-Oriented Optimal Shift Schedule Extraction for a Dual-Motor Electric Bus with Automated Manual Transmission

    Directory of Open Access Journals (Sweden)

    Mingjie Zhao

    2018-02-01

    The conventional battery electric buses (BEBs) have limited potential to optimize the energy consumption and reach a better dynamic performance. A practical dual-motor propulsion system equipped with a 4-speed Automated Manual Transmission (AMT) is proposed, which can eliminate the traction interruption of a conventional AMT. A discrete model of the dual-motor-AMT electric bus (DMAEB) is built and used to optimize the gear shift schedule. A dynamic programming (DP) algorithm is applied to find the optimal results, where the efficiency and shift time of each gear are considered to handle the application problem of global optimization. A rational penalty factor and a proper shift time delay based on bench test results are set to reduce the shift frequency by 82.5% in the Chinese-World Transient Vehicle Cycle (C-WTVC). Two applicable shift rule extraction methods, i.e., a classification method based on optimal operating points and a clustering method based on optimal shifting points, are explored and compared. Eventually, hardware-in-the-loop (HIL) simulation results demonstrate that the proposed structure and extracted shift schedule can realize a significant improvement, reducing energy loss by 20.13% compared to traditional empirical strategies.
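
    The DP formulation with a shift penalty can be sketched on a toy drive cycle: the state is the current gear, the stage cost is the energy loss of that gear at each step, and a penalty is added whenever the gear changes. The loss table and penalty value below are invented for illustration; this is not the paper's vehicle model:

```python
def optimal_shift_schedule(loss, shift_penalty):
    """loss[t][g]: energy loss at drive-cycle step t in gear g.
    Backward dynamic programming over (step, gear); changing gear between
    consecutive steps adds shift_penalty. Returns (min cost, gear sequence)."""
    T, n = len(loss), len(loss[0])
    best = [0.0] * n                 # cost-to-go beyond the last step
    nxt_choice = [None] * T
    for t in range(T - 1, -1, -1):
        cur, nxt = [0.0] * n, [0] * n
        for g in range(n):
            cands = [best[h] + (shift_penalty if h != g else 0.0)
                     for h in range(n)]
            h = min(range(n), key=cands.__getitem__)
            cur[g], nxt[g] = loss[t][g] + cands[h], h
        best, nxt_choice[t] = cur, nxt
    g = min(range(n), key=best.__getitem__)
    seq = [g]
    for t in range(T - 1):
        g = nxt_choice[t][g]
        seq.append(g)
    return best[seq[0]], seq

# toy cycle: gear 0 efficient early, gear 1 efficient later
cost, gears = optimal_shift_schedule([[1, 5], [5, 1], [5, 1]], shift_penalty=0.5)
print(cost, gears)  # 3.5 [0, 1, 1]
```

    Raising the penalty suppresses shifting entirely, which is the mechanism the paper uses to cut shift frequency.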

  1. Economic Load Dispatch - A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, KY

    2009-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. Moreover, the population size is increased adaptively, the number of search intervals for the particles is selected adaptively, and the particles search the decision space with accuracy up to two digit points, resulting in the improved convergence of the process. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13, 15, and 40 generating units, and the island power system of Crete in Greece…
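
    The attraction rule described, where each particle is drawn only toward particles with better fitness while the best particle moves randomly, can be sketched on a toy one-dimensional minimization. This is a generic illustration in the spirit of the description, not the authors' implementation; step sizes and bounds are invented:

```python
import random

def ica_pso_step(positions, fitness, lo, hi, step=0.5, rng=None):
    """One simplified move inspired by coordinated-aggregation PSO:
    each particle is attracted only by particles with strictly better
    (lower) fitness, while the best particle moves randomly to explore."""
    rng = rng or random.Random(0)
    best = min(range(len(positions)), key=fitness.__getitem__)
    new = []
    for i, x in enumerate(positions):
        if i == best:
            new.append(rng.uniform(lo, hi))  # best particle moves randomly
            continue
        betters = [positions[j] for j, f in enumerate(fitness) if f < fitness[i]]
        if not betters:                      # tied with the best: stay put
            new.append(x)
            continue
        pull = sum(b - x for b in betters) / len(betters)
        new.append(x + step * rng.random() * pull)
    return new

pos = [0.0, 4.0, 10.0]
fit = [abs(x - 1.0) for x in pos]            # toy objective: |x - 1|
print(ica_pso_step(pos, fit, lo=-10.0, hi=10.0))
```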

  2. Optimized and Automated Radiosynthesis of [18F]DHMT for Translational Imaging of Reactive Oxygen Species with Positron Emission Tomography

    Directory of Open Access Journals (Sweden)

    Wenjie Zhang

    2016-12-01

    Reactive oxygen species (ROS) play important roles in cell signaling and homeostasis. However, an abnormally high level of ROS is toxic, and is implicated in a number of diseases. Positron emission tomography (PET) imaging of ROS can assist in the detection of these diseases. For the purpose of clinical translation of [18F]6-(4-((1-(2-fluoroethyl)-1H-1,2,3-triazol-4-yl)methoxy)phenyl)-5-methyl-5,6-dihydrophenanthridine-3,8-diamine ([18F]DHMT), a promising ROS PET radiotracer, we first manually optimized the large-scale radiosynthesis conditions and then implemented them in an automated synthesis module. Our manual synthesis procedure afforded [18F]DHMT in 120 min with an overall radiochemical yield (RCY) of 31.6% ± 9.3% (n = 2, decay-uncorrected) and a specific activity of 426 ± 272 GBq/µmol (n = 2). Fully automated radiosynthesis of [18F]DHMT was achieved within 77 min with an overall isolated RCY of 6.9% ± 2.8% (n = 7, decay-uncorrected) and a specific activity of 155 ± 153 GBq/µmol (n = 7) at the end of synthesis. This study is the first demonstration of producing 2-[18F]fluoroethyl azide with an automated module, which can be used for a variety of PET tracers through click chemistry. It is also the first time that [18F]DHMT was successfully tested for PET imaging in a healthy beagle dog.

  3. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    International Nuclear Information System (INIS)

    Huang, Yuanpeng Janet; Mao, Binchen; Xu, Fei; Montelione, Gaetano T.

    2015-01-01

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases 15N–1H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignment, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  4. Kriging-Based Parameter Estimation Algorithm for Metabolic Networks Combined with Single-Dimensional Optimization and Dynamic Coordinate Perturbation.

    Science.gov (United States)

    Wang, Hong; Wang, Xicheng; Li, Zheng; Li, Keqiu

    2016-01-01

    The metabolic network model allows for an in-depth insight into the molecular mechanism of a particular organism. Because most parameters of the metabolic network cannot be directly measured, they must be estimated by using optimization algorithms. However, three characteristics of the metabolic network model, i.e., high nonlinearity, a large number of parameters, and wide variation ranges of those parameters, restrict the application of many traditional optimization algorithms. As a result, there is a growing demand to develop efficient optimization approaches to address this complex problem. In this paper, a Kriging-based algorithm aiming at parameter estimation is presented for constructing the metabolic networks. In the algorithm, a new infill sampling criterion, named expected improvement and mutual information (EI&MI), is adopted to improve the modeling accuracy by selecting multiple new sample points at each cycle, and a domain decomposition strategy based on principal component analysis is introduced to save computing time. Meanwhile, the convergence speed is accelerated by combining a single-dimensional optimization method with the dynamic coordinate perturbation strategy when determining the new sample points. Finally, the algorithm is applied to the arachidonic acid metabolic network to estimate its parameters. The obtained results demonstrate the effectiveness of the proposed algorithm in obtaining precise parameter values within a limited number of iterations.
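
    The expected-improvement half of the EI&MI criterion has a standard closed form under a Gaussian predictor: EI = (f_min − μ)Φ(z) + σφ(z) with z = (f_min − μ)/σ. A minimal sketch of that textbook formula (the mutual-information term and the paper's combined criterion are not reproduced here):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Closed-form EI for a Gaussian prediction N(mu, sigma^2) at a
    candidate point, relative to the current best observation f_min."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # normal cdf
    return (f_min - mu) * Phi + sigma * phi

# a prediction well below the incumbent has high EI; one well above, almost none
print(expected_improvement(0.0, 1.0, 2.0), expected_improvement(2.0, 1.0, 0.0))
```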

  5. GENPLAT: an automated platform for biomass enzyme discovery and cocktail optimization.

    Science.gov (United States)

    Walton, Jonathan; Banerjee, Goutami; Car, Suzana

    2011-10-24

    The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex lattice (fractional factorial) mixture models are designed using commercial Design of Experiment statistical software. Enzyme mixtures of high complexity are constructed using robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations. GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such…

  6. Fully automated segmentation of a hip joint using the patient-specific optimal thresholding and watershed algorithm.

    Science.gov (United States)

    Kim, Jung Jin; Nam, Jimin; Jang, In Gwun

    2018-02-01

    Automated segmentation with high accuracy and speed is a prerequisite for FEA-based quantitative assessment with a large population. However, hip joint segmentation has remained challenging due to a narrow articular cartilage and thin cortical bone with a marked interindividual variance. To overcome this challenge, this paper proposes a fully automated segmentation method for a hip joint that uses the complementary characteristics between the thresholding technique and the watershed algorithm. Using the golden section method and load path algorithm, the proposed method first determines the patient-specific optimal threshold value that enables reliably separating a femur from a pelvis while removing cortical and trabecular bone in the femur at the minimum. This provides regional information on the femur. The watershed algorithm is then used to obtain boundary information on the femur. The proximal femur can be extracted by merging the complementary information on a target image. For eight CT images, compared with the manual segmentation and other segmentation methods, the proposed method offers a high accuracy in terms of the dice overlap coefficient (97.24 ± 0.44%) and average surface distance (0.36 ± 0.07 mm) within a fast timeframe in terms of processing time per slice (1.25 ± 0.27 s). The proposed method also delivers structural behavior which is close to that of the manual segmentation with a small mean of average relative errors of the risk factor (4.99%). The segmentation results show that, without the aid of a prerequisite dataset and users' manual intervention, the proposed method can segment a hip joint as fast as the simplified Kang (SK)-based automated segmentation, while maintaining the segmentation accuracy at a similar level of the snake-based semi-automated segmentation. Copyright © 2017 Elsevier B.V. All rights reserved.
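
    The golden section method named above is a standard bracketing minimizer for a one-dimensional unimodal function. A generic sketch of how it could locate a patient-specific optimal threshold by minimizing a scalar criterion (the quadratic criterion here is a stand-in; the paper derives its own criterion from the load path algorithm):

```python
def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    inv_phi = (5 ** 0.5 - 1) / 2          # 1/phi ≈ 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                        # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                              # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# toy criterion with its optimum at a threshold of 400 (e.g. intensity units)
print(round(golden_section_min(lambda t: (t - 400.0) ** 2, 0.0, 1000.0), 3))
```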

  7. Optimization of the radiological protection of patients: Image quality and dose in mammography (co-ordinated research in Europe). Results of the coordinated research project on optimization of protection mammography in some eastern European States

    International Nuclear Information System (INIS)

    2005-05-01

    Mammography is an extremely useful non-invasive imaging technique with unparalleled advantages for the detection of breast cancer. It has played an immense role in the screening of women above a certain age or with a family history of breast cancer. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionizing radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection Against Ionizing Radiation (BSS) and for the Safety of Radiation Sources, issued by the IAEA and co-sponsored by FAO, ILO, WHO, PAHO and NEA, is the optimization of radiological protection of patients undergoing medical exposure. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients attempts to reduce radiation doses to patients while balancing quality assurance considerations. IAEA-TECDOC-796, Radiation Doses in Diagnostic Radiology and Methods for Dose Reduction (1995), addresses this aspect. The related IAEA-TECDOC-1423 on Optimization of the Radiological Protection of Patients undergoing Radiography, Fluoroscopy and Computed Tomography, (2004) constitutes the final report of the coordinated research in Africa, Asia and eastern Europe. The preceding publications do not explicitly consider mammography. Mindful of the importance of this imaging technique, the IAEA launched a Coordinated Research Project on Optimization of Protection in Mammography in some eastern European States. The present publication is the outcome of this project: it is aimed at evaluating the situation in a number of countries, identifying variations in the technique, examining the status of the equipment and comparing performance in the light of the norms established by the European Commission. A number of important aspects are covered, including: quality control of mammography equipment; imaging…

  8. Recurrent oligomers in proteins: an optimal scheme reconciling accurate and concise backbone representations in automated folding and design studies.

    Science.gov (United States)

    Micheletti, C; Seno, F; Maritan, A

    2000-09-01

    A novel scheme is introduced to capture the spatial correlations of consecutive amino acids in naturally occurring proteins. This knowledge-based strategy is able to carry out optimally automated subdivisions of protein fragments into classes of similarity. The goal is to provide the minimal set of protein oligomers (termed "oligons" for brevity) that is able to represent any other fragment. At variance with previous studies in which recurrent local motifs were classified, our concern is to provide simplified protein representations that have been optimised for use in automated folding and/or design attempts. In such contexts, it is paramount to limit the number of degrees of freedom per amino acid without incurring loss of accuracy of structural representations. The suggested method finds, by construction, the optimal compromise between these needs. Several possible oligon lengths are considered. It is shown that meaningful classifications cannot be done for lengths greater than six or smaller than four. Different contexts are considered for which oligons of length five or six are recommendable. With only a few dozen oligons of such length, virtually any protein can be reproduced within typical experimental uncertainties. Structural data for the oligons are made publicly available.

  9. Gravity-Assist Trajectories to the Ice Giants: An Automated Method to Catalog Mass- or Time-Optimal Solutions

    Science.gov (United States)

    Hughes, Kyle M.; Knittel, Jeremy M.; Englander, Jacob A.

    2017-01-01

    This work presents an automated method of calculating mass (or time) optimal gravity-assist trajectories without a priori knowledge of the flyby-body combination. Since gravity assists are particularly crucial for reaching the outer Solar System, we use the Ice Giants, Uranus and Neptune, as example destinations for this work. Catalogs are also provided that list the most attractive trajectories found over launch dates ranging from 2024 to 2038. The tool developed to implement this method, called the Python EMTG Automated Trade Study Application (PEATSA), iteratively runs the Evolutionary Mission Trajectory Generator (EMTG), a NASA Goddard Space Flight Center in-house trajectory optimization tool. EMTG finds gravity-assist trajectories with impulsive maneuvers using a multiple-shooting structure along with stochastic methods (such as monotonic basin hopping) and may be run with or without an initial guess provided. PEATSA runs instances of EMTG in parallel over a grid of launch dates. After each set of runs completes, the best results within a neighborhood of launch dates are used to seed all other cases in that neighborhood, allowing the solutions across the range of launch dates to improve over each iteration. The results here are compared against trajectories found using a grid-search technique, and PEATSA is found to outperform the grid-search results for most launch years considered.

  10. Closure to Discussion on "Economic Load Dispatch-A Comparative Study on Heuristic Optimization Techniques With an Improved Coordinated Aggregation-Based PSO"

    DEFF Research Database (Denmark)

    Vlachogiannis, Ioannis (John); Lee, K. Y.

    2010-01-01

    In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13…

  11. Vendor-buyer coordination and supply chain optimization with deterministic demand function

    Directory of Open Access Journals (Sweden)

    Uddin Mohammed

    2016-01-01

    This paper presents a model that deals with a vendor-buyer multi-product, multi-facility and multi-customer location selection problem, which subsumes a set of manufacturers with limited production capacities situated within a geographical area. We assume that the vendor and the buyer are coordinated by mutually sharing information. We formulate a Mixed Integer Linear Fractional Programming (MILFP) model that maximizes the ratio of return on investment of the distribution network, and a Mixed Integer Program (MIP) used for comparison. The performance of the model is illustrated by a numerical example. In addition, product distribution and allocation of different customers along with the sensitivity of the key parameters are analyzed. It can be observed that increasing the opening cost decreases the profit in both the MILFP and MIP models. If the opening cost of a location decreases or increases, the demand and the capacity of that location change accordingly.

  12. Optimizing Decision Preparedness by Adapting Scenario Complexity and Automating Scenario Generation

    Science.gov (United States)

    Dunne, Rob; Schatz, Sae; Flore, Stephen M.; Nicholson, Denise

    2011-01-01

    Klein's recognition-primed decision (RPD) framework proposes that experts make decisions by recognizing similarities between current decision situations and previous decision experiences. Unfortunately, military personnel are often presented with situations that they have not experienced before. Scenario-based training (SBT) can help mitigate this gap. However, SBT remains a challenging and inefficient training approach. To address these limitations, the authors present an innovative formulation of scenario complexity that contributes to the larger research goal of developing an automated scenario generation system. This system will enable trainees to effectively advance through a variety of increasingly complex decision situations and experiences. By adapting scenario complexities and automating generation, trainees will be provided with a greater variety of appropriately calibrated training events, thus broadening their repositories of experience. Preliminary results from empirical testing (N=24) of the proof-of-concept formula are presented, and future avenues of scenario complexity research are also discussed.

  13. Automated Gravimetric Calibration to Optimize the Accuracy and Precision of TECAN Freedom EVO Liquid Handler.

    Science.gov (United States)

    Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique

    2016-10-01

    High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution splits the process into three steps: (1) screening of predefined liquid classes, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The entire sequence, from running the appropriate pipetting scripts through data acquisition and reporting to the creation of a new liquid class in EVOware, was fully automated. The calibration and confirmation of the robotic system were simple, efficient, and precise, and could accelerate data acquisition for a wide range of biopharmaceutical applications. © 2016 Society for Laboratory Automation and Screening.

  14. Damping Improvement of Multiple Damping Controllers by Using Optimal Coordinated Design Based on PSS and FACTS-POD in a Multi-Machine Power System

    Directory of Open Access Journals (Sweden)

    Ali Nasser Hussain

    2016-09-01

    Full Text Available The aim of this study is to present a comprehensive comparison and assessment of the damping improvement of power system oscillations for multiple damping controllers using simultaneous coordinated design based on Power System Stabilizers (PSS) and Flexible AC Transmission System (FACTS) devices. FACTS devices can help enhance the stability of the power system by adding a supplementary damping controller to the control channel of the FACTS input to implement the task of Power Oscillation Damping (FACTS POD). Simultaneous coordination can be performed in different ways. First, dual coordinated designs between a PSS and a FACTS POD controller, or between different FACTS POD controllers, are arranged in a system with multiple FACTS devices and no PSS. Second, the simultaneous coordination is extended to a triple coordinated design among a PSS and different FACTS POD controllers. The parameters of the damping controllers were tuned, in both the individual controllers and the coordinated designs, using a Chaotic Particle Swarm Optimization (CPSO) algorithm that optimized a given eigenvalue-based objective function. The simulation results for a multi-machine power system show that the dual coordinated designs provide satisfactory damping performance over the individual control responses. Furthermore, the triple coordinated design is shown to be more effective in damping oscillations than the dual damping controllers.

  15. Optimization of the coupling of nuclear reactors and desalination systems. Report on the IAEA coordinated research program

    International Nuclear Information System (INIS)

    Konishi, Toshio; Kupitz, Juergen; Megahed, Mohamed M.

    2003-01-01

    Energy and water are essential elements for human existence. Demand for both energy and freshwater is intensifying worldwide, especially in the developing world. In many developing countries, the option of combining nuclear energy with seawater desalination is being explored to tackle these two problems. In 1998, the International Atomic Energy Agency (IAEA) launched a Coordinated Research Project (CRP) on the ''Optimization of the Coupling of Nuclear Reactors and Desalination Systems'', with the participation of research institutes from interested IAEA Member States. The Research Project focused on the following four main topics: (1) nuclear reactor designs intended for coupling with desalination systems; (2) optimization of the thermal coupling of NSSS and desalination systems; (3) performance improvement of desalination systems for coupling; and (4) advanced desalination technologies for nuclear desalination. The CRP has been evaluating various coupling configurations of nuclear reactors and desalination systems. Reactor types evaluated in the optimization include a PHWR, PWRs and dedicated heat reactors. The present paper summarizes the overall findings of the CRP, highlighting design optimisation, safety and some economic considerations. (author)

  16. Generic Protocol for Optimization of Heterologous Protein Production Using Automated Microbioreactor Technology.

    Science.gov (United States)

    Hemmerich, Johannes; Freier, Lars; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco

    2017-12-15

    A core business in industrial biotechnology using microbial production cell factories is the iterative process of strain engineering and optimization of bioprocess conditions. One important aspect is the improvement of the cultivation medium to provide an optimal environment for microbial formation of the product of interest. It is well accepted that medium composition can dramatically influence overall bioprocess performance. Nutrition medium optimization is known to improve recombinant protein production with microbial systems, and thus it is a rewarding step in bioprocess development. However, very often standard media recipes are taken from the literature, since tailor-made design of the cultivation medium is a tedious task that demands microbioreactor technology for sufficient cultivation throughput, fast product analytics, and support by lab robotics to ensure reliability in liquid handling steps. Furthermore, advanced mathematical methods are required for rationally analyzing measurement data and efficiently designing parallel experiments so as to achieve optimal information content. The generic nature of the presented protocol allows for easy adaptation to different lab equipment, other expression hosts, and target proteins of interest, as well as further bioprocess parameters. Moreover, other optimization objectives such as protein production rate, specific yield, or product quality can be chosen to fit the scope of other optimization studies. The applied Kriging Toolbox (KriKit) is a general tool for Design of Experiments (DOE) that contributes to improved holistic bioprocess optimization. It also supports multi-objective optimization, which can be important in optimizing both upstream and downstream processes.
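
    KriKit itself is a MATLAB toolbox, but the core idea it applies, fitting a Kriging (Gaussian-process) model to measured responses and using an acquisition criterion such as expected improvement to pick the next medium composition to test, can be sketched in a few lines of NumPy. The kernel length scale and the three "titer" observations below are invented for illustration, not values from the protocol:

```python
import math
import numpy as np

def rbf(a, b, length_scale=0.2):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean/variance of a zero-mean GP at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """EI (maximization) over the current best observation."""
    sigma = np.sqrt(var)
    z = (mu - best) / sigma
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    return (mu - best) * cdf + sigma * pdf

# Hypothetical titer measurements at three normalized medium concentrations
X = np.array([0.0, 0.5, 1.0])
y = np.array([0.2, 0.9, 0.3])
Xs = np.linspace(0.0, 1.0, 101)
mu, var = gp_posterior(X, y, Xs)
next_point = Xs[np.argmax(expected_improvement(mu, var, y.max()))]
```

    Each cultivation round adds its measurement to `X`/`y`, and the acquisition maximum proposes the next experiment, which is the loop a DOE toolbox automates.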

  17. Generic Protocol for Optimization of Heterologous Protein Production Using Automated Microbioreactor Technology

    Science.gov (United States)

    Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco

    2017-01-01

    A core business in industrial biotechnology using microbial production cell factories is the iterative process of strain engineering and optimization of bioprocess conditions. One important aspect is the improvement of the cultivation medium to provide an optimal environment for microbial formation of the product of interest. It is well accepted that medium composition can dramatically influence overall bioprocess performance. Nutrition medium optimization is known to improve recombinant protein production with microbial systems, and thus it is a rewarding step in bioprocess development. However, very often standard media recipes are taken from the literature, since tailor-made design of the cultivation medium is a tedious task that demands microbioreactor technology for sufficient cultivation throughput, fast product analytics, and support by lab robotics to ensure reliability in liquid handling steps. Furthermore, advanced mathematical methods are required for rationally analyzing measurement data and efficiently designing parallel experiments so as to achieve optimal information content. The generic nature of the presented protocol allows for easy adaptation to different lab equipment, other expression hosts, and target proteins of interest, as well as further bioprocess parameters. Moreover, other optimization objectives such as protein production rate, specific yield, or product quality can be chosen to fit the scope of other optimization studies. The applied Kriging Toolbox (KriKit) is a general tool for Design of Experiments (DOE) that contributes to improved holistic bioprocess optimization. It also supports multi-objective optimization, which can be important in optimizing both upstream and downstream processes. PMID:29286407

  18. Optimizing Electric Vehicle Coordination Over a Heterogeneous Mesh Network in a Scaled-Down Smart Grid Testbed

    DEFF Research Database (Denmark)

    Bhattarai, Bishnu Prasad; Lévesque, Martin; Maier, Martin

    2015-01-01

    High penetration of renewable energy sources and electric vehicles (EVs) creates power imbalances and congestion in the existing power network, and hence causes significant problems in control and operation. Despite huge efforts invested by electric utilities, governments, and researchers...... is developed in a laboratory by scaling a 250 kVA, 0.4 kV real low-voltage distribution feeder down to 1 kVA, 0.22 kV. Information and communication technology is integrated into the scaled-down network to establish real-time monitoring and control. The novelty of the developed testbed is demonstrated...... by optimizing EV charging coordination realized through the synchronized exchange of monitoring and control packets via a heterogeneous Ethernet-based mesh network.

  19. Wide-area Power System Damping Control Coordination Based on Particle Swarm Optimization with Time Delay Considered

    Science.gov (United States)

    Zhang, J. Y.; Jiang, Y.

    2017-10-01

    To ensure satisfactory dynamic performance of controllers in time-delayed power systems, a WAMS-based control strategy is investigated in the presence of output feedback delay. An integrated approach based on Pade approximation and particle swarm optimization (PSO) is employed for parameter configuration of the PSS. The coordinated configuration scheme of the power system controllers is achieved through a series of stability constraints, with the aim of maximizing the minimum damping ratio of the inter-area modes of the power system. The validity of the derived PSS is verified on a prototype power system. The findings demonstrate that the proposed control design approach can damp inter-area oscillations and enhance small-signal stability.

  20. Optimizing object-based image analysis for semi-automated geomorphological mapping

    NARCIS (Netherlands)

    Anders, N.; Smith, M.; Seijmonsbergen, H.; Bouten, W.; Hengl, T.; Evans, I.S.; Wilson, J.P.; Gould, M.

    2011-01-01

    Object-Based Image Analysis (OBIA) is considered a useful tool for analyzing high-resolution digital terrain data. In the past, both segmentation and classification parameters were optimized manually by trial and error. We propose a method to automatically optimize classification parameters for

  1. A Technique for Binocular Stereo Vision System Calibration by the Nonlinear Optimization and Calibration Points with Accurate Coordinates

    International Nuclear Information System (INIS)

    Chen, H; Ye, D; Che, R S; Chen, G

    2006-01-01

    With the increasing need for higher-accuracy measurement in computer vision, the precision of camera calibration becomes a more important factor. The objective of stereo camera calibration is to estimate the intrinsic and extrinsic parameters of each camera. We present a high-accuracy technique for calibrating a binocular stereo vision system in its mounted location and attitude, realized by combining a nonlinear optimization method with accurately known calibration points. The calibration points were generated by an infrared LED moved by a three-dimensional coordinate measuring machine, which ensures a measurement uncertainty of 1/30000. By using a bilinear-interpolation, square-gray-weighted centroid location algorithm, the imaging centers of the calibration points can be accurately determined. The accuracy of the calibration is measured in terms of the accuracy of reconstructing the calibration points through triangulation; the mean distance between each reconstructed point and the given calibration point is 0.039 mm. The technique satisfies the goals of accurate measurement and camera calibration.
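
    The reconstruction-through-triangulation step used above as the accuracy metric reduces to a small linear least-squares problem per point. A generic linear (DLT) two-view triangulation sketch with a synthetic camera pair; the intrinsics, baseline, and test point are made-up values, not the paper's calibration data:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from two views.
    P1, P2 are 3x4 projection matrices; x1, x2 are (u, v) pixel coords."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A = homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3-D point with projection matrix P to pixel coords."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic stereo pair: identical intrinsics, 10 cm horizontal baseline
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 2.0])
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

    With noiseless projections the DLT solution is exact; with real measurements the residual distance to the known LED positions gives the 0.039 mm figure quoted above.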

  2. Optimizing nitrogen fertilizer application to irrigated wheat. Results of a co-ordinated research project. 1994-1998

    International Nuclear Information System (INIS)

    2000-07-01

    This TECDOC summarizes the results of a Co-ordinated Research Project (CRP) on the Use of Nuclear Techniques for Optimizing Fertilizer Application under Irrigated Wheat to Increase the Efficient Use of Nitrogen Fertilizer and Consequently Reduce Environmental Pollution. The project was carried out between 1994 and 1998 under the technical co-ordination of the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. Fourteen Member States of the IAEA and FAO carried out a series of field experiments aimed at improving irrigation water and fertilizer-N uptake efficiencies through integrated management of the complex interactions involving inputs, soils, climate, and wheat cultivars. Its goals were: to investigate various aspects of fertilizer-N uptake efficiency of wheat crops under irrigation through an interregional research network involving countries growing large areas of irrigated wheat; to use 15N and the soil-moisture neutron probe to determine the fate of applied N, to follow water and nitrate movement in the soil, and to determine water balance and water-use efficiency in irrigated wheat cropping systems; to use the data generated to further develop and refine various relationships in the Ceres-Wheat computer simulation model; and to use the knowledge generated to produce an N-rate-recommendation package to refine specific management strategies with respect to fertilizer applications and expected yields.
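
    The 15N bookkeeping behind such experiments rests on the standard isotope-dilution formulas: the fraction of plant N derived from fertilizer (%Ndff) is the ratio of 15N atom % excess in the plant to that in the fertilizer, and fertilizer-N recovery follows from it. A sketch with hypothetical field numbers (the atom % values and application rates are invented for illustration):

```python
NATURAL_ABUNDANCE = 0.3663  # atom % 15N in atmospheric N2

def ndff_pct(plant_atom_pct, fertilizer_atom_pct, natural=NATURAL_ABUNDANCE):
    """Percent of plant N derived from the labelled fertilizer."""
    return (plant_atom_pct - natural) / (fertilizer_atom_pct - natural) * 100.0

def n_recovery_pct(total_plant_n, ndff, n_applied):
    """Percent of the applied fertilizer N recovered in the crop."""
    return total_plant_n * ndff / 100.0 / n_applied * 100.0

# Hypothetical plot: 5.0 atom % labelled urea, 60 kg N/ha applied,
# crop containing 120 kg N/ha at 1.0 atom % 15N
ndff = ndff_pct(1.0, 5.0)                     # ~13.7 % of crop N from fertilizer
recovery = n_recovery_pct(120.0, ndff, 60.0)  # ~27.4 % of applied N recovered
```

    The unrecovered balance is what the neutron-probe water measurements help partition between residual soil N and leaching losses.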

  3. Automation and optimization of liquid-phase microextraction by gas chromatography.

    Science.gov (United States)

    Ouyang, Gangfeng; Zhao, Wennan; Pawliszyn, Janusz

    2007-01-05

    Several fully automated liquid-phase microextraction (LPME) techniques, all performed with a commercial CTC CombiPal autosampler, are described in this paper: static headspace LPME (HS-LPME), in which a drop of solvent suspended at the tip of a microsyringe needle is exposed to the headspace of the sample solution; exposed dynamic HS-LPME, in which the solvent is repeatedly exposed to the headspace of the sample vial for a set time and then withdrawn into the barrel of the syringe; unexposed dynamic HS-LPME, in which the solvent is moved inside the needle and barrel of a syringe while the gaseous sample is withdrawn into the barrel and then ejected; static direct-immersed LPME (DI-LPME), in which a drop of solvent suspended at the tip of a microsyringe needle is directly immersed into the sample solution; dynamic DI-LPME, in which the solvent is moved inside the needle and barrel of a syringe while the sample solution is withdrawn and ejected; and two-phase hollow fiber-protected LPME (HF-LPME), in which a hollow fiber is used to stabilize and protect the solvent. Critical experimental factors, including temperature, choice of extraction solvent, solvent volume, plunger movement rate, and extraction time, were investigated. Among the three HS-LPME techniques evaluated, exposed dynamic HS-LPME provided the best performance, compared to the unexposed dynamic and static approaches. For DI-LPME, the dynamic process enhances extraction efficiency, and the achieved method precision is comparable with that of the static DI-LPME technique. The precision of the fully automated HF-LPME is quite acceptable (RSD values below 6.8%), and its concentration enrichment factors are better than those of the DI-LPME approaches. The fully automated LPME techniques are more accurate and more convenient, and the reproducibility achieved eliminates the need for an internal standard to improve method precision.

  4. Coordinated optimization of the parameters of the cooled gas-turbine flow path and the parameters of gas-turbine cycles and combined-cycle power plants

    Science.gov (United States)

    Kler, A. M.; Zakharov, Yu. B.; Potanina, Yu. M.

    2014-06-01

    In the present paper, we evaluate the effectiveness of solving, within a single integrated problem, the coordinated optimization of the cycle parameters of gas-turbine and combined-cycle power plants together with the parameters of the cooled gas-turbine flow path. We report comparative data for coordinated and separate optimizations of the combined-cycle power plant; in the separate case, first the gas turbine and then the steam part of the plant is optimized. The comparative data are presented in terms of economic indicators, energy-effectiveness characteristics, and specific costs. The flow-path models used in the present study make it possible to take into account, as a factor influencing the economic and energy effectiveness of the power plant, the heat resistance of the alloys from which the nozzle and rotor blades of the gas-turbine stages are made.

  5. SpaceScanner: COPASI wrapper for automated management of global stochastic optimization experiments.

    Science.gov (United States)

    Elsts, Atis; Pentjuss, Agris; Stalidzans, Egils

    2017-09-15

    Due to their universal applicability, global stochastic optimization methods are popular for designing improvements to biochemical networks. The drawbacks of global stochastic optimization methods are: (i) no guarantee of finding global optima, (ii) no clear optimization run termination criteria and (iii) no criteria to detect stagnation of an optimization run. The impact of these drawbacks can be partly compensated for by manual work, which becomes inefficient when the solution space is large, for instance due to combinatorial explosion of adjustable parameters. SpaceScanner uses parallel optimization runs for automatic termination of optimization tasks in case of consensus, and consecutively applies a pre-defined set of global stochastic optimization methods in case of stagnation of the currently used method. Automatic scanning of subsets of adjustable-parameter combinations for the best objective function values is possible, with a summary file of ranked solutions. https://github.com/atiselsts/spacescanner . egils.stalidzans@lu.lv. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
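
    SpaceScanner drives COPASI, but its key termination heuristic, stopping once independently launched runs agree on the objective value, can be illustrated without COPASI. The toy hill-climbing optimizer and quadratic objective below are invented stand-ins, not SpaceScanner's actual methods:

```python
import random

def stochastic_run(f, x0, iters=300, step=0.5, rng=None):
    """One stochastic hill-climbing run (a stand-in for a COPASI method)."""
    rng = rng or random.Random()
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def consensus_optimize(f, n_runs=5, tol=1e-3, max_rounds=10, seed=1):
    """Launch batches of independent runs; stop once the n_runs best
    objective values agree to within tol (the 'consensus' criterion)."""
    rng = random.Random(seed)
    results = []
    for _ in range(max_rounds):
        for _ in range(n_runs):
            x0 = rng.uniform(-5.0, 5.0)
            results.append(stochastic_run(f, x0, rng=random.Random(rng.random())))
        best = sorted(fx for _, fx in results)[:n_runs]
        if best[-1] - best[0] < tol:
            break  # parallel runs agree: terminate
    return min(results, key=lambda r: r[1])

x_best, f_best = consensus_optimize(lambda x: (x - 2.0) ** 2)
```

    Stagnation detection would add a second check inside the loop: if the batch fails to improve the incumbent best, switch to the next method in the pre-defined list.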

  6. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Vasanthan Maruthapillai

    Full Text Available In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (the distance from each marker to the center of the face) and change in marker distance (the change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to an optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into the corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.
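
    The three statistical features named above are simple to compute once the per-frame marker distances are available. A small sketch, with invented marker coordinates purely for illustration:

```python
import math

def marker_distances(markers, face_center):
    """Distance from each virtual marker to the center of the face."""
    return [math.dist(m, face_center) for m in markers]

def statistical_features(values):
    """Mean, (population) variance, and root mean square of a feature series."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    rms = math.sqrt(sum(v * v for v in values) / n)
    return mean, variance, rms

# Eight invented marker positions around a face centered at (0.5, 0.5)
markers = [(0.3, 0.3), (0.7, 0.3), (0.3, 0.7), (0.7, 0.7),
           (0.5, 0.2), (0.5, 0.8), (0.2, 0.5), (0.8, 0.5)]
mean, variance, rms = statistical_features(marker_distances(markers, (0.5, 0.5)))
```

    In the described pipeline these three numbers, computed over the tracked video sequence for each marker, form the feature vector fed to the KNN and probabilistic neural network classifiers.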

  7. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  8. Nonlinear Modeling and Coordinate Optimization of a Semi-Active Energy Regenerative Suspension with an Electro-Hydraulic Actuator

    Directory of Open Access Journals (Sweden)

    Farong Kou

    2018-01-01

    Full Text Available In order to coordinate the damping performance and energy regenerative performance of an energy regenerative suspension, this paper proposes a structure for a vehicle semi-active energy regenerative suspension with an electro-hydraulic actuator (EHA). In light of the proposed concept, a specific energy regenerative scheme is designed and a mechanical properties test is carried out. Based on the test results, parameter identification for the system model is conducted using a recursive least squares algorithm. On the basis of the system principle, a nonlinear model of the semi-active energy regenerative suspension with an EHA is built. Meanwhile, a linear-quadratic-Gaussian (LQG) control strategy for the system is designed. Then, the influence of the main parameters of the EHA on the damping performance and energy regenerative performance of the suspension is analyzed. Finally, the main parameters of the EHA are optimized via a genetic algorithm. The test results show that, with a sinusoidal input at a frequency of 2 Hz and an amplitude of 30 mm, the sprung mass acceleration root mean square (RMS) value of the optimized EHA semi-active energy regenerative suspension is reduced by 22.23% and the regenerated power RMS value is increased by 40.51%, which means that, while meeting the requirements of vehicle ride comfort and driving safety, the energy regenerative performance is improved significantly.
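
    The genetic-algorithm step can be illustrated generically: evolve EHA parameter vectors against a cost that trades ride comfort off against regenerated power. The GA below is a minimal real-coded sketch, and the surrogate cost, bounds, and settings are invented stand-ins for the paper's actual suspension model:

```python
import random

def genetic_optimize(cost, bounds, pop_size=30, gens=80, mut=0.05, seed=0):
    """Minimal real-coded GA: keep the better half (elitism), breed the
    rest by blend crossover of two elites plus Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=cost)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [min(max((x + y) / 2 + rng.gauss(0.0, mut * (hi - lo)), lo), hi)
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

# Toy stand-in for the weighted comfort/energy objective, optimum at (1, 2)
def surrogate_cost(p):
    k1, k2 = p
    return (k1 - 1.0) ** 2 + (k2 - 2.0) ** 2

best = genetic_optimize(surrogate_cost, bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

    In the paper's setting the cost evaluation would run the nonlinear EHA suspension model under the LQG controller and combine the acceleration RMS and regenerated-power RMS into one weighted figure.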

  9. Automation of POST Cases via External Optimizer and "Artificial p2" Calculation

    Science.gov (United States)

    Dees, Patrick D.; Zwack, Mathew R.

    2017-01-01

    During early conceptual design of complex systems, speed and accuracy are often at odds with one another. While many characteristics of the design fluctuate rapidly during this phase, there is nonetheless a need to acquire accurate data from which to down-select designs, as these decisions will have a large impact upon program life-cycle cost. Therefore, enabling the conceptual designer to produce accurate data in a timely manner is essential to program viability. For conceptual design of launch vehicles, trajectory analysis and optimization is a large hurdle. Tools such as the industry-standard Program to Optimize Simulated Trajectories (POST) have traditionally required an expert in the loop for setting up inputs, running the program, and analyzing the output. The solution space for trajectory analysis is in general non-linear and multi-modal, requiring an experienced analyst to weed out sub-optimal designs in pursuit of the global optimum. While an experienced analyst presented with a vehicle similar to one they have already worked on can likely produce optimal performance figures in a timely manner, as soon as the "experienced" or "similar" adjectives are invalid the process can become lengthy. In addition, an experienced analyst working on a similar vehicle may go into the analysis with preconceived ideas about what the vehicle's trajectory should look like, which can result in sub-optimal performance being recorded. Thus, in any case but the ideal, either time or accuracy can be sacrificed. In the authors' previous work, a tool called multiPOST was created which captures the heuristics of a human analyst over the process of executing trajectory analysis with POST. However, without the instincts of a human in the loop, this method relied upon Monte Carlo simulation to find successful trajectories. Overall, the method has mixed results, and in the context of optimizing multiple vehicles it is inefficient in comparison to the method presented POST's internal

  10. TH-AB-BRA-02: Automated Triplet Beam Orientation Optimization for MRI-Guided Co-60 Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, D; Thomas, D; Cao, M; O’Connor, D; Lamb, J; Sheng, K [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, CA (United States)

    2016-06-15

    Purpose: MRI guided Co-60 provides daily and intrafractional MRI soft tissue imaging for improved target tracking and adaptive radiotherapy. To remedy the low output limitation, the system uses three Co-60 sources 120° apart, but using all three sources in planning is considerably unintuitive. We automate the beam orientation optimization using column generation, and then solve a novel fluence map optimization (FMO) problem while regularizing the number of MLC segments. Methods: Three patients—1 prostate (PRT), 1 lung (LNG), and 1 head-and-neck boost plan (H&NBoost)—were evaluated. The beamlet dose for 180 equally spaced coplanar beams under a 0.35 T magnetic field was calculated using Monte Carlo. The 60 triplets were selected utilizing the column generation algorithm. The FMO problem was formulated as an L2-norm minimization with an anisotropic total variation (TV) regularization term, which allows for control over the number of MLC segments. Our Fluence Regularized and Optimized Selection of Triplets (FROST) plans were compared against the clinical treatment plans (CLN) produced by an experienced dosimetrist. Results: The mean PTV D95, D98, and D99 differ by −0.02%, +0.12%, and +0.44% of the prescription dose between the planning methods, showing the same PTV dose coverage. The mean PTV homogeneity (D95/D5) was 0.9360 (FROST) and 0.9356 (CLN). R50 decreased by 0.07 with FROST. On average, FROST reduced the Dmax and Dmean of OARs by 6.56% and 5.86% of the prescription dose. Manual CLN planning required iterative trial-and-error runs, which is very time consuming, while FROST required minimal human intervention. Conclusions: MRI guided Co-60 therapy needs the output of all sources yet suffers from an unintuitive and laborious manual beam selection process. Automated triplet orientation optimization is shown to be essential for overcoming this difficulty and improves the dosimetry. The novel FMO with regularization provides additional control over the number of MLC segments.
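
    The structure of the FMO objective, an L2 data-fidelity term plus an anisotropic TV penalty that discourages fluence jumps (and hence extra MLC segments), can be sketched on a toy one-dimensional problem. The dose matrix, prescription, step size, and smoothing below are invented for illustration; the abstract does not disclose the actual FROST solver:

```python
import numpy as np

def fmo_tv(A, d, lam=0.05, lr=0.05, iters=500, eps=1e-8):
    """Projected gradient descent on ||A w - d||^2 + lam * TV(w), w >= 0.
    The absolute value in TV is smoothed so its gradient is defined."""
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        data_grad = 2.0 * A.T @ (A @ w - d)
        g = np.diff(w) / np.sqrt(np.diff(w) ** 2 + eps)  # smoothed sign of jumps
        tv_grad = np.zeros_like(w)
        tv_grad[:-1] -= g
        tv_grad[1:] += g
        w = np.maximum(w - lr * (data_grad + lam * tv_grad), 0.0)  # keep fluence >= 0
    return w

# Toy problem: identity "dose matrix", piecewise-constant prescription
A = np.eye(6)
d = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
w = fmo_tv(A, d)
```

    Raising `lam` flattens the fluence profile further, which is the knob that trades plan fidelity against the number of deliverable MLC segments.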

  11. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    International Nuclear Information System (INIS)

    Li, Dengwang; Wang, Jie; Kapp, Daniel S.; Xing, Lei

    2015-01-01

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems of fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for the shape dictionary and parameter estimation. The remaining 50 datasets were used as test images. In our study, liver segmentation was formulated as an optimization process over an implicit function, in which the liver region was refined via local and global optimization during iterations. Our method consists of five steps: (1) The livers in the panel data were segmented manually by physicians, and we estimated the parameters of a GMM (Gaussian mixture model) and an MRF (Markov random field); a shape dictionary was built utilizing the 3D liver shapes. (2) The outlines of the chest and abdomen were located according to rib structure in the input images, and the liver region was initialized based on the GMM. (3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge for local optimization. (4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary for global optimization; furthermore, H-PSO (Hybrid Particle Swarm Optimization) was employed to solve the SSR equation. (5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iteration was repeated within the local and global optimization until it satisfied the stopping conditions (maximum iterations and rate of change). Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy for the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  12. Are automated molecular dynamics simulations and binding free energy calculations realistic tools in lead optimization? An evaluation of the linear interaction energy (LIE) method

    NARCIS (Netherlands)

    Stjernschantz, E.M.; Marelius, J.; Medina, C.; Jacobsson, M.; Vermeulen, N.P.E.; Oostenbrink, C.

    2006-01-01

    An extensive evaluation of the linear interaction energy (LIE) method for the prediction of binding affinity of docked compounds has been performed, with an emphasis on its applicability in lead optimization. An automated setup is presented, which allows for the use of the method in an industrial

  13. Optimizing the service area and trip selection of an electric automated taxi system used for the last mile of train trips

    NARCIS (Netherlands)

    Liang, X.; Homem de Almeida Correia, G.; van Arem, B.

    2016-01-01

    We propose two integer programming models for optimizing an automated taxi (AT) system for last mile of train trips. Model S1: trip reservations are accepted or rejected by the operator according to the profit maximization; model S2: any reservation on a selected zone by the model must be

  14. Strategies for Optimization and Automated Design of Gas Turbine Engines (Les strategies pour l’optimisation et la conception automatique de turbines a gaz)

    Science.gov (United States)

    2010-09-01

    Strategies for Optimization and Automated Design of Gas Turbine Engines (Les Stratégies pour l’optimisation et la conception automatique de turbines à gaz). The material in this publication was assembled to support

  15. An Optimized Clustering Approach for Automated Detection of White Matter Lesions in MRI Brain Images

    Directory of Open Access Journals (Sweden)

    M. Anitha

    2012-04-01

    Full Text Available White Matter lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to the decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. The WML detection process includes the following stages: 1. image preprocessing; 2. clustering (Fuzzy c-means (FCM) clustering, Geostatistical Possibilistic clustering (GPC) and Geostatistical Fuzzy clustering (GFCM)); and 3. optimization using Particle Swarm Optimization (PSO). The proposed system is tested on a database of 208 MRI images. GFCM yields a high sensitivity of 89%, specificity of 94% and overall accuracy of 93% over FCM and GPC. The clustered brain images are then subjected to PSO. The optimized result obtained from GFCM-PSO provides sensitivity of 90%, specificity of 94% and accuracy of 95%. The detection results reveal that GFCM and GFCM-PSO better localize the large regions of lesions and give a lower false-positive rate compared to GPC and GPC-PSO, which capture the largest loads of WMLs only in the upper ventral horns of the brain.
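
    Of the clustering stages listed, fuzzy c-means is the base algorithm; a compact NumPy version shows the alternating center/membership updates. The two-cluster demo data below are generic invented points, not brain MRI intensities:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: returns (centers, U) with U[i, k] the membership
    of sample i in cluster k (rows of U sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]        # weighted means
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        U = dist ** (-2.0 / (m - 1.0))                        # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two invented clusters standing in for tissue vs lesion feature vectors
X = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5],
              [10.0, 10.0], [10.5, 10.0], [10.0, 10.5]])
centers, U = fuzzy_c_means(X)
```

    The geostatistical variants in the paper modify the distance term with spatial-correlation weights, and the PSO stage then refines the cluster solution.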

  16. Automated Spectroscopic Analysis Using the Particle Swarm Optimization Algorithm: Implementing a Guided Search Algorithm to Autofit

    Science.gov (United States)

    Ervin, Katherine; Shipman, Steven

    2017-06-01

    While rotational spectra can be rapidly collected, their analysis (especially for complex systems) is seldom straightforward, leading to a bottleneck. The AUTOFIT program was designed to serve that need by quickly matching rotational constants to spectra with little user input and supervision. This program can potentially be improved by incorporating an optimization algorithm in the search for a solution. The Particle Swarm Optimization Algorithm (PSO) was chosen for implementation. PSO is part of a family of optimization algorithms called heuristic algorithms, which seek approximate best answers. This is ideal for rotational spectra, where an exact match will not be found without incorporating distortion constants, etc., which would otherwise greatly increase the size of the search space. PSO was tested for robustness against five standard fitness functions and then applied to a custom fitness function created for rotational spectra. This talk will explain the Particle Swarm Optimization algorithm and how it works, describe how Autofit was modified to use PSO, discuss the fitness function developed to work with spectroscopic data, and show our current results. Seifert, N.A., Finneran, I.A., Perez, C., Zaleski, D.P., Neill, J.L., Steber, A.L., Suenram, R.D., Lesarri, A., Shipman, S.T., Pate, B.H., J. Mol. Spec. 312, 13-21 (2015)
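    The particle swarm update at the heart of such a guided search can be sketched in a few lines (a generic global-best PSO minimizing the standard sphere test function; the parameter values and fitness function are illustrative, not those of the modified Autofit):

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=200, seed=0):
    """Minimal global-best particle swarm optimizer."""
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.72, 1.49, 1.49                    # inertia and acceleration weights
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # particle positions
    v = np.zeros_like(x)                            # particle velocities
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()            # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

# Sphere function as a stand-in for a spectroscopic fitness function
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), dim=3)
```

    A real spectroscopic fitness function would instead score how well the rotational constants carried by each particle reproduce the observed line positions.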

  17. Mathematical model as a means of optimizing the automated system for information security incident management

    Directory of Open Access Journals (Sweden)

    Yulia G. Krasnozhon

    2018-03-01

    Full Text Available Modern information technologies are of increasing importance for the development dynamics and management structure of an enterprise. The efficiency of implementing modern information technologies is directly related to the quality of information security incident management. However, the impact of information security incident management on the quality and efficiency of the enterprise management system is not sufficiently covered in either the Russian or the foreign literature. The main approach to these problems is the optimization of the automation system for the information security incident management process. Today, special attention is paid to IT technologies in dealing with information security incidents at mission-critical facilities in the Russian Federation, such as the Federal Tax Service of Russia (FTS). It is proposed to use the mathematical apparatus of queueing theory to build a mathematical model for optimizing the system. The developed model makes it possible to estimate the quality of management while taking into account the rules and restrictions imposed on the system by the effects of information security incidents. An example is given to demonstrate the system in operation, and the statistical data obtained are shown. An implementation of the system discussed here will improve the quality of the Russian FTS services and speed up responses to information security incidents.
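    The kind of queueing-theory estimate such a model builds on can be illustrated with the standard M/M/1 formulas for an incident-handling workstation (a textbook sketch; the arrival and service rates below are hypothetical, not FTS data):

```python
def mm1_metrics(lam, mu):
    """Standard M/M/1 formulas: utilization, mean number in system,
    and mean time in system (via Little's law)."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu            # server utilization
    L = rho / (1 - rho)       # mean incidents in the system
    W = 1 / (mu - lam)        # mean time an incident spends in the system
    return rho, L, W

# 4 incidents/hour arriving, 5 incidents/hour served (illustrative rates)
rho, L, W = mm1_metrics(lam=4.0, mu=5.0)
```

    With these rates the workstation is 80% utilized and an incident spends one hour in the system on average; varying the service rate shows how staffing changes affect response time.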

  18. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    Science.gov (United States)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry, perhaps because of the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in several design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, once the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or whenever an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or in adding and removing features. Such a design process takes dramatically less time to complete and should therefore reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used
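    The Latin Hypercube Sampling step mentioned above can be sketched as follows (a generic stratified sampler on the unit hypercube, not the specific design-of-experiments code used in the study):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on [0, 1)^d: each dimension is split into
    n_samples equal strata and every stratum is hit exactly once."""
    rng = np.random.default_rng(seed)
    # One random point inside each stratum, row i in stratum i ...
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    # ... then shuffle the strata independently per dimension
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    return u

pts = latin_hypercube(10, 4)  # 10 design points over 4 design variables
```

    Each design variable is then rescaled from [0, 1) to its engineering range; the stratification guarantees coverage of the whole range with few samples.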

  19. A Velocity-Level Bi-Criteria Optimization Scheme for Coordinated Path Tracking of Dual Robot Manipulators Using Recurrent Neural Network.

    Science.gov (United States)

    Xiao, Lin; Zhang, Yongsheng; Liao, Bolin; Zhang, Zhijun; Ding, Lei; Jin, Long

    2017-01-01

    A dual-robot system is a robotic device composed of two robot arms. To eliminate joint-angle drift and prevent the occurrence of high joint velocities, a velocity-level bi-criteria optimization scheme, which includes two criteria (i.e., the minimum velocity norm and repetitive motion), is proposed and investigated for the coordinated path tracking of dual robot manipulators. Specifically, to realize coordinated path tracking, two subschemes are first presented for the left and right robot manipulators. These two subschemes are then reformulated as two general quadratic programs (QPs), which can be combined into one unified QP. A recurrent neural network (RNN) is presented to solve the unified QP problem effectively. Finally, computer simulation results based on a dual three-link planar manipulator further validate the feasibility and efficacy of the velocity-level optimization scheme for coordinated path tracking using the recurrent neural network.
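    The recurrent-network solver itself is beyond a short sketch, but the kind of problem it solves, a minimum-velocity-norm QP subject to the end-effector velocity constraint J·qdot = v, can be illustrated by solving the KKT system directly (the 2×3 Jacobian below is hypothetical):

```python
import numpy as np

def min_norm_joint_velocity(J, v):
    """Minimum-velocity-norm solution of J @ qdot = v:
    solve the KKT system of  min 0.5*||qdot||^2  s.t.  J @ qdot = v."""
    m, n = J.shape
    kkt = np.block([[np.eye(n), J.T],
                    [J, np.zeros((m, m))]])
    rhs = np.concatenate([np.zeros(n), v])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n]  # joint velocities (Lagrange multipliers discarded)

J = np.array([[1.0, 0.5, 0.2],   # hypothetical 2x3 task Jacobian
              [0.0, 1.0, 0.4]])
qdot = min_norm_joint_velocity(J, np.array([0.3, -0.1]))
```

    For a full-row-rank Jacobian this recovers the pseudoinverse solution; the paper's RNN solves the same class of QP continuously along the trajectory, with joint limits and the repetitive-motion criterion added.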

  20. Optimal power flow based TU/CHP/PV/WPP coordination in view of wind speed, solar irradiance and load correlations

    International Nuclear Information System (INIS)

    Azizipanah-Abarghooee, Rasoul; Niknam, Taher; Malekpour, Mostafa; Bavafa, Farhad; Kaji, Mahdi

    2015-01-01

    Highlights: • Formulates probabilistic OPF with VPE, multi-fuel options, POZs and FOR of CHP units. • Proposes a new powerful optimization method based on an enhanced black hole algorithm. • Coordinates TUs, WPPs, PVs and CHP units together in the proposed problem. • Evaluates the impacts of input uncertainties and their correlations on the POPF. • Uses the 2m + 1 point estimate method. - Abstract: This paper addresses a novel probabilistic optimization framework for handling power system uncertainties in the optimal power flow (OPF) problem that considers all the essential factors of great impact on the OPF problem. The objective is to study and model the correlation and fluctuation of load demands, photovoltaic (PV) plants and wind power plants (WPPs), which have an important influence on transmission lines and bus voltages. Moreover, as an important means of recovering waste heat in thermoelectric power plants, the share of combined heat and power (CHP) in power networks has increased dramatically in the past decade. The probabilistic OPF (POPF) problem considering valve point effects, multi-fuel options and prohibited operating zones of thermal units (TUs) is therefore first formulated, and the PV, WPP and CHP units are modeled. Then, a new method utilizing an enhanced binary black hole (EBBH) algorithm and the 2m + 1 point estimate method is proposed to solve this problem and to handle the random nature of solar irradiance, wind speed and consumer load. The correlation between input random variables is considered using a correlation matrix. Finally, numerical results are presented for the IEEE 118-bus system, including PV, WPP, CHP and TU units at several buses. The simulation and comparison results demonstrate the broad advantages and feasibility of the suggested framework in the presence of dependent non-Gaussian random variables
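    The handling of correlated inputs via a correlation matrix can be illustrated with a standard Cholesky-based sampler (a generic sketch with hypothetical wind/solar/load statistics; the paper itself uses the 2m + 1 point estimate method rather than plain Monte Carlo):

```python
import numpy as np

# Hypothetical correlation between wind speed, solar irradiance and load
R = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
means = np.array([8.0, 600.0, 50.0])  # m/s, W/m^2, MW (illustrative values)
stds = np.array([2.0, 150.0, 5.0])

rng = np.random.default_rng(1)
z = rng.standard_normal((100_000, 3))                    # independent standard normals
samples = means + (z @ np.linalg.cholesky(R).T) * stds   # impose the correlation

empirical_R = np.corrcoef(samples, rowvar=False)         # should reproduce R
```

    Each correlated sample would then feed one deterministic OPF evaluation; for non-Gaussian marginals, the normal samples are first transformed through the target inverse CDFs.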

  1. AUTOMATION OF OPTIMAL IDENTIFICATION OF DYNAMIC ELEMENT TRANSFER FUNCTIONS IN COMPLEX TECHNICAL OBJECTS BASED ON ACCELERATION CURVES

    Directory of Open Access Journals (Sweden)

    A. Yu. Alikov

    2017-01-01

    Full Text Available Objectives. The aim of the present paper is to minimise the errors in the approximation of experimentally obtained acceleration curves. Methods. Based on the features and shortcomings of the well-known Simoyu method for calculating transfer functions from acceleration curves, a modified version of the method is developed using MATLAB and Mathcad software. It is based on minimising the sum of the squares of the deviations of the experimental points from the solution of the differential equation at the same points. Results. Methods for implementing parametric identification are analysed and the Simoyu method is chosen as the most effective. On the basis of an analysis of its advantages and disadvantages, a modified method is proposed that identifies the structure and parameters of the transfer function from the experimental acceleration curve and selects the optimal numerical values of those parameters so as to minimise the approximation errors. Conclusion. The problem of optimal control of a complex technical facility was solved. On the basis of the modified Simoyu method, an algorithm was developed for the automated selection of the optimal form and calculation of the transfer function parameters of dynamic elements of complex technical objects from the acceleration curves in the impact channels. This increases the efficiency of calculating the dynamic characteristics of control objects by minimising the approximation errors. The efficiency of the proposed calculation method is shown; its simplicity makes it applicable to practical calculations, especially in the design of complex technical objects within a computer-aided design system. The proposed method makes it possible to increase the accuracy of the approximation by at least 20%, which is an important advantage for its practical
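    The least-squares idea behind the modified method, minimising the sum of squared deviations between the experimental curve and the model response, can be illustrated for the simplest case of a first-order element K(1 - exp(-t/T)) (an illustrative sketch on synthetic data, not the Simoyu-based algorithm itself):

```python
import numpy as np

def fit_first_order(t, y):
    """Fit y(t) ~ K * (1 - exp(-t / T)) by scanning T on a grid and
    solving the gain K in closed form (linear least squares for fixed T)."""
    best = (None, None, np.inf)
    for T in np.linspace(0.1, 10.0, 500):
        basis = 1.0 - np.exp(-t / T)
        K = float(basis @ y) / float(basis @ basis)   # closed-form LS gain
        sse = float(np.sum((y - K * basis) ** 2))     # sum of squared deviations
        if sse < best[2]:
            best = (K, T, sse)
    return best

t = np.linspace(0.0, 15.0, 200)
y_exp = 2.0 * (1.0 - np.exp(-t / 3.0))   # synthetic "experimental" acceleration curve
K_hat, T_hat, sse = fit_first_order(t, y_exp)
```

    Higher-order transfer function structures replace the single exponential basis with the simulated step response of the candidate model, but the criterion being minimised is the same.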

  2. Optimization of automated external defibrillator deployment outdoors: An evidence-based approach.

    Science.gov (United States)

    Dahan, Benjamin; Jabre, Patricia; Karam, Nicole; Misslin, Renaud; Bories, Marie-Cécile; Tafflet, Muriel; Bougouin, Wulfran; Jost, Daniel; Beganton, Frankie; Beal, Guillaume; Pelloux, Patricia; Marijon, Eloi; Jouven, Xavier

    2016-11-01

    The benefits of available automated external defibrillators (AEDs) for out-of-hospital cardiac arrests (OHCAs) are well known, but strategies for their deployment outdoors remain somewhat arbitrary. Our study sought to assess different strategies for AED deployment. All OHCAs in Paris between 2000 and 2010 were prospectively recorded and geocoded. A guidelines-based strategy of placing an AED in locations where more than one OHCA had occurred within the past five years was compared with two novel strategies: a grid-based strategy with a regular distance between AEDs, and a landmark-based strategy. The expected number of AEDs necessary and their median (IQR) distance to the nearest OHCA were assessed for each strategy. Of 4176 OHCAs, 1372 (33%) occurred in public settings. The first strategy would result in the placement of 170 AEDs, with a distance to OHCA of 416 (180-614) m and a continuous increase in the number of AEDs over time. In the second strategy, the number of AEDs and their distance to the closest OHCA would change with the grid size, with a number of AEDs between 200 and 400 seeming optimal. In the third strategy, median distances between OHCAs and AEDs would be 324 m if AEDs were placed at post offices (n=195), 239 m at subway stations (n=302), 137 m at bike-sharing stations (n=957), and 142 m at pharmacies (n=1466). This study presents an original evidence-based approach to AED deployment strategies that optimizes their number and location. This rational approach can estimate the optimal number of AEDs for any city. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
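    The grid-based strategy can be evaluated by computing, for each cardiac arrest location, the distance to the nearest AED on a regular grid, as in this simplified planar sketch (synthetic coordinates on a 10 km square, not the Paris dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
ohca = rng.uniform(0.0, 10_000.0, (500, 2))   # synthetic OHCA sites, metres

def grid_aeds(spacing):
    """AED positions on a regular grid with the given spacing (metres)."""
    xs = np.arange(spacing / 2, 10_000, spacing)
    return np.array([(x, y) for x in xs for y in xs])

def median_nearest_distance(points, aeds):
    """Median over points of the distance to the closest AED."""
    d = np.linalg.norm(points[:, None, :] - aeds[None, :, :], axis=2)
    return float(np.median(d.min(axis=1)))

coarse = median_nearest_distance(ohca, grid_aeds(1_000))  # 100 AEDs
fine = median_nearest_distance(ohca, grid_aeds(500))      # 400 AEDs
```

    Halving the grid spacing quadruples the AED count but halves the median distance, which is the trade-off curve the study explores to find the 200-400 AED sweet spot.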

  3. Optimization of an automated FI-FT-IR procedure for the determination of o-xylene, toluene and ethyl benzene in n-hexane

    OpenAIRE

    Wells, Ian; Worsfold, Paul J.

    1999-01-01

    The development and optimization of an automated flow injection (FI) manifold coupled with a Fourier transform infrared (FT-IR) detector for the determination of toluene, ethyl benzene and o-xylene in an n-hexane matrix is described. The FT-IR parameters optimized were the resolution and the number of co-added scans; the FI parameters optimized were the type of pump tubing, the carrier flow rate and the sample volume. ATR and transmission flow cells were compared for the determination of o-xylene; the ATR cell was easi...

  4. Generating combinatorial test cases using Simplified Swarm Optimization (SSO) algorithm for automated GUI functional testing

    Directory of Open Access Journals (Sweden)

    Bestoun S. Ahmed

    2014-12-01

    Full Text Available The Graphical User Interface (GUI) is the outer skin of a program that facilitates interaction between the user and different types of computing devices. GUIs are used on a wide range of devices, from ordinary computers and mobile devices to very small devices such as watches. This interaction uses different tools and programming objects such as images, text, buttons, checkboxes, etc. With the emergence of different types of GUIs, they have become an essential component to be tested (when present) to ensure that the software meets the quality required by the user. In contrast to non-functional testing, functional testing of a GUI ensures proper interaction between the user and the application interface without dealing with coding internals. In this paper, a strategy for GUI functional testing using Simplified Swarm Optimization (SSO) is proposed. SSO is used to generate an optimized test suite with the help of an Event-Interaction Graph (EIG). The proposed strategy also manages and repairs the test suites by deleting unnecessary event sequences that are not applicable. The proposed SSO-based generation algorithm proved its effectiveness when evaluated against other algorithms. In addition, the strategy was applied to a standard case study, demonstrating its applicability in practice.

  5. Automation for tsetse mass rearing for use in sterile insect technique programmes. Final report of a co-ordinated research project 1995-2001

    International Nuclear Information System (INIS)

    2003-05-01

    The rearing of tsetse flies for the sterile insect technique has been a laborious procedure in the past. The purpose of this co-ordinated research project (CRP), 'Automation for tsetse mass rearing for use in sterile insect technique programmes', was to develop appropriate semi-automated procedures to simplify the rearing, reduce the cost and standardize the product. Two main objectives were accomplished. The first was to simplify the handling of adults at emergence. This was achieved by allowing the adults to emerge directly into the production cages. Selection of the appropriate environmental conditions and timing allowed the manipulation of the emergence pattern to achieve the desired ratio of four females to one male, with minimal un-emerged females remaining mixed with the male pupae. Tests demonstrated that putting the sexes together at emergence, leaving the males in the production cages, and using a ratio of 4:1 (3:1 for a few species) did not adversely affect pupal production. This resulted in a standardized system for the self-stocking of production cages. The second objective was to reduce the labour involved in feeding the flies. Three distinct systems were developed and tested in sequence. The first tsetse production unit (TPU 1) was a fully automated system, but fly survival and fecundity were unacceptably low. From this, a simpler TPU 2 was developed and tested, in which 63 large cages were held on a frame that could be moved as a single unit to the feeding location. TPU 2 was tested in various locations and found to satisfy the basic requirements, and the adoption of Plexiglas pupal collection slopes resolved much of the problem caused by light distribution. However, the cage-holding frame was heavy and difficult to position on the feeding frame, and the movement disturbed the flies. TPU 2 was superseded by TPU 3, in which the cages remain stationary at all times and the blood is brought to the flies. The blood feeding system is mounted on rails to make it

  6. Optimized Energy Management of a Single-House Residential Micro-Grid With Automated Demand Response

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Monsef, Hassan; Rahimi-Kian, Ashkan

    2015-01-01

    In this paper, an intelligent multi-objective energy management system (MOEMS) is proposed for applications in residential LVAC micro-grids where households are equipped with smart appliances, such as washing machines, dishwashers, tumble dryers and electric heating, and have the capability to take part in demand response (DR) programs. The superior performance and efficiency of the proposed system is studied through several scenarios and case studies and validated in comparison with conventional models. The simulation results demonstrate that the proposed MOEMS has the capability to reduce residential energy use and improve the user’s satisfaction degree through optimal management of the demand and generation sides.

  7. Optimal Coordinated Design of Multiple Damping Controllers Based on PSS and UPFC Device to Improve Dynamic Stability in the Power System

    Directory of Open Access Journals (Sweden)

    A. N. Hussain

    2013-01-01

    Full Text Available The Unified Power Flow Controller (UPFC) device is applied to control power flow in transmission lines. A supplementary damping controller can be installed on any control channel of the UPFC inputs to implement the task of a Power Oscillation Damping (POD) controller. In this paper, we present the simultaneous coordinated design of multiple damping controllers, either between a Power System Stabilizer (PSS) and a UPFC-based POD, or between different multiple UPFC-based POD controllers without a PSS, in a single-machine infinite-bus power system, in order to identify the design that provides the most effective damping performance. The parameters of the damping controllers are optimized using a Chaotic Particle Swarm Optimization (CPSO) algorithm based on an eigenvalue objective function. The simulation results show that the coordinated design of multiple damping controllers damps oscillations far better than individual damping controllers. Furthermore, the coordinated design of UPFC-based POD controllers demonstrates its superiority over the coordinated design of PSS and UPFC-based POD controllers in greatly enhancing the stability of the power system.

  8. Optimization of the coupling of nuclear reactors and desalination systems. Final report of a coordinated research project 1999-2003

    International Nuclear Information System (INIS)

    2005-06-01

    Nuclear power has been used for five decades and has been one of the fastest growing energy options. Although the rate at which nuclear power has penetrated the world energy market has declined, it has retained a substantial share and is expected to continue as a viable option well into the future. Seawater desalination by distillation is much older than nuclear technology; however, current desalination technology for large-scale application has a history comparable to that of nuclear power, i.e. it spans about five decades. Both nuclear and desalination technologies are mature and proven, and are commercially available from a variety of suppliers; there are therefore benefits in combining the two. Where nuclear energy could be an option for electricity supply, it can also be used as an energy source for seawater desalination. This has been recognized from the early days of the two technologies; however, the main interest during the 1960s and 1970s was directed towards the use of nuclear energy for electricity generation, district heating, and industrial process heat. Renewed interest in nuclear desalination has been growing worldwide since 1989, as indicated by the adoption of a number of resolutions on the subject at the IAEA General Conferences. Responding to this trend, the IAEA reviewed information on desalination technologies and the coupling of nuclear reactors with desalination plants, compared in a generic assessment the economic viability of seawater desalination using nuclear energy in various coupling configurations with that using fossil fuels, conducted a regional feasibility study on nuclear desalination in the North African countries, and initiated a two-year Options Identification Programme (OIP) to identify candidate reactor and desalination technologies that could serve as practical demonstrations of nuclear desalination, supplementing the existing expertise and experience. In 1998, the IAEA initiated a Coordinated Research

  9. APPLICATION OF RANKING BASED ATTRIBUTE SELECTION FILTERS TO PERFORM AUTOMATED EVALUATION OF DESCRIPTIVE ANSWERS THROUGH SEQUENTIAL MINIMAL OPTIMIZATION MODELS

    Directory of Open Access Journals (Sweden)

    C. Sunil Kumar

    2014-10-01

    Full Text Available In this paper, we study the performance of various models for the automated evaluation of descriptive answers, using rank-based feature selection filters for dimensionality reduction. We quantitatively identify the best of five rank-based feature selection techniques, namely the Chi-squared filter, Information gain filter, Gain ratio filter, Relief filter and Symmetrical uncertainty filter. We use Sequential Minimal Optimization with a polynomial kernel to build models, and we evaluate the models on parameters such as accuracy, time to build models, Kappa, Mean Absolute Error and Root Mean Squared Error. Except with the Relief filter, the accuracies obtained with all filtered models are at least 4% better than those obtained with models with no filter applied. The accuracies recorded are the same across the Chi-squared, Information gain, Gain ratio and Symmetrical uncertainty filters; accuracy alone therefore does not determine the best filter. The time taken to build models, Kappa, Mean Absolute Error and Root Mean Squared Error played a major role in determining the effectiveness of the filters. The overall rank aggregation metric of the Symmetrical uncertainty filter is 45, better by 1 rank than that of the Information gain filter, its nearest contender, and better by 3, 6 and 112 ranks, respectively, than the rank aggregation metrics of the Chi-squared, Gain ratio and Relief filters. Through these quantitative measurements, we conclude that Symmetrical uncertainty attribute evaluation is the overall best-performing rank-based feature selection algorithm for the automated evaluation of descriptive answers.
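    The per-feature score behind the Chi-squared filter can be sketched in plain NumPy (a generic version of the statistic for binary features, not the exact filter implementation used in the paper):

```python
import numpy as np

def chi2_scores(X, y):
    """Chi-squared statistic of each binary feature against the class labels.
    A higher score indicates a stronger feature/class association."""
    classes = np.unique(y)
    scores = np.zeros(X.shape[1])
    n = len(y)
    for j in range(X.shape[1]):
        for x_val in (0, 1):
            for c in classes:
                observed = np.sum((X[:, j] == x_val) & (y == c))
                expected = np.sum(X[:, j] == x_val) * np.sum(y == c) / n
                if expected > 0:
                    scores[j] += (observed - expected) ** 2 / expected
    return scores

# Toy data: feature 0 predicts the class perfectly, feature 1 is noise
X = np.array([[1, 0], [1, 1], [1, 0], [0, 1], [0, 0], [0, 1]])
y = np.array([1, 1, 1, 0, 0, 0])
ranking = np.argsort(-chi2_scores(X, y))  # best feature first
```

    Keeping only the top-ranked features before training the SMO classifier is exactly the dimensionality-reduction role the filters play in the study.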

  10. Coordinated Platoon Routing in a Metropolitan Network

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Jeffrey; Munson, Todd; Sokolov, Vadim

    2016-10-10

    Platooning vehicles (connected and automated vehicles traveling with small intervehicle distances) use less fuel because of reduced aerodynamic drag. Given a network defined by vertex and edge sets and a set of vehicles with origin/destination nodes/times, we model and solve the combinatorial optimization problem of coordinated routing of the vehicles in a manner that delivers them to their destinations on time while using the least fuel. Common approaches decompose the platoon coordination and vehicle routing into separate problems; our model addresses both simultaneously to obtain the best solution. We use modern modeling techniques and constraints implied by analysis of the platoon routing problem to address larger numbers of vehicles and larger networks than previously considered. While the numerical method used is unable to certify optimality for candidate solutions on all networks and parameters considered, we obtain excellent solutions in approximately one minute for much larger networks and vehicle sets than previously considered in the literature.
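    The routing subproblem rests on shortest paths through the road network; a minimal Dijkstra sketch over a toy network illustrates this building block (the graph and fuel costs are hypothetical, and the authors' actual model is a joint mixed-integer program rather than a per-vehicle shortest path):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start in a weighted digraph given as
    {node: [(neighbor, edge_cost), ...]}."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy metropolitan network: fuel cost on each directed road segment
roads = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("D", 5.0)],
    "C": [("B", 1.0), ("D", 8.0)],
}
dist = dijkstra(roads, "A")
```

    Coordinated platooning then perturbs these per-vehicle shortest paths: a slightly longer route can pay off when the drag reduction from sharing edges with other vehicles outweighs the detour cost.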

  11. Development of methodologies for optimization of surveillance testing and maintenance of safety related equipment at NPPs. Report of a research coordination meeting. Working material

    International Nuclear Information System (INIS)

    1997-01-01

    This report summarizes the results of the first meeting of the Coordinated Research Programme (CRP) on Development of Methodologies for Optimization of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs, held at the Agency Headquarters in Vienna from 16 to 20 December 1996. The purpose of this Research Coordination Meeting (RCM) was for all Chief Scientific Investigators of the groups participating in the CRP to present outlines of their proposed research projects. Additionally, the participants discussed the objective, scope, work plan and information channels of the CRP in detail. Based on these presentations and discussions, the entire project plan was updated, completed and included in this report, which thus represents the commonly agreed project work plan for the CRP. Refs, figs, tabs

  12. A multistage coordinative optimization for siting and sizing P2G plants in an integrated electricity and natural gas system

    DEFF Research Database (Denmark)

    Zeng, Q.; Fang, J.; Chen, Z.

    2016-01-01

    Power-to-Gas (P2G) allows for large-scale energy storage, which offers great potential to accommodate the rapid growth of renewables. In this paper, a long-term optimization model for the co-planning of electricity and natural gas systems is presented. The P2G plants are optimally...

  13. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by examining the relationship between nest visit rates estimated from sampling periods of different durations and the total number of visits for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.

  14. What Factors Coordinate the Optimal Position of a Single Monitoring Well Down Gradient of a Hazardous Site?

    Science.gov (United States)

    Bode, F.; Nowak, W.

    2013-12-01

    Drinking-water well catchments include many sources of potential contamination, such as gas stations, roads, or fields used for agriculture. Additionally, there are many contaminated sites that need to be monitored inside and outside drinking water catchments. Finding optimal positions for monitoring wells is challenging because various parameters (and their uncertainties) influence the reliability and optimality of a suggested monitoring location. For example, there may be uncertainty in the exact position of the contamination, in the source volume, in the direction of the velocity field (which can vary in angle and absolute value), and in other parameters that describe, e.g., dispersion and decay. Many national regulations and UN guidelines suggest monitoring as a measure of risk control, but make no statement on how to assess or design monitoring under uncertainty. To obtain optimal positions of monitoring wells, a large body of recent studies uses formal optimization approaches. Our goal is a better system understanding, at a fundamental process level, of the one-on-one situation of a single monitoring well for a single monitoring target. This knowledge can be used to better understand optimization results in complex situations, and also to better guide and restrict optimization procedures with the newly obtained expert knowledge. In order to obtain fundamental statements independent of specific simulation settings, we use an analytical model based on the 2D steady-state advection-dispersion equation to predict contaminant transport from the monitoring target. Monte Carlo simulation techniques are applied to represent parametric uncertainty. Thus, we obtain maps of contaminant detection probability for all possible placements of the monitoring well, with the optimal position defined by the highest detection probability. First findings show that uncertainty in the spill location pushes the optimal monitoring
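    The Monte Carlo construction of a detection-probability map can be sketched for the simplest case of an uncertain flow direction (an idealized straight plume of fixed width rather than the full advection-dispersion solution; the geometry and uncertainty values below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n_mc = 2_000
# Uncertain flow direction: normally distributed around the mean direction
angles = rng.normal(0.0, np.deg2rad(15), n_mc)

def detection_probability(well, source=(0.0, 0.0), width=5.0):
    """Fraction of realizations in which the well lies downstream of the
    source and within `width` metres of the plume centerline."""
    wx, wy = well[0] - source[0], well[1] - source[1]
    along = wx * np.cos(angles) + wy * np.sin(angles)     # downstream coordinate
    across = -wx * np.sin(angles) + wy * np.cos(angles)   # transverse offset
    hits = (along > 0) & (np.abs(across) < width)
    return float(hits.mean())

p_near = detection_probability((50.0, 0.0))    # well close to the source
p_far = detection_probability((200.0, 0.0))    # well farther downstream
```

    Evaluating this on a grid of candidate well positions yields the detection-probability map; the near well scores far higher here because angular uncertainty fans the plume out with distance, one of the effects the study quantifies.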

  15. Development of Decision-Making Automated System for Optimal Placement of Physical Access Control System’s Elements

    Science.gov (United States)

    Danilova, Olga; Semenova, Zinaida

    2018-04-01

    The objective of this study is a detailed analysis of the development of physical protection systems for information resources. The mathematical apparatus of optimization theory and decision making is used to correctly formulate and create an algorithm for selecting an optimal security system configuration, considering the locations of the secured object’s access points and zones. The result of this study is a software implementation scheme of a decision-making system for the optimal placement of the physical access control system’s elements.

  16. EXPERIMENTS TOWARDS DETERMINING BEST TRAINING SAMPLE SIZE FOR AUTOMATED EVALUATION OF DESCRIPTIVE ANSWERS THROUGH SEQUENTIAL MINIMAL OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Sunil Kumar C

    2014-01-01

    Full Text Available With the number of students growing each year, there is a strong need for automated systems capable of evaluating descriptive answers. Unfortunately, there aren't many systems capable of performing this task. In this paper, we use a machine learning tool called LightSIDE to accomplish auto evaluation and scoring of descriptive answers. Our experiments are designed to cater to our primary goal of identifying the optimum training sample size so as to get optimum auto scoring. Besides the technical overview and the experiment design, the paper also covers the challenges and benefits of the system. We also discuss interdisciplinary areas for future research on this topic.
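    The notion of a training-size experiment can be illustrated with a toy stand-in. The sketch below replaces LightSIDE's SMO-based scorer with a trivial one-feature nearest-centroid classifier on synthetic data, and measures accuracy on a fixed test set as the training sample grows; all names and numbers are illustrative:

```python
import random

random.seed(42)

# synthetic 'descriptive answer' feature: one quality score per answer,
# class 0 = weak answers, class 1 = strong answers (purely illustrative)
def sample(n):
    return [(random.gauss(0.4, 0.15), 0) for _ in range(n // 2)] + \
           [(random.gauss(0.7, 0.15), 1) for _ in range(n - n // 2)]

def train_centroids(data):
    """Tiny stand-in for the SMO-trained scorer: per-class feature means."""
    means = {}
    for cls in (0, 1):
        vals = [x for x, c in data if c == cls]
        means[cls] = sum(vals) / len(vals)
    return means

def accuracy(means, data):
    """Classify each item by the nearest class centroid and score it."""
    correct = sum(1 for x, c in data
                  if min(means, key=lambda k: abs(x - means[k])) == c)
    return correct / len(data)

test = sample(400)   # fixed held-out evaluation set
curve = {n: accuracy(train_centroids(sample(n)), test) for n in (10, 50, 200, 800)}
print({n: round(a, 3) for n, a in curve.items()})
```

    The interesting quantity is where the curve flattens: beyond that training size, additional labeled answers buy little extra scoring accuracy.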

  17. Optimal coordination method of opportunistic array radars for multi-target-tracking-based radio frequency stealth in clutter

    Science.gov (United States)

    Zhang, Zhenkai; Salous, Sana; Li, Hailin; Tian, Yubo

    2015-11-01

    Opportunistic array radar is a new radar system that can effectively improve modern radar performance. In order to improve its radio frequency stealth ability, a novel coordination method for opportunistic array radars in a network tracking targets in clutter is presented. First, a database of target radar cross sections is built, and the signal-to-noise ratio for the netted radars is computed from the radar cross section and the range of the target. Then the joint probabilistic data association tracking algorithm is improved by taking into account the emitted power of the opportunistic array radar, which has a major impact on the detection probability when tracking in clutter. Finally, with the help of grey relational grade and covariance control, the opportunistic array radar with the minimum radiated power is selected for better radio frequency stealth performance. Simulation results show that the proposed algorithm not only has excellent tracking accuracy in clutter but also saves much more radiated power compared with other methods.
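    The power-minimizing selection step can be illustrated with a toy sketch. The candidate list, power values and covariance traces below are hypothetical, and the paper's grey relational grade assessment is replaced by a simple covariance-trace threshold:

```python
# Each candidate radar: (name, radiated_power_W, predicted_track_cov_trace).
# The covariance values are illustrative stand-ins for the grey-relational /
# covariance-control assessment described in the paper.
candidates = [
    ("radar_A", 1200.0, 0.8),
    ("radar_B", 400.0, 2.5),
    ("radar_C", 650.0, 1.1),
    ("radar_D", 300.0, 4.0),
]

def select_radar(candidates, cov_limit=1.5):
    """Pick the lowest-power radar whose predicted tracking covariance
    still satisfies the accuracy requirement (RF-stealth heuristic)."""
    feasible = [c for c in candidates if c[2] <= cov_limit]
    if not feasible:   # no radar meets accuracy: fall back to the most accurate
        return min(candidates, key=lambda c: c[2])
    return min(feasible, key=lambda c: c[1])

choice = select_radar(candidates)
print(choice[0])  # -> radar_C: lowest power among accuracy-feasible radars
```

    Tightening `cov_limit` forces a higher-power radar into service, which is exactly the accuracy-versus-stealth trade-off the coordination method manages.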

  18. 'Outbreak Gold Standard' selection to provide optimized threshold for infectious diseases early-alert based on China Infectious Disease Automated-alert and Response System.

    Science.gov (United States)

    Wang, Rui-Ping; Jiang, Yong-Gen; Zhao, Gen-Ming; Guo, Xiao-Qin; Michael, Engelgau

    2017-12-01

    The China Infectious Disease Automated-alert and Response System (CIDARS) was successfully implemented and became operational nationwide in 2008. The CIDARS plays an important role in, and has been integrated into, the routine outbreak monitoring efforts of the Center for Disease Control (CDC) at all levels in China. In the CIDARS, thresholds were initially determined using the "Mean+2SD" method, which has limitations. This study compared the performance of optimized thresholds defined using the "Mean+2SD" method to the performance of 5 novel algorithms to select the optimal "Outbreak Gold Standard" (OGS) and corresponding thresholds for outbreak detection. Data for infectious diseases were organized by calendar week and year. The "Mean+2SD", C1, C2, moving average (MA), seasonal model (SM), and cumulative sum (CUSUM) algorithms were applied. Outbreak signals for the predicted value (Px) were calculated using a percentile-based moving window. When the outbreak signals generated by an algorithm were in line with a Px-generated outbreak signal for each week, this Px was defined as the optimized threshold for that algorithm. In this study, six infectious diseases were selected and classified into TYPE A (chickenpox and mumps), TYPE B (influenza and rubella) and TYPE C [hand foot and mouth disease (HFMD) and scarlet fever]. Optimized thresholds for chickenpox (P55), mumps (P50), influenza (P40, P55, and P75), rubella (P45 and P75), HFMD (P65 and P70), and scarlet fever (P75 and P80) were identified. The C1, C2, CUSUM, SM, and MA algorithms were appropriate for TYPE A. All 6 algorithms were appropriate for TYPE B. The C1 and CUSUM algorithms were appropriate for TYPE C. It is critical to incorporate more flexible algorithms as OGS into the CIDARS and to identify the proper OGS and corresponding recommended optimized threshold for different infectious disease types.
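    The percentile-based moving-window threshold (Px) and the baseline Mean+2SD rule can be sketched as follows; the window length, percentile choice and weekly counts are hypothetical:

```python
import statistics

def percentile(data, p):
    """Nearest-rank percentile of a list (simple stand-in for the Px value)."""
    s = sorted(data)
    k = max(0, min(len(s) - 1, int(round(p / 100.0 * (len(s) - 1)))))
    return s[k]

def px_signals(counts, p, window=10):
    """Flag week i when its count exceeds the Px of the preceding window."""
    flags = []
    for i in range(window, len(counts)):
        flags.append(counts[i] > percentile(counts[i - window:i], p))
    return flags

def mean_2sd_signals(counts, window=10):
    """Baseline 'Mean+2SD' rule used in the early CIDARS stage."""
    flags = []
    for i in range(window, len(counts)):
        hist = counts[i - window:i]
        flags.append(counts[i] > statistics.mean(hist)
                     + 2 * statistics.pstdev(hist))
    return flags

weekly = [4, 5, 3, 6, 4, 5, 5, 4, 6, 5, 14, 5, 4]   # hypothetical case counts
print(px_signals(weekly, 75), mean_2sd_signals(weekly))
```

    On this toy series both rules flag the week with 14 cases; the study's point is that the best percentile P differs by disease type, so Px is tuned per disease against the chosen gold-standard algorithm.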

  19. Optimal installation locations for automated external defibrillators in Taipei 7-Eleven stores: using GIS and a genetic algorithm with a new stirring operator.

    Science.gov (United States)

    Huang, Chung-Yuan; Wen, Tzai-Hung

    2014-01-01

    Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.
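    A genetic-algorithm placement search of this kind can be sketched in miniature. The incident and store coordinates below are randomly generated stand-ins, and the paper's stirring operator is replaced by a plain swap mutation:

```python
import math
import random

random.seed(7)

# hypothetical OHCA incident points and candidate store locations (x, y in metres)
incidents = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(120)]
stores = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(25)]
K, RADIUS = 5, 300.0   # AEDs to place; 300 m is the paper's driving distance

def coverage(subset):
    """Number of incidents within RADIUS of at least one selected store."""
    return sum(any(math.dist(p, stores[s]) <= RADIUS for s in subset)
               for p in incidents)

def mutate(subset):
    """Swap one selected store for a random unselected one (the paper's
    stirring operator is more elaborate; this is a simplification)."""
    out = list(subset)
    out[random.randrange(K)] = random.choice(
        [i for i in range(len(stores)) if i not in out])
    return out

pop = [random.sample(range(len(stores)), K) for _ in range(30)]
for _ in range(60):                      # a few GA generations
    pop.sort(key=coverage, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
best = max(pop, key=coverage)
print(sorted(best), coverage(best))
```

    The real model additionally weights incidents by time of day and EMS travel distance, which is what distinguishes the commercial-area and residential-area allocation strategies.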

  20. Security Classification Using Automated Learning (SCALE): Optimizing Statistical Natural Language Processing Techniques to Assign Security Labels to Unstructured Text

    Science.gov (United States)

    2010-12-01

    2010 © Her Majesty the Queen (in Right of Canada), as represented by the Minister of National Defence, 2010. Abstract: Automating the... based on their experience and on security policies. To label efficiently all the data available on the networks of the... although automatic categorization of data by subject has been studied in depth, little research has focused on the evaluation

  1. Coordination of Heat Pumps, Electric Vehicles and AGC for Efficient LFC in a Smart Hybrid Power System via SCA-Based Optimized FOPID Controllers

    Directory of Open Access Journals (Sweden)

    Rahmat Khezri

    2018-02-01

    Full Text Available Due to the high price of fossil fuels, the increased carbon footprint of conventional generation units and the intermittent functionality of renewable units, alternative sources must contribute to the load frequency control (LFC) of the power system. To tackle this challenge by dealing with controllable loads, the present study aims at efficient LFC in smart hybrid power systems. To achieve this goal, heat pumps (HPs) and electric vehicles (EVs) are selected as the most effective controllable loads to contribute to the LFC issue. In this regard, the EVs can be controlled bidirectionally, in states known as charging and discharging, under a smart structure. In addition, the power consumption of the HPs is controllable. As the main task, this paper proposes a fractional order proportional integral differential (FOPID) controller for coordinated control of the power consumption of HPs, the discharging state of EVs and automatic generation control (AGC). The parameters of the FOPID controllers are optimized simultaneously by the sine cosine algorithm (SCA), a new method for optimization problems. Finally, four scenarios (a step load change, a random load change, aggregated intermittent generated power from wind turbines, and a sensitivity analysis) are selected to demonstrate the efficiency of the proposed SCA-based FOPID controllers in a hybrid two-area power system.
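    The sine cosine algorithm itself is compact enough to sketch. The code below follows the standard SCA update rule but optimizes a toy quadratic cost in place of the paper's frequency-deviation objective; the population size, iteration count and bounds are arbitrary choices:

```python
import math
import random

random.seed(3)

def sca_minimize(f, dim, bounds, agents=20, iters=200, a=2.0):
    """Minimal Sine Cosine Algorithm sketch: candidate positions oscillate
    around the best-so-far solution via sine/cosine terms whose amplitude
    r1 shrinks linearly (exploration -> exploitation)."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(agents)]
    best = min(X, key=f)[:]
    for t in range(iters):
        r1 = a - t * a / iters
        for x in X:
            for j in range(dim):
                r2 = random.uniform(0, 2 * math.pi)
                r3, r4 = random.uniform(0, 2), random.random()
                step = r1 * (math.sin(r2) if r4 < 0.5 else math.cos(r2))
                x[j] += step * abs(r3 * best[j] - x[j])
                x[j] = min(hi, max(lo, x[j]))       # keep inside the bounds
            if f(x) < f(best):
                best = x[:]
    return best, f(best)

# toy stand-in for the controller-tuning cost (the paper's true objective is a
# frequency-deviation index of the two-area system, not reproduced here)
cost = lambda g: sum((gi - 1.5) ** 2 for gi in g)
g_best, c_best = sca_minimize(cost, dim=3, bounds=(0.0, 5.0))
print([round(v, 2) for v in g_best], round(c_best, 4))
```

    In the paper the decision vector would hold the five FOPID parameters (Kp, Ki, Kd and the fractional orders), evaluated through a simulation of the hybrid two-area system.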

  2. Protein standardization III: Method optimization basic principles for quantitative determination of human serum proteins on automated instruments based on turbidimetry or nephelometry.

    Science.gov (United States)

    Blirup-Jensen, S

    2001-11-01

    Quantitative protein determinations in routine laboratories are today most often carried out on automated instruments. However, slight variations in the assay principle, in the programming of the instrument or in the reagents may lead to different results. This makes method optimization and standardization a prerequisite. The basic principles of turbidimetry and nephelometry are discussed. The different reading principles are illustrated and investigated. Various problems are identified, and a suggestion is made for an integrated, fast and convenient test system for the determination of a number of different proteins on the same instrument. An optimized test system for turbidimetry and nephelometry should comprise high-quality antibodies, calibrators, controls and buffers, and a protocol with detailed parameter settings so that the instrument can be programmed correctly. A good user program takes full advantage of the optimal reading principles for the different instruments. This implies, for all suitable instruments, sample preincubation followed by real sample blanking, which automatically corrects for initial turbidity in the sample. Likewise, it is recommended to measure the reagent blank, which represents any turbidity caused by the antibody itself. By correcting all signals with these two blank values, the best possible signal is obtained for the specific analyte. An optimized test system should preferably offer a wide measuring range combined with a wide security range, which for the user means few re-runs and maximum security against antigen excess. A non-linear calibration curve based on six standards is obtained using a suitable mathematical fitting model, which is normally part of the instrument software.

  3. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior...

  4. Slotting optimization of automated storage and retrieval system (AS/RS) for efficient delivery of parts in an assembly shop using genetic algorithm: A case Study

    Science.gov (United States)

    Yue, L.; Guan, Z.; He, C.; Luo, D.; Saif, U.

    2017-06-01

    In recent years, competitive pressure has shifted manufacturing companies from mass production to mass customization, producing a large variety of products. It is a great challenge for companies nowadays to run a customized mixed-flow mode of production to meet customized demand on time. With a large variety of products, the storage system that delivers parts to the production lines influences their timely production, as shown by a simulation study of an inefficient storage system at a real company in the current research. Therefore, the current research proposes a slotting optimization model, with the mixed-model assembly sequence of the final flow lines taken into consideration, to optimize the whole automated storage and retrieval system (AS/RS) and distribution system in the case company. The research aims to simultaneously minimize the vertical height of the center of gravity of the AS/RS and the total time spent retrieving materials from the AS/RS. A genetic algorithm is adopted to solve the proposed problem, and computational results show significant improvement in the stability and efficiency of the AS/RS compared to the existing method used in the case company.

  5. Biogas-pH automation control strategy for optimizing organic loading rate of anaerobic membrane bioreactor treating high COD wastewater.

    Science.gov (United States)

    Yu, Dawei; Liu, Jibao; Sui, Qianwen; Wei, Yuansong

    2016-03-01

    Control of the organic loading rate (OLR) is essential for anaerobic digestion treating high-COD wastewater, since overload causes operational failure and underload reduces efficiency. A novel biogas-pH automation control strategy using combined gas-liquor phase monitoring was developed for an anaerobic membrane bioreactor (AnMBR) treating high-COD (27.53 g·L(-1)) starch wastewater. The biogas-pH strategy operated with thresholds of biogas production rate >98 Nml·h(-1) to prevent overload and pH >7.4 to prevent underload, determined by methane production kinetics and pH titration of methanogenesis slurry, respectively. Compared with a constant-OLR control strategy, the OLR was doubled to 11.81 kgCOD·kgVSS(-1)·d(-1) and the effluent COD halved to 253.4 mg·L(-1). Meanwhile, the COD removal rate, biogas yield and methane concentration were synchronously improved to 99.1%, 312 Nml·gCODin(-1) and 74%, respectively. Using the biogas-pH strategy, the AnMBR formed a "pH self-regulation ternary buffer system" which seizes carbon dioxide and hence provides sufficient buffering capacity. Copyright © 2015 Elsevier Ltd. All rights reserved.
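    The decision logic of the biogas-pH strategy reduces to two threshold tests per control step. The sketch below uses the thresholds reported in the abstract; the OLR step size and the idea of a fixed adjustment increment are hypothetical:

```python
def adjust_olr(olr, biogas_rate, ph,
               biogas_max=98.0, ph_min=7.4, step=0.5):
    """One control step of the biogas-pH strategy (threshold values from the
    abstract; step size and units of OLR change are hypothetical).

    - biogas production rate above biogas_max [Nml/h] -> overload risk: cut OLR
    - pH above ph_min with biogas below the cap       -> underload: raise OLR
    - otherwise                                       -> hold the current OLR
    """
    if biogas_rate > biogas_max:
        return max(0.0, olr - step)
    if ph > ph_min:
        return olr + step
    return olr

print(adjust_olr(10.0, biogas_rate=120.0, ph=7.2))  # overload  -> 9.5
print(adjust_olr(10.0, biogas_rate=60.0, ph=7.8))   # underload -> 10.5
print(adjust_olr(10.0, biogas_rate=60.0, ph=7.0))   # hold      -> 10.0
```

    Checking the gas-phase signal before the pH signal gives overload protection priority, which matches the failure asymmetry described in the abstract: overload kills the digester, underload only wastes capacity.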

  6. DG-AMMOS: A New tool to generate 3D conformation of small molecules using Distance Geometry and Automated Molecular Mechanics Optimization for in silico Screening

    Directory of Open Access Journals (Sweden)

    Villoutreix Bruno O

    2009-11-01

    Full Text Available Abstract Background Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches, in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated, since compounds are usually delivered in 1D SMILES, CANSMILES or 2D SDF formats. Results Here, we describe the new open source program DG-AMMOS, which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, which also generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. Conclusion DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that the generated structures are generally of equal quality or sometimes better than structures obtained by other tested methods.

  7. Fully automated radiosynthesis of [18F]Fluoromisonidazole with single neutral alumina column purification: optimization of reaction parameters

    International Nuclear Information System (INIS)

    Nandy, S.K.; Rajan, M.G.R.

    2010-01-01

    1-H-1-(3-[18F]fluoro-2-hydroxypropyl)-2-nitroimidazole ([18F]FMISO) is the most used hypoxia-imaging agent in oncology, and we have recently reported a fully automated procedure for its synthesis using the Nuclear Interface FDG module and a single neutral alumina column for purification. Using 1-(2'-nitro-1'-imidazolyl)-2-O-tetrahydropyranyl-3-O-toluenesulfonylpropanediol (NITTP) as the precursor, we have investigated the yield of [18F]FMISO for different reaction times, temperatures, and amounts of precursor. The overall yield was 48.4 ± 1.2% (n = 3) without decay correction, obtained using 10 mg NITTP with radio-fluorination carried out at 145 °C for 3 min followed by acid hydrolysis for 3 min at 125 °C, in a total synthesis time of 32 ± 1 min. Increasing the precursor amount to 25 mg did not improve the overall yield under identical reaction conditions, the decay-uncorrected yield being 46.8 ± 1.6% (n = 3), but rather made the production less economical. The yield increased linearly with the amount of NITTP used from 2.5 to 10 mg and plateaued from 10 to 25 mg. Radio-fluorination efficiency under four different conditions was also compared. Radio thin layer chromatography (radio-TLC) showed that the duration of radio-fluorination of NITTP, rather than the radio-fluorination temperature, favoured the formation of the labeled thermally degraded product, but the single neutral alumina column purification was sufficient to obtain [18F]FMISO devoid of any radiochemical as well as cold impurities. (author)

  8. Metabolic flux ratio analysis and multi-objective optimization revealed a globally conserved and coordinated metabolic response of E. coli to paraquat-induced oxidative stress.

    Science.gov (United States)

    Shen, Tie; Rui, Bin; Zhou, Hong; Zhang, Ximing; Yi, Yin; Wen, Han; Zheng, Haoran; Wu, Jihui; Shi, Yunyu

    2013-01-27

    The ability of a microorganism to adapt to changes in the environment, such as in nutrient or oxygen availability, is essential for its competitive fitness and survival. The cellular objective and the strategy of the metabolic response to an extreme environment are therefore of tremendous interest and, thus, have been increasingly explored. However, the cellular objective of the complex regulatory structure of the metabolic changes has not yet been fully elucidated, and more details regarding the quantitative behaviour of the metabolic flux redistribution are required to understand the systems-wide biological significance of this response. In this study, the intracellular metabolic flux ratios involved in the central carbon metabolism were determined by fractional (13)C-labeling and metabolic flux ratio analysis (MetaFoR) of the wild-type E. coli strain JM101 in an oxidative environment in a chemostat. We observed a significant increase in the flux through phosphoenolpyruvate carboxykinase (PEPCK), phosphoenolpyruvate carboxylase (PEPC), malic enzyme (MEZ) and serine hydroxymethyltransferase (SHMT). We applied an ε-constraint based multi-objective optimization to investigate the trade-off relationships between the biomass yield and the generation of reductive power using the in silico iJR904 genome-scale model of E. coli K-12. The theoretical metabolic redistribution supports that the trans-hydrogenase pathway should not play a direct role in the defence mounted by E. coli against oxidative stress. The agreement between the measured ratios and the theoretical redistribution established the significance of NADPH synthesis as the goal of the metabolic reprogramming that occurs in response to oxidative stress. Our work presents a framework that combines metabolic flux ratio analysis and multi-objective optimization to investigate the metabolic trade-offs that occur under varied environmental conditions. Our results led to the proposal that the metabolic response of E
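    The ε-constraint formulation used for the trade-off analysis can be illustrated on a toy discrete problem: maximize one objective subject to a lower bound on the other, then sweep the bound to trace the front. The (biomass yield, NADPH generation) pairs below are purely illustrative numbers, not fluxes from the study:

```python
def pareto_by_eps_constraint(points, eps_values):
    """ε-constraint sketch: maximize f1 subject to f2 >= eps, sweeping eps.
    'points' is a finite set of feasible (f1, f2) pairs standing in for the
    flux distributions a genome-scale model would enumerate."""
    front = []
    for eps in eps_values:
        feasible = [p for p in points if p[1] >= eps]
        if feasible:
            front.append(max(feasible, key=lambda p: p[0]))
    return front

# toy (biomass_yield, NADPH_generation) pairs - purely illustrative
pts = [(1.0, 0.2), (0.9, 0.5), (0.7, 0.8), (0.4, 1.0)]
print(pareto_by_eps_constraint(pts, eps_values=[0.0, 0.4, 0.7, 0.95]))
```

    Raising eps forces the solution to sacrifice biomass yield for reductive power, which is the shape of trade-off the study compares against the measured flux ratios.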

  9. Towards reduction of Paradigm coordination models

    NARCIS (Netherlands)

    S. Andova; L.P.J. Groenewegen; E.P. de Vink (Erik Peter); L. Aceto (Luca); M.R. Mousavi

    2011-01-01

    The coordination modelling language Paradigm addresses collaboration between components in terms of dynamic constraints. Within a Paradigm model, component dynamics are consistently specified at a detailed and a global level of abstraction. To enable automated verification of Paradigm

  10. Optimization of the radiological protection of patients undergoing radiography, fluoroscopy and computed tomography. Final report of a coordinated research project in Africa, Asia and eastern Europe

    International Nuclear Information System (INIS)

    2004-12-01

    Although radiography has been an established imaging modality for over a century, continuous developments have led to improvements in technique, resulting in improved image quality at reduced patient dose. If one compares the technique used by Roentgen with the methods used today, one finds that a radiograph can now be obtained at a dose that is smaller by a factor of 100 or more. Nonetheless, some national surveys, particularly in the United Kingdom and in the United States of America in the 1980s and 1990s, have indicated large variations in patient doses for the same diagnostic examination, in some cases by a factor of 20 or more. This arises not only from the various types of equipment and accessories used by the different health care providers, but also from operational factors. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionising radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS), issued by the IAEA in cooperation with the FAO, ILO, WHO, PAHO and NEA, is the optimization of radiological protection of patients undergoing medical exposure. In line with its responsibility for the implementation of standards, and under the subprogramme of radiation safety, the IAEA launched a coordinated research project (CRP) on radiological protection in diagnostic radiology in some countries of the Eastern European, African and Asian regions in 1995. Initially, the CRP addressed radiography only, covering wide aspects of the optimisation of radiological protection. Subsequently, the scope of the CRP was extended to fluoroscopy and computed tomography (CT), but covered primarily a situation analysis of patient doses and equipment quality control. It did not cover patient dose reduction aspects in fluoroscopy and CT. The project

  11. Optimization

    CERN Document Server

    Pearce, Charles

    2009-01-01

    Focuses on mathematical structure, and on real-world applications. This book includes developments in several optimization-related topics such as decision theory, linear programming, turnpike theory, duality theory, convex analysis, and queuing theory.

  12. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, W.T.; Siebers, J.V. [University of Virginia, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing

  13. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    International Nuclear Information System (INIS)

    Watkins, W.T.; Siebers, J.V.

    2016-01-01

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax)< 110% of prescription, and spinal cord Dmax<45 Gy. The algorithm’s ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing
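    The M(N+1) plan enumeration described in the qcMCO abstracts above can be sketched directly: one balanced weight set plus one OAR-emphasized weight set per OAR, repeated for each delivery technique. The emphasis weight value here is hypothetical:

```python
def qcmco_weight_sets(oars, techniques, emphasis=5.0):
    """Enumerate the M(N+1) weight configurations implied by the abstract:
    for each delivery technique, one balanced plan plus one plan per OAR
    with that OAR's objective weight boosted (weight values hypothetical)."""
    plans = []
    for tech in techniques:
        plans.append((tech, {o: 1.0 for o in oars}))   # balanced plan
        for o in oars:
            w = {x: 1.0 for x in oars}
            w[o] = emphasis                            # emphasize one OAR
            plans.append((tech, w))
    return plans

oars = ["ipsilateral_lung", "contralateral_lung", "heart", "esophagus"]
techs = ["4-field", "9-field IMRT", "27-field IMRT", "arc IMRT"]
plans = qcmco_weight_sets(oars, techs)
print(len(plans))  # -> 20 = M(N+1) with M=4 techniques, N=4 OARs
```

    Each weight set would then seed one optimization run under the quasi-constrained target and cord objectives, yielding the 600 unsupervised plans (20 per patient, 30 patients) reported in the study.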

  14. Optimized anion exchange column isolation of zirconium-89 (89Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    Energy Technology Data Exchange (ETDEWEB)

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.; Morrison, Samuel S.

    2018-04-01

    Zirconium-89 (89Zr), produced by the (p,n) reaction from naturally monoisotopic yttrium (natY), is a promising positron emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real time monitoring of column effluents allows for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution that is high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>10^5) and has been shown to separate Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the performance of the method was evaluated using cyclotron bombarded Y foil targets. The separation method was shown to achieve >95% recovery of the 89Zr present in the foils. The 89Zr eluent, however, was in a chemical matrix not immediately conducive to labeling onto proteins. The main intent of this study was to develop a tandem column 89Zr purification process, wherein the anion exchange column method described here is the first separation in a dual-column purification process.

  15. Optimal Installation Locations for Automated External Defibrillators in Taipei 7-Eleven Stores: Using GIS and a Genetic Algorithm with a New Stirring Operator

    Directory of Open Access Journals (Sweden)

    Chung-Yuan Huang

    2014-01-01

    Full Text Available Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMSs) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.

  16. Multiple microfermentor battery: a versatile tool for use with automated parallel cultures of microorganisms producing recombinant proteins and for optimization of cultivation protocols.

    Science.gov (United States)

    Frachon, Emmanuel; Bondet, Vincent; Munier-Lehmann, Hélène; Bellalou, Jacques

    2006-08-01

    A multiple microfermentor battery was designed for high-throughput recombinant protein production in Escherichia coli. This novel system comprises eight aerated glass reactors with a working volume of 80 ml and a moving external optical sensor for measuring optical densities at 600 nm (OD600) ranging from 0.05 to 100 online. Each reactor can be fitted with miniature probes to monitor temperature, dissolved oxygen (DO), and pH. Independent temperature regulation for each vessel is obtained with heating/cooling Peltier devices. Data from pH, DO, and turbidity sensors are collected on a FieldPoint (National Instruments) I/O interface and are processed and recorded by a LabVIEW program on a personal computer, which enables feedback control of the culture parameters. A high-density medium formulation was designed, which enabled us to grow E. coli to OD600 up to 100 in batch cultures with oxygen-enriched aeration. Accordingly, the biomass and the amount of recombinant protein produced in a 70-ml culture were at least equivalent to the biomass and the amount of recombinant protein obtained in a Fernbach flask with 1 liter of conventional medium. Thus, the microfermentor battery appears to be well suited for automated parallel cultures and process optimization, such as that needed for structural genomics projects.

  17. Full design automation of multi-state RNA devices to program gene expression using energy-based optimization.

    Directory of Open Access Journals (Sweden)

    Guillermo Rodrigo

    Full Text Available Small RNAs (sRNAs) can operate as regulatory agents to control protein expression by interaction with the 5' untranslated region of the mRNA. We have developed a physicochemical framework, relying on base pair interaction energies, to design multi-state sRNA devices by solving an optimization problem with an objective function accounting for the stability of the transition and final intermolecular states. Contrary to the analysis of the reaction kinetics of an ensemble of sRNAs, we solve the inverse problem of finding sequences satisfying targeted reactions. We show here that our objective function correlates well with measured riboregulatory activity of a set of mutants. This has enabled the application of the methodology for an extended design of RNA devices with specified behavior, assuming different molecular interaction models based on Watson-Crick interaction. We designed several YES, NOT, AND, and OR logic gates, including the design of combinatorial riboregulators. In sum, our de novo approach provides a new paradigm in synthetic biology to design molecular interaction mechanisms facilitating future high-throughput functional sRNA design.

  18. Sequential injection analysis for automation of the Winkler methodology, with real-time SIMPLEX optimization and shipboard application

    International Nuclear Information System (INIS)

    Horstkotte, Burkhard; Tovar Sanchez, Antonio; Duarte, Carlos M.; Cerda, Victor

    2010-01-01

    A multipurpose analyzer system based on sequential injection analysis (SIA) for the determination of dissolved oxygen (DO) in seawater is presented. Three operation modes were established and successfully applied onboard during a research cruise in the Southern Ocean: 1st, in-line execution of the entire Winkler method including precipitation of manganese(II) hydroxide, fixation of DO, precipitate dissolution by confluent acidification, and spectrophotometric quantification of the generated iodine/tri-iodide (I₂/I₃⁻); 2nd, spectrophotometric quantification of I₂/I₃⁻ in samples prepared according to the classical Winkler protocol; and 3rd, accurate batch-wise titration of I₂/I₃⁻ with thiosulfate using one syringe pump of the analyzer as an automatic burette. In the first mode, the zone stacking principle was applied to achieve high dispersion of the reagent solutions in the sample zone. Spectrophotometric detection was done at the isosbestic wavelength of I₂/I₃⁻, 466 nm. Highly reduced consumption of reagents and sample compared to the classical Winkler protocol, a linear response up to 16 mg L⁻¹ DO, and an injection frequency of 30 per hour were achieved. It is noteworthy that for the offline protocol, sample metering and quantification with a potentiometric titrator generally last over 5 min, without counting sample fixation, incubation, and glassware cleaning. The modified SIMPLEX methodology was used for the simultaneous optimization of four volumetric and two chemical variables. Vertex calculation and consequent application, including in-line preparation of one reagent, were carried out in real time using the software AutoAnalysis. The analytical system featured high signal stability, robustness, and a repeatability of 3% RSD (1st mode) and 0.8% (2nd mode) during shipboard application.
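    The SIMPLEX methodology mentioned above belongs to the Nelder-Mead family of derivative-free optimizers, which iteratively reflect, expand, and contract a simplex of candidate settings. The following is a minimal sketch of that family, not the authors' modified variant; the toy response function, starting point, and step size are invented for illustration:

```python
# Minimal Nelder-Mead-style SIMPLEX search (illustrative stand-in for the
# "modified SIMPLEX" of the paper): minimize a response function of two
# method variables, e.g. a reagent volume and a flow rate.

def simplex_search(f, start, step=0.1, iters=200):
    n = len(start)
    # Initial simplex: the start point plus one offset vertex per axis.
    simplex = [list(start)] + [
        [x + (step if i == j else 0.0) for j, x in enumerate(start)]
        for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)                      # best vertex first
        centroid = [sum(v[j] for v in simplex[:-1]) / n for j in range(n)]
        worst = simplex[-1]
        # Reflect the worst vertex through the centroid of the others.
        refl = [c + (c - w) for c, w in zip(centroid, worst)]
        if f(refl) < f(simplex[0]):
            # Reflection beat the best vertex: try expanding further.
            exp = [c + 2.0 * (c - w) for c, w in zip(centroid, worst)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(worst):
            simplex[-1] = refl
        else:
            # Reflection failed: contract the worst vertex inward.
            simplex[-1] = [c + 0.5 * (w - c) for c, w in zip(centroid, worst)]
    return min(simplex, key=f)

# Toy "response surface" whose optimum sits at (0.3, 1.2).
best = simplex_search(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 1.2) ** 2, [1.0, 0.0])
```

    In the shipboard setting each function evaluation corresponds to an actual injection, which is why a method that needs only a handful of vertex evaluations per iteration is attractive.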

  19. Sequential injection analysis for automation of the Winkler methodology, with real-time SIMPLEX optimization and shipboard application.

    Science.gov (United States)

    Horstkotte, Burkhard; Tovar Sánchez, Antonio; Duarte, Carlos M; Cerdà, Víctor

    2010-01-25

    A multipurpose analyzer system based on sequential injection analysis (SIA) for the determination of dissolved oxygen (DO) in seawater is presented. Three operation modes were established and successfully applied onboard during a research cruise in the Southern Ocean: 1st, in-line execution of the entire Winkler method including precipitation of manganese(II) hydroxide, fixation of DO, precipitate dissolution by confluent acidification, and spectrophotometric quantification of the generated iodine/tri-iodide (I₂/I₃⁻); 2nd, spectrophotometric quantification of I₂/I₃⁻ in samples prepared according to the classical Winkler protocol; and 3rd, accurate batch-wise titration of I₂/I₃⁻ with thiosulfate using one syringe pump of the analyzer as an automatic burette. In the first mode, the zone stacking principle was applied to achieve high dispersion of the reagent solutions in the sample zone. Spectrophotometric detection was done at the isosbestic wavelength of I₂/I₃⁻, 466 nm. Highly reduced consumption of reagents and sample compared to the classical Winkler protocol, a linear response up to 16 mg L⁻¹ DO, and an injection frequency of 30 per hour were achieved. It is noteworthy that for the offline protocol, sample metering and quantification with a potentiometric titrator generally last over 5 min, without counting sample fixation, incubation, and glassware cleaning. The modified SIMPLEX methodology was used for the simultaneous optimization of four volumetric and two chemical variables. Vertex calculation and consequent application, including in-line preparation of one reagent, were carried out in real time using the software AutoAnalysis. The analytical system featured high signal stability, robustness, and a repeatability of 3% RSD (1st mode) and 0.8% (2nd mode) during shipboard application. Copyright 2009 Elsevier B.V. All rights reserved.

  20. Sequential injection analysis for automation of the Winkler methodology, with real-time SIMPLEX optimization and shipboard application

    Energy Technology Data Exchange (ETDEWEB)

    Horstkotte, Burkhard; Tovar Sanchez, Antonio; Duarte, Carlos M. [Department of Global Change Research, IMEDEA (CSIC-UIB) Institut Mediterrani d' Estudis Avancats, Miquel Marques 21, 07190 Esporles (Spain); Cerda, Victor, E-mail: Victor.Cerda@uib.es [University of the Balearic Islands, Department of Chemistry Carreterra de Valldemossa km 7.5, 07011 Palma de Mallorca (Spain)

    2010-01-25

    A multipurpose analyzer system based on sequential injection analysis (SIA) for the determination of dissolved oxygen (DO) in seawater is presented. Three operation modes were established and successfully applied onboard during a research cruise in the Southern Ocean: 1st, in-line execution of the entire Winkler method including precipitation of manganese(II) hydroxide, fixation of DO, precipitate dissolution by confluent acidification, and spectrophotometric quantification of the generated iodine/tri-iodide (I₂/I₃⁻); 2nd, spectrophotometric quantification of I₂/I₃⁻ in samples prepared according to the classical Winkler protocol; and 3rd, accurate batch-wise titration of I₂/I₃⁻ with thiosulfate using one syringe pump of the analyzer as an automatic burette. In the first mode, the zone stacking principle was applied to achieve high dispersion of the reagent solutions in the sample zone. Spectrophotometric detection was done at the isosbestic wavelength of I₂/I₃⁻, 466 nm. Highly reduced consumption of reagents and sample compared to the classical Winkler protocol, a linear response up to 16 mg L⁻¹ DO, and an injection frequency of 30 per hour were achieved. It is noteworthy that for the offline protocol, sample metering and quantification with a potentiometric titrator generally last over 5 min, without counting sample fixation, incubation, and glassware cleaning. The modified SIMPLEX methodology was used for the simultaneous optimization of four volumetric and two chemical variables. Vertex calculation and consequent application, including in-line preparation of one reagent, were carried out in real time using the software AutoAnalysis. The analytical system featured high signal stability, robustness, and a repeatability of 3% RSD (1st mode) and 0.8% (2nd mode) during shipboard application.

  1. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise, and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  2. Automated Power-Distribution System

    Science.gov (United States)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  3. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
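    As a toy illustration of the item-selection side of ATA: the item bank, content areas, and information values below are invented, and exhaustive search stands in for the mixed-integer solver used in practice.

```python
# Toy ATA-style item selection (not the authors' model): choose exactly 2
# items from a small bank to maximize total information at the target
# ability level, subject to covering both content areas. Brute force
# stands in for a mixed-integer programming solver.

from itertools import combinations

# (item id, content area, information at the target ability level)
bank = [("i1", "algebra", 0.8), ("i2", "algebra", 0.6),
        ("i3", "geometry", 0.7), ("i4", "geometry", 0.5)]

def best_form(bank, length=2):
    feasible = [
        form for form in combinations(bank, length)
        # Content constraint: the form must cover both areas.
        if {item[1] for item in form} == {"algebra", "geometry"}
    ]
    # Objective: maximize summed information over feasible forms.
    return max(feasible, key=lambda form: sum(item[2] for item in form))

form = best_form(bank)
ids = sorted(item[0] for item in form)
```

    Real item banks are far too large for enumeration, which is why mixed-integer programming is the standard formulation: the same constraints and objective are expressed over 0/1 selection variables and handed to a solver.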

  4. Consistent integrated automation. Optimized power plant control by means of IEC 61850; Durchgaengig automatisieren. Optimierte Kraftwerksleittechnik durch die Norm IEC 61850

    Energy Technology Data Exchange (ETDEWEB)

    Orth, J. [ABB AG, Mannheim (Germany). Geschaeftsbereich Power Generation

    2007-07-01

    Today's power plants are highly automated. All subsystems of large thermal power plants can be controlled from a central control room. The electrical systems are an important part. In future the new standard IEC 61850 will improve the integration of electrical systems into automation of power plants supporting the reduction of operation and maintenance cost. (orig.)

  5. Implementing Office Automation in Postsecondary Educational Institutions.

    Science.gov (United States)

    Creutz, Alan

    1984-01-01

    Three implementation strategies for office automation and decision support systems within postsecondary educational institutions--"natural evolution,""the total solution," and "coordinate evolution"--are identified. The components of an effective implementation plan are discussed. (Author/MLW)

  6. Automated hybrid closed-loop control with a proportional-integral-derivative based system in adolescents and adults with type 1 diabetes: individualizing settings for optimal performance.

    Science.gov (United States)

    Ly, Trang T; Weinzimer, Stuart A; Maahs, David M; Sherr, Jennifer L; Roy, Anirban; Grosman, Benyamin; Cantwell, Martin; Kurtz, Natalie; Carria, Lori; Messer, Laurel; von Eyben, Rie; Buckingham, Bruce A

    2017-08-01

    Automated insulin delivery systems, utilizing a control algorithm to dose insulin based upon subcutaneous continuous glucose sensor values and insulin pump therapy, will soon be available for commercial use. The objective of this study was to determine the preliminary safety and efficacy of initialization parameters with the Medtronic hybrid closed-loop controller by comparing percentage of time in range, 70-180 mg/dL (3.9-10 mmol/L), mean glucose values, as well as percentage of time above and below target range between sensor-augmented pump therapy and hybrid closed-loop, in adults and adolescents with type 1 diabetes. We studied an initial cohort of 9 adults followed by a second cohort of 15 adolescents, using the Medtronic hybrid closed-loop system with the proportional-integral-derivative with insulin feed-back (PID-IFB) algorithm. Hybrid closed-loop was tested in supervised hotel-based studies over 4-5 days. The overall mean percentage of time in range (70-180 mg/dL, 3.9-10 mmol/L) during hybrid closed-loop was 71.8% in the adult cohort and 69.8% in the adolescent cohort. The overall percentage of time spent under 70 mg/dL (3.9 mmol/L) was 2.0% in the adult cohort and 2.5% in the adolescent cohort. Mean glucose values were 152 mg/dL (8.4 mmol/L) in the adult cohort and 153 mg/dL (8.5 mmol/L) in the adolescent cohort. Closed-loop control using the Medtronic hybrid closed-loop system enables adaptive, real-time basal rate modulation. Initializing hybrid closed-loop in clinical practice will involve individualizing initiation parameters to optimize overall glucose control. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
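    The PID-IFB control law itself is proprietary, but the proportional-integral-derivative core it extends can be sketched generically. The gains, setpoint, and zero-clamping rule below are illustrative assumptions, not the device's parameters:

```python
# Generic discrete PID loop (illustrative only -- the Medtronic PID-IFB
# algorithm is proprietary; all gains here are hypothetical). Drives a
# measured glucose value toward a setpoint via an insulin delivery rate.

class PID:
    def __init__(self, kp, ki, kd, setpoint, dt=5.0):   # dt in minutes
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, measurement):
        error = measurement - self.setpoint   # above target -> dose more
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        # Clamp at zero: insulin delivery cannot be negative, only suspended.
        return max(0.0, self.kp * error + self.ki * self.integral + self.kd * deriv)

pid_a = PID(kp=0.02, ki=0.0005, kd=0.1, setpoint=120.0)  # mg/dL target
dose_high = pid_a.step(180.0)   # above target -> positive insulin rate
pid_b = PID(kp=0.02, ki=0.0005, kd=0.1, setpoint=120.0)
dose_low = pid_b.step(90.0)     # below target -> delivery suspended
```

    Individualizing initiation parameters, as the study describes, amounts to tuning such gains and limits per patient before the adaptive loop takes over.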

  7. Optimization of production and quality control of therapeutic radionuclides and radiopharmaceuticals. Final report of a co-ordinated research project 1994-1998

    International Nuclear Information System (INIS)

    1999-09-01

    The 'renaissance' of the therapeutic applications of radiopharmaceuticals during the last few years was in part due to a greater availability of radionuclides with appropriate nuclear decay properties, as well as to the development of carrier molecules with improved characteristics. Although radionuclides such as ³²P, ⁸⁹Sr and ¹³¹I were used from the early days of nuclear medicine in the late 1930s and early 1940s, the inclusion of other particle-emitting radionuclides into the nuclear medicine armamentarium was rather late. Only in the early 1980s did the specialized scientific literature start to show the potential for using other beta-emitting, nuclear-reactor-produced radionuclides such as ¹⁵³Sm, ¹⁶⁶Ho, ¹⁶⁵Dy and ¹⁸⁶⁻¹⁸⁸Re. Bone-seeking agents radiolabelled with the above-mentioned beta-emitting radionuclides demonstrated clear clinical potential in relieving intense bone pain resulting from metastases of the breast, prostate and lung of cancer patients. Therefore, upon the recommendation of a consultants meeting held in Vienna in 1993, the Co-ordinated Research Project (CRP) on Optimization of the Production and Quality Control of Radiotherapeutic Radionuclides and Radiopharmaceuticals was established in 1994. The CRP aimed at developing and improving existing laboratory protocols for the production of therapeutic radionuclides using existing nuclear research reactors, including the corresponding radiolabelling and quality control procedures, and validation in experimental animals. With the participation of ten scientists from IAEA Member States, several laboratory procedures for preparation and quality control were developed, tested and assessed as potential therapeutic radiopharmaceuticals for bone pain palliation. In particular, the CRP optimised the reactor production of ¹⁵³Sm and the preparation of the radiopharmaceutical ¹⁵³Sm-EDTMP (ethylene diamine tetramethylene phosphonate), as well as radiolabelling techniques and quality control methods for

  8. Optimal Real-time Dispatch for Integrated Energy Systems

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Guerrero, Josep M.; Rahimi-Kian, Ashkan

    2016-01-01

    into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and integrated communications architectures, it is possible to efficiently manage energy and comfort at the end-use location. In this paper, an ontology-driven multi-agent control system with intelligent optimizers is proposed for optimal real-time dispatch of an integrated building and microgrid system considering coordinated demand response (DR) and DERs management. The optimal dispatch problem is formulated as a mixed integer nonlinear programming problem (MINLP

  9. EPOS for Coordination of Asynchronous Sensor Webs

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop, integrate, and deploy software-based tools to coordinate asynchronous, distributed missions and optimize observation planning spanning simultaneous...

  10. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media, and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines and supports collection, storage, administration, processing, preservation, communication, etc.

  11. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  12. Automated campaign system

    Science.gov (United States)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    To run a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise brand standards, driving through production and fulfillment, and evaluating results; all processes are currently performed by experienced highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.

  13. Coordination cycles

    Czech Academy of Sciences Publication Activity Database

    Steiner, Jakub

    2008-01-01

    Roč. 63, č. 1 (2008), s. 308-327 ISSN 0899-8256 Institutional research plan: CEZ:AV0Z70850503 Keywords : global games * coordination * crises * cycles and fluctuations Subject RIV: AH - Economics Impact factor: 1.333, year: 2008

  14. Coordination cycles

    Czech Academy of Sciences Publication Activity Database

    Steiner, Jakub

    -, č. 274 (2005), s. 1-26 ISSN 1211-3298 Institutional research plan: CEZ:AV0Z70850503 Keywords : coordination * crises * cycles and fluctuations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp274.pdf

  15. Comparison and Assessment of a Multiple Optimal Coordinated Design Based on PSS and the STATCOM Device for Damping Power System Oscillations

    OpenAIRE

    A.N. Hussain; F. Malek; M.A. Rashid; L. Mohamed

    2014-01-01

    The aim of this study is to present a comprehensive comparison and assessment of the damping function improvement for multiple damping stabilizers using a simultaneously coordinated design based on the Power System Stabilizer (PSS) and the Static Synchronous Compensator (STATCOM). In an electrical power system, the STATCOM device is used to support bus voltage by compensating reactive power; it is also capable of enhancing the stability of the power system by adding a supplementary damping sta...

  16. Automated analysis of images acquired with electronic portal imaging device during delivery of quality assurance plans for inversely optimized arc therapy

    DEFF Research Database (Denmark)

    Fredh, Anna; Korreman, Stine; Rosenschöld, Per Munck af

    2010-01-01

    This work presents an automated method for comprehensively analyzing EPID images acquired for quality assurance of RapidArc treatment delivery. In-house-developed software has been used for the analysis and long-term results from measurements on three linacs are presented....

  17. Reagent and labor cost optimization through automation of fluorescence in situ hybridization (FISH) with the VP 2000: an Italian case study.

    Science.gov (United States)

    Zanatta, Lucia; Valori, Laura; Cappelletto, Eleonora; Pozzebon, Maria Elena; Pavan, Elisabetta; Dei Tos, Angelo Paolo; Merkle, Dennis

    2015-02-01

    In the modern molecular diagnostic laboratory, cost considerations are of paramount importance. Automation of complex molecular assays not only allows a laboratory to accommodate higher test volumes and throughput but also has a considerable impact on the cost of testing from the perspective of reagent costs, as well as hands-on time for skilled laboratory personnel. The following study tracked the cost of labor (hands-on time) and reagents for fluorescence in situ hybridization (FISH) testing in a routine, high-volume pathology and cytogenetics laboratory in Treviso, Italy, over a 2-y period (2011-2013). The laboratory automated FISH testing with the VP 2000 Processor, a deparaffinization, pretreatment, and special staining instrument produced by Abbott Molecular, and compared hands-on time and reagent costs to manual FISH testing. The results indicated significant cost and time saving when automating FISH with VP 2000 when more than six FISH tests were run per week. At 12 FISH assays per week, an approximate total cost reduction of 55% was observed. When running 46 FISH specimens per week, the cost saving increased to 89% versus manual testing. The results demonstrate that the VP 2000 processor can significantly reduce the cost of FISH testing in diagnostic laboratories. © 2014 Society for Laboratory Automation and Screening.

  18. Rhythms of dialogue in infancy: coordinated timing in development.

    Science.gov (United States)

    Jaffe, J; Beebe, B; Feldstein, S; Crown, C L; Jasnow, M D

    2001-01-01

    Although theories of early social development emphasize the advantage of mother-infant rhythmic coupling and bidirectional coordination, empirical demonstrations remain sparse. We therefore test the hypothesis that vocal rhythm coordination at age 4 months predicts attachment and cognition at age 12 months. Partner and site novelty were studied by recording mother-infant, stranger-infant, and mother-stranger face-to-face interactions in both home and laboratory sites for 88 4-month-old infants, for a total of 410 recordings. An automated dialogic coding scheme, appropriate to the nonperiodic rhythms of our data, implemented a systems concept of every action as jointly produced by both partners. Adult-infant coordination at age 4 months indeed predicted both outcomes at age 12 months, but midrange degree of mother-infant and stranger-infant coordination was optimal for attachment (Strange Situation), whereas high ("tight") stranger-infant coordination in the lab was optimal for cognition (Bayley Scales). Thus, high coordination can index more or less optimal outcomes, as a function of outcome measure, partner, and site. Bidirectional coordination patterns were salient in both attachment and cognition predictions. Comparison of mother-infant and stranger-infant interactions was particularly informative, suggesting the dynamics of infants' early differentiation from mothers. Stranger and infant showed different patterns of vocal rhythm activity level, were more bidirectional, accounted for 8 times more variance in Bayley scores, predicted attachment just as well as mother and infant, and revealed more varied contingency structures and a wider range of attachment outcomes. To explain why vocal timing measures at age 4 months predict outcomes at age 12 months, our dialogue model was construed as containing procedures for regulating the pragmatics of proto-conversation. The timing patterns of the 4-month-olds were seen as procedural or performance knowledge, and as

  19. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

    Overview From a technical perspective, CMS has been in “beam operation” state since 6th November. The detector is fully closed with all components operational and the magnetic field is normally at the nominal 3.8T. The UXC cavern is normally closed with the radiation veto set. Access to UXC is now only possible during downtimes of LHC. Such accesses must be carefully planned, documented and carried out in agreement with CMS Technical Coordination, Experimental Area Management, LHC programme coordination and the CCC. Material flow in and out of UXC is now strictly controlled. Access to USC remains possible at any time, although, for safety reasons, it is necessary to register with the shift crew in the control room before going down. It is obligatory for all material leaving UXC to pass through the underground buffer zone for RP scanning, database entry and appropriate labeling for traceability. Technical coordination (notably Stephane Bally and Christoph Schaefer), the shift crew and run ...

  20. Design automation, languages, and simulations

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    As the complexity of electronic systems continues to increase, the micro-electronic industry depends upon automation and simulations to adapt quickly to market changes and new technologies. Compiled from chapters contributed to CRC's best-selling VLSI Handbook, this volume covers a broad range of topics relevant to design automation, languages, and simulations. These include a collaborative framework that coordinates distributed design activities through the Internet, an overview of the Verilog hardware description language and its use in a design environment, hardware/software co-design, syst

  1. Application of an automation system and a supervisory control and data acquisition (SCADA) system for the optimal operation of a membrane adsorption hybrid system.

    Science.gov (United States)

    Smith, P J; Vigneswaran, S; Ngo, H H; Nguyen, H T; Ben-Aim, R

    2006-01-01

    The application of automation and supervisory control and data acquisition (SCADA) systems to municipal water and wastewater treatment plants is rapidly increasing. However, these systems are applied less frequently in the research and development phases of the emerging treatment technologies used in these industries. This study involved implementing automation and a SCADA system on the submerged membrane adsorption hybrid system for use in a semi-pilot-scale research project. An incremental approach was used in developing the automation and SCADA systems, leading to two new control systems. The first involved closed-loop control of backwash initiation based upon a pressure increase, improving productivity because the backwash is activated only when required, not at a fixed time. This system resulted in a 40% reduction in the number of backwashes required and also enabled optimised operation under unsteady wastewater concentrations. The second involved closed-loop control of backwash duration, whereby the backwash was terminated when the pressure reached a steady state. This system reduced the duration of the backwash by up to 25% and enabled optimised operation as the foulant build-up within the reactor increased.
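    The two control loops described above reduce to simple rules over the pressure signal. In this sketch, the 10 kPa trigger increment and 0.2 kPa steady-state tolerance are hypothetical illustration values, not figures from the study:

```python
# Sketch of the two closed-loop backwash rules described above. The 10 kPa
# trigger and 0.2 kPa steady-state tolerance are invented for illustration.

def needs_backwash(baseline_kpa, current_kpa, rise_kpa=10.0):
    """Rule 1: initiate a backwash only once transmembrane pressure has
    risen by a set increment above its post-backwash baseline."""
    return current_kpa - baseline_kpa >= rise_kpa

def backwash_done(pressure_log, window=3, tol_kpa=0.2):
    """Rule 2: terminate the backwash when the last few pressure
    readings have levelled off (reached steady state)."""
    if len(pressure_log) < window:
        return False
    recent = pressure_log[-window:]
    return max(recent) - min(recent) <= tol_kpa

not_yet = needs_backwash(20.0, 25.0)             # 5 kPa rise: keep filtering
trigger = needs_backwash(20.0, 31.0)             # 11 kPa rise: backwash now
done = backwash_done([30.0, 25.1, 25.0, 25.1])   # pressure has plateaued
```

    Triggering on the pressure signal rather than on a fixed timer is what allows the backwash count to fall when fouling is slow, the effect behind the reported 40% reduction.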

  2. Coordinated unbundling

    DEFF Research Database (Denmark)

    Timmermans, Bram; Zabala-Iturriagagoitia, Jon Mikel

    2013-01-01

    Public procurement for innovation is a matter of using public demand to trigger innovation. Empirical studies have demonstrated that demand-based policy instruments can be considered a powerful tool in stimulating innovative processes among existing firms; however, the existing literature has not focused on the role this policy instrument can play in the promotion of (knowledge-intensive) entrepreneurship. This paper investigates this link in more detail and introduces the concept of coordinated unbundling as a strategy that can facilitate this purpose. We also present a framework on how to organise public procurement for innovation around this unbundling strategy and provide a set of challenges that need to be addressed.

  3. TECHNICAL COORDINATION

    CERN Multimedia

    A. Ball

    2010-01-01

    Operational Experience At the end of the first full-year running period of LHC, CMS is established as a reliable, robust and mature experiment. In particular common systems and infrastructure faults accounted for <0.6 % CMS downtime during LHC pp physics. Technical operation throughout the entire year was rather smooth, the main faults requiring UXC access being sub-detector power systems and rack-cooling turbines. All such problems were corrected during scheduled technical stops, in the shadow of tunnel access needed by the LHC, or in negotiated accesses or access extensions. Nevertheless, the number of necessary accesses to the UXC averaged more than one per week and the technical stops were inevitably packed with work packages, typically 30 being executed within a few days, placing a high load on the coordination and area management teams. It is an appropriate moment for CMS Technical Coordination to thank all those in many CERN departments and in the Collaboration, who were involved in CMS techni...

  4. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  5. Automated External Defibrillator

    Science.gov (United States)

    ... To Health Topics / Automated External Defibrillator Automated External Defibrillator Also known as What Is An automated external ... in survival. Training To Use an Automated External Defibrillator Learning how to use an AED and taking ...

  6. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  7. RUN COORDINATION

    CERN Multimedia

    C. Delaere

    2012-01-01

  With the analysis of the first 5 fb⁻¹ culminating in the announcement of the observation of a new particle with mass of around 126 GeV/c², the CERN directorate decided to extend the LHC run until February 2013. This adds three months to the original schedule. Since then the LHC has continued to perform extremely well, and the total luminosity delivered so far this year is 22 fb⁻¹. CMS also continues to perform excellently, recording data with efficiency higher than 95% for fills with the magnetic field at nominal value. The highest instantaneous luminosity achieved by LHC to date is 7.6×10³³ cm⁻²s⁻¹, which translates into 35 interactions per crossing. On the CMS side there has been a lot of work to handle these extreme conditions, such as a new DAQ computer farm and trigger menus to handle the pile-up, automation of recovery procedures to minimise the lost luminosity, better training for the shift crews, etc. We did suffer from a couple of infrastructure ...

8. Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis

    DEFF Research Database (Denmark)

    Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei

    2018-01-01

    to optimize the controller coordination based on the Pareto optimization theory. Three solutions are obtained through optimization, which includes the optimal torque solution, optimal power solution, and satisfactory solution. Detailed comparisons evaluate the performance of the three selected solutions...
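The Pareto selection the record describes (filtering candidate controller settings down to the non-dominated trade-off set) can be sketched as below. The objective pairs and the idea of ranking torque error against power error are illustrative assumptions, not data from the paper:

```python
def pareto_front(points):
    """Return the non-dominated subset of 2-objective points (both minimized).

    A point p is dominated if some other point q is <= p in every objective
    and strictly < in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            all(o <= v for o, v in zip(q, p)) and any(o < v for o, v in zip(q, p))
            for q in points if q is not p
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (torque error, power error) pairs for candidate controller gains
candidates = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (5.0, 5.0)]
print(sorted(pareto_front(candidates)))  # -> [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```

The three solutions mentioned in the abstract (optimal torque, optimal power, satisfactory compromise) would correspond to different points picked from such a front.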

  9. RUN COORDINATION

    CERN Multimedia

    C. Delaere

    2013-01-01

    Since the LHC ceased operations in February, a lot has been going on at Point 5, and Run Coordination continues to monitor closely the advance of maintenance and upgrade activities. In the last months, the Pixel detector was extracted and is now stored in the pixel lab in SX5; the beam pipe has been removed and ME1/1 removal has started. We regained access to the vactank and some work on the RBX of HB has started. Since mid-June, electricity and cooling are back in S1 and S2, allowing us to turn equipment back on, at least during the day. 24/7 shifts are not foreseen in the next weeks, and safety tours are mandatory to keep equipment on overnight, but re-commissioning activities are slowly being resumed. Given the (slight) delays accumulated in LS1, it was decided to merge the two global runs initially foreseen into a single exercise during the week of 4 November 2013. The aim of the global run is to check that we can run (parts of) CMS after several months switched off, with the new VME PCs installed, th...

  10. RUN COORDINATION

    CERN Multimedia

    Christophe Delaere

    2013-01-01

    The focus of Run Coordination during LS1 is to monitor closely the advance of maintenance and upgrade activities, to smooth interactions between subsystems and to ensure that all are ready in time to resume operations in 2015 with a fully calibrated and understood detector. After electricity and cooling were restored to all equipment, at about the time of the last CMS week, recommissioning activities were resumed for all subsystems. On 7 October, DCS shifts began 24/7 to allow subsystems to remain on to facilitate operations. That culminated with the Global Run in November (GriN), which   took place as scheduled during the week of 4 November. The GriN has been the first centrally managed operation since the beginning of LS1, and involved all subdetectors but the Pixel Tracker presently in a lab upstairs. All nights were therefore dedicated to long stable runs with as many subdetectors as possible. Among the many achievements in that week, three items may be highlighted. First, the Strip...

  11. Multi-Agent Systems for Transportation Planning and Coordination

    NARCIS (Netherlands)

    J.M. Moonen (Hans)

    2009-01-01

    textabstractMany transportation problems are in fact coordination problems: problems that require communication, coordination and negotiation to be optimally solved. However, most software systems targeted at transportation have never approached it this way, and have instead concentrated on

12. Lighting Automation - Flying an Earthlike Habitat Project

    Science.gov (United States)

    Falker, Jay; Howard, Ricky; Culbert, Christopher; Clark, Toni Anne; Kolomenski, Andrei

    2017-01-01

Our proposal will enable the development of automated spacecraft habitats for long duration missions. The majority of spacecraft lighting systems employ lamps or zone-specific switches and dimmers; automation is not in the "picture". If we are to build long-duration environments that provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. To transform how spacecraft lighting environments are automated, we will provide performance data on a standard lighting communication protocol. We will investigate utilization and application of an industry-accepted lighting control protocol, DMX512. We will demonstrate how lighting automation can conserve power, assist with lighting countermeasures, and utilize spatial body tracking. By using DMX512 we will prove that the "wheel" does not need to be reinvented in terms of smart lighting and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
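As a concrete illustration of the DMX512 framing the record refers to: a DMX512 packet carries a NULL start code (0x00) followed by up to 512 one-byte channel levels. The sketch below builds that data portion; the fixture-to-channel assignment is hypothetical:

```python
def dmx_frame(levels):
    """Build the data portion of a DMX512 packet: a NULL start code (0x00)
    followed by up to 512 one-byte channel levels (0-255)."""
    if len(levels) > 512:
        raise ValueError("DMX512 carries at most 512 channels per universe")
    if any(not 0 <= v <= 255 for v in levels):
        raise ValueError("channel levels are single bytes (0-255)")
    return bytes([0x00] + list(levels))

# Hypothetical patch: channel 1 dimmed to 50%, channel 2 full on
frame = dmx_frame([128, 255])
print(frame.hex())  # -> "0080ff"
```

On the wire each frame is additionally preceded by a break and mark-after-break on the RS-485 line; that physical-layer detail is outside this sketch.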

  13. Automated curved planar reformation of 3D spine images

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2005-01-01

    Traditional techniques for visualizing anatomical structures are based on planar cross-sections from volume images, such as images obtained by computed tomography (CT) or magnetic resonance imaging (MRI). However, planar cross-sections taken in the coordinate system of the 3D image often do not provide sufficient or qualitative enough diagnostic information, because planar cross-sections cannot follow curved anatomical structures (e.g. arteries, colon, spine, etc). Therefore, not all of the important details can be shown simultaneously in any planar cross-section. To overcome this problem, reformatted images in the coordinate system of the inspected structure must be created. This operation is usually referred to as curved planar reformation (CPR). In this paper we propose an automated method for CPR of 3D spine images, which is based on the image transformation from the standard image-based to a novel spine-based coordinate system. The axes of the proposed spine-based coordinate system are determined on the curve that represents the vertebral column, and the rotation of the vertebrae around the spine curve, both of which are described by polynomial models. The optimal polynomial parameters are obtained in an image analysis based optimization framework. The proposed method was qualitatively and quantitatively evaluated on five CT spine images. The method performed well on both normal and pathological cases and was consistent with manually obtained ground truth data. The proposed spine-based CPR benefits from reduced structural complexity in favour of improved feature perception of the spine. The reformatted images are diagnostically valuable and enable easier navigation, manipulation and orientation in 3D space. Moreover, reformatted images may prove useful for segmentation and other image analysis tasks
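The polynomial spine-curve model described above can be sketched as follows: fit low-degree polynomials x(z), y(z) through vertebral centre points and sample the resulting centreline, along which reformatted cross-sections would be taken. The centroid coordinates and degree are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fit_spine_curve(z, x, y, degree=3):
    """Fit polynomial models x(z), y(z) through vertebral centre points,
    giving a smooth curve to resample cross-sections along (CPR)."""
    px = np.polynomial.Polynomial.fit(z, x, degree)
    py = np.polynomial.Polynomial.fit(z, y, degree)
    return px, py

# Hypothetical centroids (mm) of six vertebral bodies along the z axis
z = np.array([0.0, 30.0, 60.0, 90.0, 120.0, 150.0])
x = np.array([1.0, 4.0, 6.0, 5.0, 3.0, 1.5])    # lateral curvature
y = np.array([0.0, 2.0, 5.0, 9.0, 14.0, 20.0])  # sagittal curvature
px, py = fit_spine_curve(z, x, y)

zs = np.linspace(z[0], z[-1], 5)
centerline = np.column_stack([px(zs), py(zs), zs])  # points to reformat along
print(centerline.shape)  # -> (5, 3)
```

In the paper the polynomial parameters are not fitted to hand-picked centroids but found by an image-analysis-based optimization, and a second model handles vertebral rotation about the curve; this sketch only shows the centreline part.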

  14. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system...... that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system......, (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  15. Optimization of automated segmentation of monkeypox virus-induced lung lesions from normal lung CT images using hard C-means algorithm

    Science.gov (United States)

    Castro, Marcelo A.; Thomasson, David; Avila, Nilo A.; Hufton, Jennifer; Senseney, Justin; Johnson, Reed F.; Dyall, Julie

    2013-03-01

    Monkeypox virus is an emerging zoonotic pathogen that results in up to 10% mortality in humans. Knowledge of clinical manifestations and temporal progression of monkeypox disease is limited to data collected from rare outbreaks in remote regions of Central and West Africa. Clinical observations show that monkeypox infection resembles variola infection. Given the limited capability to study monkeypox disease in humans, characterization of the disease in animal models is required. A previous work focused on the identification of inflammatory patterns using PET/CT image modality in two non-human primates previously inoculated with the virus. In this work we extended techniques used in computer-aided detection of lung tumors to identify inflammatory lesions from monkeypox virus infection and their progression using CT images. Accurate estimation of partial volumes of lung lesions via segmentation is difficult because of poor discrimination between blood vessels, diseased regions, and outer structures. We used hard C-means algorithm in conjunction with landmark based registration to estimate the extent of monkeypox virus induced disease before inoculation and after disease progression. Automated estimation is in close agreement with manual segmentation.
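Hard C-means is the classical k-means scheme: alternate nearest-centroid assignment and centroid recomputation. A minimal 1-D sketch on intensity values is shown below; the Hounsfield-unit numbers are toy values chosen to separate aerated lung from dense lesion, not data from the study:

```python
import numpy as np

def hard_c_means(values, c=2, iters=50, seed=0):
    """Plain hard C-means (k-means) on scalar intensities: alternate
    nearest-centroid assignment and centroid update until convergence."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(values, size=c, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        new = np.array([values[labels == k].mean() if np.any(labels == k)
                        else centroids[k] for k in range(c)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Toy CT intensities (HU): aerated lung around -800, dense lesion around -100
hu = np.array([-820, -790, -805, -110, -95, -120, -780, -100], dtype=float)
labels, centroids = hard_c_means(hu, c=2)
print(np.sort(centroids))  # two cluster centres: lung vs lesion
```

In the actual pipeline the clustering runs on full 3-D CT volumes and is combined with landmark-based registration to compare pre- and post-inoculation scans.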

  16. Automated optimization of JPEG 2000 encoder options based on model observer performance for detecting variable signals in X-ray coronary angiograms.

    Science.gov (United States)

    Zhang, Yani; Pham, Binh T; Eckstein, Miguel P

    2004-04-01

    Image compression is indispensable in medical applications where inherently large volumes of digitized images are presented. JPEG 2000 has recently been proposed as a new image compression standard. The present recommendations on the choice of JPEG 2000 encoder options were based on nontask-based metrics of image quality applied to nonmedical images. We used the performance of a model observer [non-prewhitening matched filter with an eye filter (NPWE)] in a visual detection task of varying signals [signal known exactly but variable (SKEV)] in X-ray coronary angiograms to optimize JPEG 2000 encoder options through a genetic algorithm procedure. We also obtained the performance of other model observers (Hotelling, Laguerre-Gauss Hotelling, channelized-Hotelling) and human observers to evaluate the validity of the NPWE optimized JPEG 2000 encoder settings. Compared to the default JPEG 2000 encoder settings, the NPWE-optimized encoder settings improved the detection performance of humans and the other three model observers for an SKEV task. In addition, the performance also was improved for a more clinically realistic task where the signal varied from image to image but was not known a priori to observers [signal known statistically (SKS)]. The highest performance improvement for humans was at a high compression ratio (e.g., 30:1) which resulted in approximately a 75% improvement for both the SKEV and SKS tasks.
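The genetic-algorithm search over encoder options can be sketched generically as below. This is a minimal GA (truncation selection, uniform crossover, Gaussian mutation), not the authors' exact procedure, and the fitness function is a stand-in for the NPWE model-observer detection score, peaking at an arbitrary point:

```python
import random

def genetic_search(fitness, bounds, pop=20, gens=40, seed=1):
    """Minimal genetic algorithm: truncation selection, uniform crossover,
    Gaussian mutation. `fitness` stands in for the model-observer detection
    score of one encoder setting; `bounds` gives (lo, hi) per parameter."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    def mutate(ind):
        return [min(hi, max(lo, g + rng.gauss(0, 0.1 * (hi - lo))))
                for g, (lo, hi) in zip(ind, bounds)]
    population = [rand_ind() for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(population, key=fitness, reverse=True)[:pop // 2]
        children = []
        while len(children) < pop:
            a, b = rng.sample(parents, 2)
            child = [g1 if rng.random() < 0.5 else g2 for g1, g2 in zip(a, b)]
            children.append(mutate(child))
        population = children
    return max(population, key=fitness)

# Stand-in fitness peaking at (0.3, 0.7); not a real observer model
best = genetic_search(lambda p: -((p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2),
                      bounds=[(0.0, 1.0), (0.0, 1.0)])
print([round(g, 2) for g in best])
```

In the study each individual would encode JPEG 2000 options (e.g. wavelet and code-block choices), and evaluating the fitness means compressing angiograms and scoring the model observer's detection performance.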

  17. Adult congenital heart disease nurse coordination: Essential skills and role in optimizing team-based care a position statement from the International Society for Adult Congenital Heart Disease (ISACHD).

    Science.gov (United States)

    Sillman, Christina; Morin, Joanne; Thomet, Corina; Barber, Deena; Mizuno, Yoshiko; Yang, Hsiao-Ling; Malpas, Theresa; Flocco, Serena Francesca; Finlay, Clare; Chen, Chi-Wen; Balon, Yvonne; Fernandes, Susan M

    2017-02-15

    Founded in 1992, the International Society for Adult Congenital Heart Disease (ISACHD) is the leading global organization of professionals dedicated to pursuing excellence in the care of adults with congenital heart disease (CHD) worldwide. Among ISACHD's objectives is to "promote a holistic team-based approach to the care of the adult with CHD that is comprehensive, patient-centered, and interdisciplinary" (http://www.isachd.org). This emphasis on team-based care reflects the fact that adults with CHD constitute a heterogeneous population with a wide spectrum of disease complexity, frequent association with other organ involvement, and varied co-morbidities and psychosocial issues. Recognizing the vital role of the adult CHD (ACHD) nurse coordinator (ACHD-NC) in optimizing team-based care, ISACHD established a task force to elucidate and provide guidance on the roles and responsibilities of the ACHD-NC. Acknowledging that nursing roles can vary widely from region to region based on factors such as credentials, scopes of practice, regulations, and local culture and tradition, an international panel was assembled with experts from North America, Europe, East Asia, and Oceania. The writing committee was tasked with reviewing key aspects of the ACHD-NC's role in team-based ACHD care. The resulting ISACHD position statement addresses the ACHD-NC's role and skills required in organizing, coordinating, and facilitating the care of adults with CHD, holistic assessment of the ACHD patient, patient education and counseling, and support for self-care management and self-advocacy. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  18. Autonomous Systems: Habitat Automation

    Data.gov (United States)

National Aeronautics and Space Administration — The Habitat Automation Project Element within the Autonomous Systems Project is developing software to automate the operation of habitats and other spacecraft. This...

  19. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  20. EPOS for Coordination of Asynchronous Sensor Webs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop, integrate, and deploy software-based tools to coordinate asynchronous, distributed missions and optimize observation planning spanning simultaneous...

  1. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  2. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  3. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

Full Text Available The automation of the marketing process seems nowadays to be the only way to cope with the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation and the demand for customized products and services on one side, and the need for constructive dialogue with customers, immediate and flexible response, and measurable investments and results on the other, the classical marketing approach has changed and continues to evolve substantially.

  4. Design automation for integrated optics

    Science.gov (United States)

    Condrat, Christopher

    Recent breakthroughs in silicon photonics technology are enabling the integration of optical devices into silicon-based semiconductor processes. Photonics technology enables high-speed, high-bandwidth, and high-fidelity communications on the chip-scale---an important development in an increasingly communications-oriented semiconductor world. Significant developments in silicon photonic manufacturing and integration are also enabling investigations into applications beyond that of traditional telecom: sensing, filtering, signal processing, quantum technology---and even optical computing. In effect, we are now seeing a convergence of communications and computation, where the traditional roles of optics and microelectronics are becoming blurred. As the applications for opto-electronic integrated circuits (OEICs) are developed, and manufacturing capabilities expand, design support is necessary to fully exploit the potential of this optics technology. Such design support for moving beyond custom-design to automated synthesis and optimization is not well developed. Scalability requires abstractions, which in turn enables and requires the use of optimization algorithms and design methodology flows. Design automation represents an opportunity to take OEIC design to a larger scale, facilitating design-space exploration, and laying the foundation for current and future optical applications---thus fully realizing the potential of this technology. This dissertation proposes design automation for integrated optic system design. Using a building-block model for optical devices, we provide an EDA-inspired design flow and methodologies for optical design automation. Underlying these flows and methodologies are new supporting techniques in behavioral and physical synthesis, as well as device-resynthesis techniques for thermal-aware system integration. We also provide modeling for optical devices and determine optimization and constraint parameters that guide the automation

  5. Optimization and studies of the welding processes, automation of the sealing welding system and fracture mechanics in the vessels surveillance in nuclear power plants

    International Nuclear Information System (INIS)

    Gama R, G.

    2011-01-01

This work describes the optimization of two welding systems, as well as the completion of a container-sealing system at the National Institute of Nuclear Research, with application in the surveillance programs of nuclear reactor vessels and the corresponding extension of their operating licenses. Charpy specimens are tested to evaluate the degree of embrittlement, obtaining the increase in the reference temperature and the decrease in the maximum absorbed energy along the ductile-brittle transition curve of the material. Each test yields two specimen halves that can be reused to continue the vessel surveillance and support a possible license extension; this is achieved by reconstitution (obtaining two reconstituted specimens from one tested specimen). The welding system for the reconstitution of Charpy specimens was optimized by reducing the joining force at the weld, eliminating rejections due to lack of penetration caused by spill. For this work, temperature measurements were carried out at distances of 1 to 12 mm from the welding interface, yielding temperature profiles. From the maxima, a graph and an equation were obtained representing the maximum temperature as a function of distance from the interface, making further temperature measurements unnecessary in practice. The reconstituted specimens were placed inside containers pressurized to 1 atmosphere with ultra-high-purity helium. This was done in the container-sealing welding system, where an automatic process was implemented by means of an application developed in LabVIEW, reducing operation times and allowing remote control of the process and of the acquisition parameters, as well as the generation of welding reports, thereby avoiding human error. (Author)
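The temperature-versus-distance relation the record describes (peak temperatures measured 1 to 12 mm from the weld interface, summarized by a fitted equation) can be sketched as a log-linear fit of an exponential decay. The decay constant, amplitude and noise level below are illustrative assumptions, not the measured data:

```python
import numpy as np

# Hypothetical peak temperatures (deg C) measured 1-12 mm from the weld
# interface; an exponential decay T = A * exp(k * d), k < 0, is fitted
# in log space with a degree-1 polynomial fit.
d = np.arange(1, 13, dtype=float)  # distance from interface, mm
rng = np.random.default_rng(3)
T = 900.0 * np.exp(-0.25 * d) * (1 + 0.02 * rng.standard_normal(12))

k, logA = np.polyfit(d, np.log(T), 1)  # slope k, intercept log(A)
A = np.exp(logA)
predict = lambda dist: A * np.exp(k * dist)

print(round(predict(5.0), 1))  # estimated peak temperature 5 mm from interface
```

Once such a fit is in hand, the temperature at any intermediate distance can be read off the equation instead of being measured, which is the practical simplification the abstract reports.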

  6. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Scanning probe microscopy (SPM) techniques rely on computer recordings of interactions between the tip of a minute probe and the surface of the small specimen as a function of position; the measurements are used to depict an image of the atomic-scale surface topography on the computer screen....... Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...... electrochemical measurements as well as elemental analysis look very promising for elucidating corrosion reaction mechanisms. The study of initial surface reactions at the atomic or submicron level is becoming an important field of research in the understanding of corrosion processes. At present, mainly two...

  7. Multi Satellite Cooperative and Non-Cooperative Trajectory Coordination

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a framework to optimize the coordination of multiple spacecraft, each with defined goals. Using this framework, optimal...

  8. Control coordination abilities in shock combat sports

    Directory of Open Access Journals (Sweden)

    Natalya Boychenko

    2014-12-01

Full Text Available Purpose: to optimize the monitoring of the level of coordination abilities in martial arts. Material and Methods: analysis and compilation of the scientific and methodological literature, interviews with coaches of striking combat sports, video analysis of techniques, and teacher observations. Results: specific types of coordination abilities in striking combat sports were identified, and specific and nonspecific tests were selected and proposed to monitor the level of these coordination abilities in athletes. Conclusion: it is determined that, in order to achieve victory in a bout, martial artists must orient themselves in space, be able to assess and manage the dynamic and spatio-temporal parameters of movements, maintain balance, and have highly coordinated movements. The proposed tests for monitoring specific coordination abilities allow an objective assessment not only of the overall level of coordination, but also of the level of specific manifestations of this ability.

  9. Optimal trading based on ideal coordination

    International Nuclear Information System (INIS)

    Egeland, H.

    1992-01-01

    The author places more emphasis on the technical than on the economical aspects of trading with electric power. Calculation models which can be used to study this trade taking place under the influence of unequal preconditions in Denmark, Finland, Norway and Sweden are presented. It is suggested that trade between these countries is currently satisfactory and should be further developed. The advantages of standard contracts are mentioned. Forms of exchange of, for example, technology know-how between these Nordic countries in the process of connecting their distribution systems etc. would be most advantageous. (AB)

  10. Optimization of Aimpoints for Coordinate Seeking Weapons

    Science.gov (United States)

    2015-09-01

Both dependent (aiming) and independent (ballistic) errors are taken into account, before utilizing each of the three damage functions representing the weapon. A Monte-Carlo simulation method is used to calculate the probability of damage. Acronyms: RCC, Rectangular Cookie Cutter; RDF, Rectangular Damage Function; REP, Range Error Probable; SSPD, Single Sortie Probability of Damage; TLE, Target Location Error.
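The Monte-Carlo structure referred to above (one shared aiming error per sortie, an independent ballistic error per weapon, then a damage function applied to each impact) can be sketched as below, here with a circular cookie-cutter damage function; all numeric parameters are illustrative assumptions:

```python
import random, math

def sspd_monte_carlo(aimpoint, target, sigma_aim, sigma_ballistic,
                     lethal_radius, n_weapons=4, trials=20000, seed=7):
    """Monte-Carlo estimate of single-sortie probability of damage (SSPD):
    one shared (dependent) aiming error per sortie, an independent ballistic
    error per weapon, and a cookie-cutter damage function (damage iff the
    impact point falls within lethal_radius of the target)."""
    rng = random.Random(seed)
    kills = 0
    for _ in range(trials):
        ax = rng.gauss(0, sigma_aim)  # dependent (aiming) error, shared
        ay = rng.gauss(0, sigma_aim)  # by all weapons in the sortie
        hit = False
        for _ in range(n_weapons):
            ix = aimpoint[0] + ax + rng.gauss(0, sigma_ballistic)
            iy = aimpoint[1] + ay + rng.gauss(0, sigma_ballistic)
            if math.hypot(ix - target[0], iy - target[1]) <= lethal_radius:
                hit = True
        kills += hit
    return kills / trials

p = sspd_monte_carlo(aimpoint=(0.0, 0.0), target=(0.0, 0.0),
                     sigma_aim=20.0, sigma_ballistic=10.0, lethal_radius=15.0)
print(round(p, 3))  # probability that at least one weapon damages the target
```

Optimizing the aimpoints would then mean searching over `aimpoint` offsets (one per weapon, in the multi-aimpoint case) to maximize this estimate.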

  11. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  12. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  13. Coordinators in Safaliba

    OpenAIRE

    Bodua-Mango, Kenneth

    2012-01-01

This study examines the Safaliba coordinators „ní‟ / „aní‟, „á‟, „ka‟, „chɛ‟ and „bíí‟ in their naturally occurring environments. Safaliba is a Gur language spoken by some 5000–7000 people in the north-western part of Ghana. The main areas of study include the syntactic categories that each coordinator can coordinate, the semantic properties of each of the coordinators, and the pragmatic effect that the use of these coordinators can have. Combinations of the individual coordinators calle...

  14. Laboratory automation in clinical bacteriology: what system to choose?

    Science.gov (United States)

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Automation Framework for Flight Dynamics Products Generation

    Science.gov (United States)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite ToolKit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  16. Coordination control of distributed systems

    CERN Document Server

    Villa, Tiziano

    2015-01-01

This book describes how control of distributed systems can be advanced by an integration of control, communication, and computation. The global control objectives are met by judicious combinations of local and nonlocal observations taking advantage of various forms of communication exchanges between distributed controllers. Control architectures are considered according to increasing degrees of cooperation of local controllers: fully distributed or decentralized control, control with communication between controllers, coordination control, and multilevel control. The book also covers topics bridging computer science, communication, and control, like communication for control of networks, average consensus for distributed systems, and modeling and verification of discrete and of hybrid systems. Examples and case studies are introduced in the first part of the text and developed throughout the book. They include: control of underwater vehicles, automated-guided vehicles on a container terminal, contro...

  17. Dynamic adaptive service architecture – Towards coordinated service composition

    OpenAIRE

    Pahl, Claus

    2010-01-01

    peer-reviewed With software services becoming a strategic capability for the software sector, software architecture needs to address integration problems to help services to collaborate and coordinate their activities. The increasing need to address dynamic and automated changes can be answered by a service coordination architecture with event-based collaboration that enables dynamic and adaptive architectures. Intelligent service and process identification and adaptation techniques are su...

  18. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that, in the future, the International Civil Aviation Organization (ICAO) RPAS Panel avoid the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  19. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  20. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  1. Automation in College Libraries.

    Science.gov (United States)

    Werking, Richard Hume

    1991-01-01

    Reports the results of a survey of the "Bowdoin List" group of liberal arts colleges. The survey obtained information about (1) automation modules in place and when they had been installed; (2) financing of automation and its impacts on the library budgets; and (3) library director's views on library automation and the nature of the…

  2. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-02-13

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  3. Modeling the Coordinated Operation between Bus Rapid Transit and Bus

    Directory of Open Access Journals (Sweden)

    Jiaqing Wu

    2015-01-01

Full Text Available The coordination between bus rapid transit (BRT) and feeder bus service is helpful in improving the operational efficiency and service level of an urban public transport system. Therefore, this paper develops a coordinated operation model of BRT and bus. The total costs are formulated and optimized by a genetic algorithm. Moreover, skip-stop BRT operation is considered when building the coordinated operation model. A case of the existing bus network in Beijing is studied, the proposed coordinated operation model of BRT and bus is applied, and the optimized headways and costs are obtained. The results show that the coordinated operation model can effectively decrease the total costs of the transit system and the transfer time of passengers. The results also suggest that coordination between the skip-stop BRT and bus during peak hours is more effective than non-coordinated operation.
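The abstract above does not give the cost function or the genetic-algorithm settings, so as a hedged illustration only, the sketch below optimizes a single bus headway against an invented cost model (per-vehicle operator cost plus passenger waiting cost; all coefficients are assumptions, not the paper's values):

```python
import random

random.seed(42)

# Hypothetical cost model (illustrative coefficients, not from the paper):
# operator cost per dispatched vehicle plus passenger waiting cost, where
# average waiting time is half the headway.
DEMAND = 1200          # passengers/hour (assumed)
BUS_COST = 50.0        # cost per dispatched vehicle (assumed)
WAIT_COST = 0.2        # cost per passenger-minute of waiting (assumed)

def total_cost(headway_min):
    buses_per_hour = 60.0 / headway_min
    operator = buses_per_hour * BUS_COST
    waiting = DEMAND * (headway_min / 2.0) * WAIT_COST
    return operator + waiting

def genetic_search(pop_size=30, generations=60, lo=2.0, hi=20.0):
    # Each individual is one candidate headway in minutes.
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0                # arithmetic crossover
            child += random.gauss(0.0, 0.5)      # Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = parents + children
    return min(pop, key=total_cost)

best = genetic_search()
```

With these coefficients the analytic optimum is a 5-minute headway (operator and waiting costs balance), which the search converges toward; the real model in the paper jointly optimizes BRT and feeder headways with transfer terms.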

  4. Advances in Automation and Robotics

    CERN Document Server

    International conference on Automation and Robotics ICAR2011

    2012-01-01

    The international conference on Automation and Robotics-ICAR2011 is held during December 12-13, 2011 in Dubai, UAE. The proceedings of ICAR2011 have been published by Springer Lecture Notes in Electrical Engineering, which include 163 excellent papers selected from more than 400 submitted papers.   The conference is intended to bring together the researchers and engineers/technologists working in different aspects of intelligent control systems and optimization, robotics and automation, signal processing, sensors, systems modeling and control, industrial engineering, production and management.   This part of proceedings includes 81 papers contributed by many researchers in relevant topic areas covered at ICAR2011 from various countries such as France, Japan, USA, Korea and China etc.     Many papers introduced their advanced research work recently; some of them gave a new solution to problems in the field, with powerful evidence and detail demonstration. Others stated the application of their designed and...

  5. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    Science.gov (United States)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
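The dispatchable-form conversion mentioned above starts from a Simple Temporal Network. As a rough sketch (the events and bounds below are invented for illustration), an all-pairs shortest-path pass computes the minimal network, the usual first step toward dispatchability, and detects inconsistent constraints via negative diagonal entries:

```python
# Simple Temporal Network (STN) sketch. Each constraint is an edge
# u -> v with weight w meaning t_v - t_u <= w. Floyd-Warshall yields
# the all-pairs-minimal network; a negative value on the diagonal
# means the constraint set is inconsistent.
INF = float("inf")

def minimal_stn(n, edges):
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    consistent = all(d[i][i] >= 0 for i in range(n))
    return d, consistent

# Illustrative events: procedure start (0), activity end (1), deadline (2).
edges = [(0, 1, 10), (1, 0, -5),   # activity duration: 5 <= t1 - t0 <= 10
         (0, 2, 12), (2, 0, 0)]    # deadline: 0 <= t2 - t0 <= 12
d, ok = minimal_stn(3, edges)
```

In the minimal network, `d[u][v]` is the tightest implied upper bound on `t_v - t_u` (here, for example, the activity must end at least 5 and at most 12 minutes after start), which is what lets an executive dispatch events locally without full propagation.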

  6. Embedding Temporal Constraints for Coordinated Execution in Habitat Automation

    Data.gov (United States)

    National Aeronautics and Space Administration — Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be...

  7. Automated Support for Rapid Coordination of Joint UUV Operation

    Science.gov (United States)

    2015-03-01

    integration into the fleet as a valuable asset. The image in Figure 1 depicts the Navy’s vision for UUV integration into the fleet. Figure 1...limitation on providing the warfare commander with near real-time data from AUVs on station, Marr [6] proposed and simulated rendezvous capabilities...previous example, Marr realized underwater sound propagation severely limited the acceptable distances for transferring data between the two AUVs

  8. Design Optimization of Internal Flow Devices

    DEFF Research Database (Denmark)

    Madsen, Jens Ingemann

The power of computational fluid dynamics is boosted through the use of automated design optimization methodologies. The thesis considers both derivative-based search optimization and the use of response surface methodologies.

  9. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  10. Automation of industrial bioprocesses.

    Science.gov (United States)

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems.

  11. The problem of organization of a coastal coordinating computer center

    Science.gov (United States)

    Dyubkin, I. A.; Lodkin, I. I.

    1974-01-01

    The fundamental principles of the operation of a coastal coordinating and computing center under conditions of automation are presented. Special attention is devoted to the work of Coastal Computer Center of the Arctic and Antarctic Scientific Research Institute. This center generalizes from data collected in expeditions and also from observations made at polar stations.

  12. Autonomous Vehicle Coordination with Wireless Sensor and Actuator Networks

    NARCIS (Netherlands)

    Marin Perianu, Mihai; Bosch, S.; Marin Perianu, Raluca; Scholten, Johan; Havinga, Paul J.M.

    2010-01-01

    A coordinated team of mobile wireless sensor and actuator nodes can bring numerous benefits for various applications in the field of cooperative surveillance, mapping unknown areas, disaster management, automated highway and space exploration. This article explores the idea of mobile nodes using

  13. 49 CFR 1580.201 - Rail security coordinator.

    Science.gov (United States)

    2010-10-01

    ..., including heavy rail transit, light rail transit, automated guideway, cable car, inclined plane, funicular, and monorail systems. (4) Each operator of private cars, including business/office cars and circus..., Historic and Excursion Operators, and Private Cars § 1580.201 Rail security coordinator. (a) Applicability...

  14. Polymeric coordination compounds

    Indian Academy of Sciences (India)

    Administrator

Ce(dipic)3Sr(dipicH2)(OH2)3·5H2O (4) (dipicH2 – dipicolinic acid) exhibits a 1-D polymeric chain structure, built up of alternating nine-coordinate Ce and eight-coordinate Sr polyhedra. The analogous Ce–Ba compound (5) exhibits a polymeric chain built up of nine-coordinate Ba units only, arranged in a hexagonal lattice.

  15. A sensor-based automation system for handling nuclear materials

    International Nuclear Information System (INIS)

    Drotning, W.; Kimberly, H.; Wapman, W.; Darras, D.

    1997-01-01

An automated system is being developed for handling large payloads of radioactive nuclear materials in an analytical laboratory. The automation system performs unpacking and repacking of payloads from shipping and storage containers, and delivery of the payloads to the stations in the laboratory. The system uses machine vision and force/torque sensing to provide sensor-based control of the automation system in order to enhance system safety, flexibility, and robustness, and achieve easy remote operation. The automation system also controls the operation of the laboratory measurement systems and their coordination with the robotic system. Particular attention has been given to system design features and analytical methods that provide an enhanced level of operational safety. Independent mechanical gripper interlock and tool release mechanisms were designed to prevent payload mishandling. An extensive Failure Modes and Effects Analysis of the automation system was developed as a safety design analysis tool.

  16. AUTOMATION OF CONVEYOR BELT TRANSPORT

    Directory of Open Access Journals (Sweden)

    Nenad Marinović

    1990-12-01

Full Text Available Belt conveyor transport, although one of the most economical mining transport systems, introduces many problems in maintaining continuity of operation. Every stop causes economic losses. Optimal operation requires correct belt tension, correct belt position and velocity, and faultless rolls, which together are the input conditions for automation. Detection and position selection of faults are essential for safety, to eliminate fire hazard, and for efficient maintenance. Detection and location of idler roll faults remain an open problem, so far not solved successfully (the paper is published in Croatian).

  17. Cell-Detection Technique for Automated Patch Clamping

    Science.gov (United States)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: A standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed.
A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image
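As a hedged sketch of how cell-feature metrics of the kind listed above might be computed and turned into a good/bad decision (the bounding-box approximations, thresholds, and weighting here are assumptions for illustration, not the article's algorithm):

```python
def cell_metrics(pixels):
    """Compute simple shape metrics for one segmented cell.

    pixels: set of (x, y) coordinates belonging to the cell.
    Major/minor axes are approximated by the bounding box (an
    assumption; the article fits actual axis lengths).
    """
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    w = max(xs) - min(xs) + 1
    h = max(ys) - min(ys) + 1
    major, minor = max(w, h), min(w, h)
    area = len(pixels)
    elongation = major / minor
    # Fraction of the bounding box the cell fills; near pi/4 for a disk.
    fill = area / (w * h)
    cx = sum(xs) / area
    cy = sum(ys) / area
    return {"area": area, "elongation": elongation,
            "fill": fill, "centroid": (cx, cy)}

def is_good_candidate(m, min_area=12, max_elongation=1.5):
    # Assumed rule: large enough and close to round.
    return m["area"] >= min_area and m["elongation"] <= max_elongation

# Example: a 5x5 square blob (roughly round at this resolution).
blob = {(x, y) for x in range(5) for y in range(5)}
m = cell_metrics(blob)
```

A real pipeline would run this per connected component of a segmented CCD frame and also score smoothness, orientation, and field-of-view inclusion before ranking candidates.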

  18. Coordination under the Shadow of Career Concerns

    DEFF Research Database (Denmark)

    Koch, Alexander; Morgenstern, Albrecht

    concerns arise that can both be ‘good’ (enhancing incentives for effort in developing ideas) and ‘bad’ (preventing voluntary coordination). Depending on the strength of career concerns, either group-based incentives or team production are optimal. This finding provides a possible link between the increased...... use of teams and changes in labor market returns to skills....

  19. Coordination failure caused by sunspots

    DEFF Research Database (Denmark)

    Beugnot, Julie; Gürgüç, Zeynep; Øvlisen, Frederik Roose

    2012-01-01

In a coordination game with Pareto-ranked equilibria, we study whether a sunspot can lead to either coordination on an inferior equilibrium (mis-coordination) or to out-of-equilibrium behavior (dis-coordination). While much of the literature searches for mechanisms to attain coordination on the e...

  20. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as improved passenger comfort, since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human errors or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system
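To make the Bayesian Belief Network mechanics concrete, here is a tiny two-parent fragment in the spirit of the model described above: an automation-related anomaly node with two causal factors, marginalized by enumeration. All probabilities are invented for illustration; the actual FLAP model is built in Hugin with many more nodes and states.

```python
# Illustrative prior probabilities of the causal factors (invented).
P_skill_degraded = 0.2
P_mode_confusion = 0.1

# Conditional probability table: P(anomaly | skill_degraded, mode_confusion).
# All values are made up for this sketch.
CPT = {(True, True): 0.6, (True, False): 0.2,
       (False, True): 0.3, (False, False): 0.01}

def p_anomaly():
    """Marginal P(anomaly), summing over both parent configurations."""
    total = 0.0
    for s in (True, False):
        ps = P_skill_degraded if s else 1 - P_skill_degraded
        for m in (True, False):
            pm = P_mode_confusion if m else 1 - P_mode_confusion
            total += ps * pm * CPT[(s, m)]
    return total
```

Enumeration scales exponentially in the number of parents, which is why full models of this kind rely on dedicated inference engines such as Hugin.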

  1. Coordinate measuring machines

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with three exercises of 2 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measuring: 1) Measuring and verification of tolerances on coordinate measuring machines, 2) Traceability...

  2. Automated Surveillance of Fruit Flies

    Science.gov (United States)

    Potamitis, Ilyas; Rigakis, Iraklis; Tatlas, Nicolaos-Alexandros

    2017-01-01

    Insects of the Diptera order of the Tephritidae family cause costly, annual crop losses worldwide. Monitoring traps are important components of integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap in order to detect, time-stamp, GPS tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualization of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at product level of maturity as it has solved many pending issues presented in a previously reported study. PMID:28075346

  3. Automated Surveillance of Fruit Flies

    Directory of Open Access Journals (Sweden)

    Ilyas Potamitis

    2017-01-01

    Full Text Available Insects of the Diptera order of the Tephritidae family cause costly, annual crop losses worldwide. Monitoring traps are important components of integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap in order to detect, time-stamp, GPS tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualization of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at product level of maturity as it has solved many pending issues presented in a previously reported study.

  4. Automated Surveillance of Fruit Flies.

    Science.gov (United States)

    Potamitis, Ilyas; Rigakis, Iraklis; Tatlas, Nicolaos-Alexandros

    2017-01-08

    Insects of the Diptera order of the Tephritidae family cause costly, annual crop losses worldwide. Monitoring traps are important components of integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap in order to detect, time-stamp, GPS tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualization of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at product level of maturity as it has solved many pending issues presented in a previously reported study.
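The wingbeat analysis described in the records above can be sketched as dominant-frequency estimation on the sensor signal followed by a band lookup. The sampling rate and the frequency band below are assumptions for illustration, not values from the paper:

```python
import math, cmath

RATE = 2000  # samples/s (assumed sensor sampling rate)

def dominant_frequency(signal):
    """Return the strongest frequency via a naive DFT (fine for short windows)."""
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * RATE / n

def classify(freq):
    # Assumed band: many fruit-fly wingbeats fall in the low hundreds of Hz.
    if 150 <= freq <= 250:
        return "tephritid-like"
    return "other"

# 200 Hz synthetic wingbeat tone over a 0.1 s window.
sig = [math.sin(2 * math.pi * 200 * t / RATE) for t in range(200)]
f = dominant_frequency(sig)
```

A deployed trap would use an FFT plus harmonics and spectral shape rather than a single peak, since wingbeat bands of different species overlap.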

  5. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following are key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  6. Lighting Automation - Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Tori A. (Principal Investigator); Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting and future spacecraft can use a standard lighting protocol to produce an effective, optimized and

  7. Lighting Automation Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Toni A.; Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting and future spacecraft can use a standard lighting protocol to produce an effective, optimized and
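
As a concrete illustration of the protocol this record builds on: a DMX512 packet carries a start code (0x00 for standard dimmer data) followed by up to 512 one-byte channel levels. The sketch below assembles those slot bytes in Python; it is a generic illustration of the frame layout, not code from the project.

```python
def dmx_frame(levels):
    """Build the slot bytes of a DMX512 packet: a start code of 0x00
    followed by up to 512 channel levels (0-255 each)."""
    if len(levels) > 512:
        raise ValueError("DMX512 carries at most 512 channels per frame")
    if any(not 0 <= v <= 255 for v in levels):
        raise ValueError("channel levels must fit in one byte")
    return bytes([0x00]) + bytes(levels)

# Dim channel 1 to 50% and switch channel 2 fully on:
frame = dmx_frame([128, 255])  # -> b'\x00\x80\xff'
```

On real hardware the slot bytes are preceded by a break and mark-after-break on an RS-485 line, which a DMX interface generates.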

  8. Unified Selenocentric Reference Coordinates Net in the Dynamic System

    Science.gov (United States)

    Nefedyev, Yuri; Petrova, Natalia; Varaksina, Natalia

    This report addresses the construction of a selenocentric inertial reference net. The aim is to build a summary reference net by expanding the KSC-1162 selenodetic system with 12 space-based and ground-based selenodetic catalogues, and a prospective analysis of this net was performed. The resulting selenocentric reference catalogue covers the full visible side and part of the far side of the Moon. Modern space technologies require accurate coordinate-temporal support, including reference frame realization, orientation of inertial and dynamic systems, and study of the dynamics and geometry of celestial bodies; this applies to dynamic and geometric selenocentric lunar parameters. The catalogue based on the "Apollo" missions and the reference nets of the western lunar hemisphere made by the "Zond 5" and "Zond 8" missions cover only a small part of the lunar surface. Three ALSEP stations were used to transform the "Apollo" topographic coordinates. The mean-square errors of the transformation are less than 80 meters and the measurement errors are about 60 meters; accordingly, position inaccuracies near and between the ALSEP stations are less than 150 meters. Away from the ALSEP sites the expected error grows to more than 300 m, and this applies to the major part of the lunar surface. In solving the problem of high-precision densification and expansion of the fundamental selenocentric net KSC-1162 on the visible and far sides of the Moon, the following new results were obtained: a) the accuracy of the basic net contained in the ULCN was analyzed and investigated; b) objects common to the coordinate systems under study were identified; c) the mathematical content of the TSC package was extended; d) TSC was developed as an expert system for universal transformation of planetary coordinates; e) the applicability of the ARM approach to the TC problem on common objects, which allows optimal estimation of the parameters and model structure of the TC, was confirmed; f) the

  9. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. Two types of instrumentation were described: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  10. Automated stopcock actuator

    OpenAIRE

    Vandehey, N. T.; O'Neil, J. P.

    2015-01-01

    Introduction: We have developed a low-cost stopcock valve actuator for radiochemistry automation built using a stepper motor and an Arduino, an open-source single-board microcontroller. The controller hardware can be programmed to run by serial communication or via two 5–24 V digital lines for simple integration into any automation control system. This valve actuator allows for automated use of a single, disposable stopcock, providing a number of advantages over stopcock manifold systems ...

  11. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...... with the automatic verification of three protocols: a secure exam protocol, Google’s Certificate Transparency, and an improved version of Bingo Voting. We find through automated verification that all three protocols satisfy verifiability while only the first two protocols meet accountability....

  12. Management Planning for Workplace Automation.

    Science.gov (United States)

    McDole, Thomas L.

    Several factors must be considered when implementing office automation. Included among these are whether or not to automate at all, the effects of automation on employees, requirements imposed by automation on the physical environment, effects of automation on the total organization, and effects on clientele. The reasons behind the success or…

  13. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Automated cloning methods.; TOPICAL

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 °C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  15. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  16. Coordinating Interactions: The Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    on a much more technical level. The Event Coordination Notation (ECNO) allows modelling the behaviour of an application on a high level of abstraction that is closer to the application’s domain than to the software realizing it. Still, these models contain all necessary details for actually executing...... the models and for generating code from them. In order to be able to model the behaviour of a domain, the ECNO makes the events in which the different elements of the domain could engage explicit. The local behaviour of an element defines at which time an element can engage or participate in an event....... The global behaviour of the application results from different elements jointly engaging in such events, which is called an interaction. Which events are supposed to be jointly executed and which elements need to join in is defined by so-called coordination diagrams of the ECNO. Together, the models...

  17. The role of automation and artificial intelligence

    Science.gov (United States)

    Schappell, R. T.

    1983-07-01

    Consideration is given to emerging technologies that are not currently in common use, yet will be mature enough for implementation in a space station. Artificial intelligence (AI) will permit more autonomous operation and improve the man-machine interfaces. Technology goals include the development of expert systems, a natural language query system, automated planning systems, and AI image understanding systems. Intelligent robots and teleoperators will be needed, together with improved sensory systems for the robotics, housekeeping, vehicle control, and spacecraft housekeeping systems. Finally, NASA is developing the ROBSIM computer program to evaluate level of automation, perform parametric studies and error analyses, optimize trajectories and control systems, and assess AI technology.

  18. Dictionary descent in optimization

    OpenAIRE

    Temlyakov, Vladimir

    2015-01-01

    The problem of convex optimization is studied. Usually in convex optimization the minimization is over a d-dimensional domain. Very often the convergence rate of an optimization algorithm depends on the dimension d. The algorithms studied in this paper utilize dictionaries instead of a canonical basis used in the coordinate descent algorithms. We show how this approach allows us to reduce dimensionality of the problem. Also, we investigate which properties of a dictionary are beneficial for t...
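
For contrast with the dictionary-based algorithms this abstract studies, here is a minimal sketch of classical coordinate descent over the canonical basis, minimizing the convex quadratic f(x) = 0.5 x^T A x - b^T x by exact minimization along one coordinate at a time (the example matrix and vector are made up):

```python
def coordinate_descent(A, b, iters=100):
    """Exact cyclic coordinate descent for f(x) = 0.5 x^T A x - b^T x
    with A symmetric positive definite: each step minimizes f along one
    canonical basis direction, i.e. x_i = (b_i - sum_{j!=i} A_ij x_j) / A_ii."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# The minimizer solves A x = b; here x -> [1.0, 2.0]:
A = [[3.0, 1.0], [1.0, 2.0]]
b = [5.0, 5.0]
x = coordinate_descent(A, b)
```

The dictionary approach the abstract describes replaces the fixed canonical directions used here with search directions drawn from a (possibly redundant) dictionary.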

  19. Distributed coordination of energy storage with distributed generators

    NARCIS (Netherlands)

    Yang, Tao; Wu, Di; Stoorvogel, Antonie Arij; Stoustrup, Jakob

    2016-01-01

    With a growing emphasis on energy efficiency and system flexibility, a great effort has been made recently in developing distributed energy resources (DER), including distributed generators and energy storage systems. This paper first formulates an optimal DER coordination problem considering

  20. Improving coordination between regional power markets

    Science.gov (United States)

    Giberson, Michael A.

    Restructuring of the electric power industry---both in the United States and elsewhere---has fostered the development of regional wholesale power markets closely integrated with power grid operations. The natural focus of the system optimizations used in these markets has been on maximizing the value of in-system resources. Where cross-border flows are possible, accommodations are made, but relative to the optimization such adjustments are ad hoc. Cross-border flows are growing, however, and present an increasing challenge to transmission system operators. Industry efforts at interregional coordination have focused on practical barriers to trade between regions; academic research has addressed some of the engineering challenges of coordinating separate regional grid optimizations. The existing research has for the most part neglected a number of issues traditionally of interest to economists. The present research uses the methods of experimental economics to examine the consequences of a market design to promote more efficient use of interconnections.
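
A toy numeric sketch of the coordination problem described above: two regional merit-order markets joined by a limited tie line, where the cross-border transfer is chosen to minimize total cost. All offers, demands, and limits are invented for illustration.

```python
def stack_cost(stack, demand):
    """Cost of serving demand (MW) from a merit-order stack of
    (capacity_mw, price) offers, cheapest offers dispatched first."""
    cost, remaining = 0.0, demand
    for cap, price in sorted(stack, key=lambda offer: offer[1]):
        take = min(cap, remaining)
        cost += take * price
        remaining -= take
        if remaining <= 0:
            return cost
    raise ValueError("insufficient capacity")

def best_transfer(stack_a, stack_b, dem_a, dem_b, tie_limit):
    """Scan export levels from region A to region B (1 MW steps) and
    return the transfer minimizing total cost -- a toy stand-in for the
    interregional coordination the dissertation studies."""
    def total(t):
        return stack_cost(stack_a, dem_a + t) + stack_cost(stack_b, dem_b - t)
    return min(range(0, tie_limit + 1), key=total)

# Region A is cheap, region B expensive; the 50 MW tie is used fully:
cheap = [(200, 20.0)]
dear = [(200, 60.0)]
flow = best_transfer(cheap, dear, 100, 100, 50)  # -> 50
```

Independent regional optimizations would each serve only their own demand; the coordinated solution exports up to the tie-line limit from the cheap region.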

  1. Optimization of Human NK Cell Manufacturing: Fully Automated Separation, Improved Ex Vivo Expansion Using IL-21 with Autologous Feeder Cells, and Generation of Anti-CD123-CAR-Expressing Effector Cells.

    Science.gov (United States)

    Klöß, Stephan; Oberschmidt, Olaf; Morgan, Michael; Dahlke, Julia; Arseniev, Lubomir; Huppert, Volker; Granzin, Markus; Gardlowski, Tanja; Matthies, Nadine; Soltenborn, Stephanie; Schambach, Axel; Koehl, Ulrike

    2017-10-01

    depletion and CD56 enrichment steps. Manually performed experiments to test different culture media demonstrated significantly higher NK cell expansion rates and an approximately equal distribution of CD56 dim CD16 pos and CD56 bright CD16 dim&neg NK subsets on day 14 with cells cultivated in NK MACS ® media. Moreover, effector cell expansion in manually performed experiments with NK MACS ® containing IL-2 and irradiated autologous FCs and IL-21, both added at the initiation of the culture, induced an 85-fold NK cell expansion. Compared to freshly isolated NK cells, expanded NK cells expressed significantly higher levels of NKp30, NKp44, NKG2D, TRAIL, FasL, CD69, and CD137, and showed comparable cell viabilities and killing/degranulation activities against tumor and leukemic cell lines in vitro. NK cells used for CAR transduction showed the highest anti-CD123 CAR expression on day 3 after gene modification. These anti-CD123 CAR-engineered NK cells demonstrated improved cytotoxicity against the CD123 pos AML cell line KG1a and primary AML blasts. In addition, CAR NK cells showed higher degranulation and enhanced secretion of tumor necrosis factor alpha, interferon gamma, and granzyme A and B. In fluorescence imaging, specific interactions that initiated apoptotic processes in the AML target cells were detected between CAR NK cells and KG1a. After the fully automated NK cell separation process on Prodigy, a new NK cell expansion protocol was generated that resulted in high numbers of NK cells with potent antitumor activity, which could be modified efficiently by novel third-generation, alpha-retroviral SIN vector constructs. Next steps are the integration of the manual expansion procedure in the fully integrated platform for a standardized GMP-compliant overall process in this closed system that also may include gene modification of NK cells to optimize target-specific antitumor activity.

  2. Decentralized Control Using Global Optimization (DCGO) (Preprint)

    National Research Council Canada - National Science Library

    Flint, Matthew; Khovanova, Tanya; Curry, Michael

    2007-01-01

    The coordination of a team of distributed air vehicles requires a complex optimization, balancing limited communication bandwidths, non-instantaneous planning times and network delays, while at the...

  3. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  4. Automation benefits BWR customers

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    A description is given of the increasing use of automation at General Electric's Wilmington fuel fabrication plant. Computerised systems and automated equipment perform a large number of inspections, inventory and process operations, and new advanced systems are being continuously introduced to reduce operator errors and expand product reliability margins. (U.K.)

  5. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they do not always succeed in this endeavor. The reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.
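
A minimal example of the kind of automated functional test the article discusses, using Python's standard unittest module; apply_discount is a hypothetical function under test, not from the article.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class ApplyDiscountTest(unittest.TestCase):
    """An automated functional test encodes what a tester would check
    by hand: the expected result and the rejection of invalid input."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)
```

Run with `python -m unittest` so the suite executes on every build, which is the payoff of automating such checks.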

  6. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  7. Identity Management Processes Automation

    Directory of Open Access Journals (Sweden)

    A. Y. Lavrukhin

    2010-03-01

    Full Text Available Implementation of identity management systems consists of two main parts, consulting and automation. The consulting part includes development of a role model and identity management processes description. The automation part is based on the results of consulting part. This article describes the most important aspects of IdM implementation.

  8. Work and Programmable Automation.

    Science.gov (United States)

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  9. Library Automation in Pakistan.

    Science.gov (United States)

    Haider, Syed Jalaluddin

    1998-01-01

    Examines the state of library automation in Pakistan. Discusses early developments; financial support by the Netherlands Library Development Project (Pakistan); lack of automated systems in college/university and public libraries; usage by specialist libraries; efforts by private-sector libraries and the National Library in Pakistan; commonly used…

  10. Library Automation Style Guide.

    Science.gov (United States)

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  11. Planning for Office Automation.

    Science.gov (United States)

    Sherron, Gene T.

    1982-01-01

    The steps taken toward office automation by the University of Maryland are described. Office automation is defined and some types of word processing systems are described. Policies developed in the writing of a campus plan are listed, followed by a section on procedures adopted to implement the plan. (Author/MLW)

  12. The Automated Office.

    Science.gov (United States)

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  13. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  14. Developmental coordination disorder

    Science.gov (United States)

    Developmental coordination disorder can lead to: learning problems; low self-esteem resulting from poor ability at sports and teasing by other children; repeated injuries; and weight gain as a result of not wanting to participate ...

  15. Supercritical Airfoil Coordinates

    Data.gov (United States)

    National Aeronautics and Space Administration — Rectangular Supercritical Wing (Ricketts) - design and measured locations are provided in an Excel file RSW_airfoil_coordinates_ricketts.xls . One sheet is with Non...

  16. Regional transit coordination guidebook.

    Science.gov (United States)

    2009-01-01

    Constant growth in rural areas and extensive suburban development have contributed to increasingly more people needing seamless and adequate public transportation into and from nearby cities. Coordinating existing services or determining the need for...

  17. Environmental Compliance Issue Coordination

    Science.gov (United States)

    An order to establish the Department of Energy (DOE) requirements for coordination of significant environmental compliance issues to ensure timely development and consistent application of Departmental environmental policy and guidance

  18. Haskell_#: Coordinating Functional Processes

    OpenAIRE

    Junior, Francisco Heron de Carvalho; Lins, Rafael Dueire

    2012-01-01

    This paper presents Haskell#, a coordination language targeted at the efficient implementation of parallel scientific applications on loosely coupled parallel architectures, using the functional language Haskell. Examples of applications, their implementation details and performance figures are presented.

  19. Understanding social motor coordination.

    Science.gov (United States)

    Schmidt, R C; Fitzpatrick, Paula; Caron, Robert; Mergeche, Joanna

    2011-10-01

    Recently there has been much interest in social coordination of motor movements, or as it is referred to by some researchers, joint action. This paper reviews the cognitive perspective's common coding/mirror neuron theory of joint action, describes some of its limitations and then presents the behavioral dynamics perspective as an alternative way of understanding social motor coordination. In particular, behavioral dynamics' ability to explain the temporal coordination of interacting individuals is detailed. Two experiments are then described that demonstrate how dynamical processes of synchronization are apparent in the coordination underlying everyday joint actions such as martial art exercises, hand-clapping games, and conversations. The import of this evidence is that emergent dynamic patterns such as synchronization are the behavioral order that any neural substrate supporting joint action (e.g., mirror systems) would have to sustain. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes the complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  1. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  2. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  3. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  4. Coordinated garbage collection for raid array of solid state disks

    Science.gov (United States)

    Dillow, David A; Ki, Youngjae; Oral, Hakki S; Shipman, Galen M; Wang, Feiyi

    2014-04-29

    An optimized redundant array of solid state devices may include an array of one or more optimized solid-state devices and a controller coupled to the solid-state devices for managing the solid-state devices. The controller may be configured to globally coordinate the garbage collection activities of each of said optimized solid-state devices, for instance, to minimize the degraded performance time and increase the optimal performance time of the entire array of devices.
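
The patent's idea of globally coordinated garbage collection can be sketched as a controller that admits only a bounded number of devices into GC at once, so most of the array keeps serving I/O at full speed. The admission policy and thresholds below are assumptions for illustration, not the patent's claims.

```python
class GcCoordinator:
    """Toy global garbage-collection coordinator for an SSD array:
    at most `max_concurrent` devices may run GC at any moment."""

    def __init__(self, n_devices, max_concurrent=1):
        self.free_ratio = {d: 1.0 for d in range(n_devices)}
        self.in_gc = set()
        self.max_concurrent = max_concurrent

    def report_free_ratio(self, device, ratio):
        """Devices periodically report their fraction of free blocks."""
        self.free_ratio[device] = ratio

    def schedule(self, threshold=0.2):
        """Admit the neediest devices (least free space) into GC
        without exceeding the concurrency budget; return devices in GC."""
        needy = sorted(
            (d for d, r in self.free_ratio.items()
             if r < threshold and d not in self.in_gc),
            key=lambda d: self.free_ratio[d])
        while needy and len(self.in_gc) < self.max_concurrent:
            self.in_gc.add(needy.pop(0))
        return set(self.in_gc)

    def finish(self, device):
        self.in_gc.discard(device)

coord = GcCoordinator(n_devices=4)
coord.report_free_ratio(0, 0.10)
coord.report_free_ratio(2, 0.05)
admitted = coord.schedule()  # only device 2, the fullest, enters GC
```

Device 0 waits its turn even though it is below the threshold, which is exactly the staggering that keeps array-level performance degradation bounded.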

  5. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via Internet. The system was used for 14C, 10Be, 26Al and 129I measurements.
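
A noise-robust multi-dimensional search of the kind this abstract mentions can be sketched as a pattern search that averages repeated measurements before comparing settings, so a single noisy reading cannot steer the tuning. The "transmission" model and all parameters below are invented for illustration; the VERA algorithm itself is not published in this abstract.

```python
import random

def robust_maximize(f, x0, step=1.0, shrink=0.5, rounds=20, repeats=5):
    """Noise-robust pattern search: average `repeats` measurements of f,
    probe +/- step along each parameter axis, move to any improving
    probe, and shrink the step when no probe improves."""
    def avg(x):
        return sum(f(x) for _ in range(repeats)) / repeats
    x = list(x0)
    best = avg(x)
    for _ in range(rounds):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = x[:]
                cand[i] += d
                val = avg(cand)
                if val > best:
                    x, best, improved = cand, val, True
        if not improved:
            step *= shrink
    return x

# Noisy "beam transmission" peaked at lens = 3, steerer = -2 (made up):
def transmission(params):
    return (-(params[0] - 3.0) ** 2 - (params[1] + 2.0) ** 2
            + random.gauss(0.0, 0.05))

settings = robust_maximize(transmission, [0.0, 0.0])
```

Averaging trades measurement time for robustness, the essential compromise when tuning real beamline elements.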

  6. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
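
Uneven illumination in fluorescence microscopy is classically accounted for by flat-field correction using a dark frame and a flat (blank) frame. The sketch below shows the standard formula on images stored as lists of rows; it is a generic method, not necessarily the cited system's own algorithm.

```python
def flat_field_correct(raw, dark, flat):
    """Classic flat-field correction for uneven illumination:
    corrected = (raw - dark) / (flat - dark), rescaled by the mean
    flat response so intensities keep their original magnitude."""
    gains = [[f - d for f, d in zip(fr, dr)] for fr, dr in zip(flat, dark)]
    mean_gain = sum(sum(row) for row in gains) / sum(len(row) for row in gains)
    return [[(p - d) / g * mean_gain
             for p, d, g in zip(pr, dr, gr)]
            for pr, dr, gr in zip(raw, dark, gains)]

# A vignetted field (dimmer at the right edge) is evened out;
# both pixels represent the same fluorophore density -> 74.5:
corrected = flat_field_correct([[101.0, 52.0]],
                               [[2.0, 2.0]],
                               [[200.0, 102.0]])
```

In practice the dark and flat frames are averaged over many exposures to suppress their own noise before being applied.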

  7. AIRCRAFT POWER SUPPLY SYSTEM DESIGN PROCESS AS AN AUTOMATION OBJECT

    Directory of Open Access Journals (Sweden)

    Boris V. Zhmurov

    2018-01-01

    aircraft and take into account all the requirements of the customer and the regulatory and technical documentation is its automation. Automating the design of aircraft EPS as an optimization task involves formalizing the object of optimization, as well as choosing the efficiency criterion and the control actions. The object of optimization in this case is the design process of the EPS, whose formalization also includes formalizing the design object itself – the aircraft power supply system.

  8. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of the sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, and is tedious and time-consuming. This paper proposes an automation model that plans optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections at risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information on the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
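
The paper's use of a genetic algorithm to choose rehabilitation methods can be sketched as follows: one method per pipe section, minimizing total cost while requiring each method to be strong enough for the section's condition grade. The method catalogue, costs, and grade limits are invented for illustration, not taken from the paper.

```python
import random

def evolve(grades, methods, generations=60, pop_size=30, seed=1):
    """Toy genetic algorithm: a chromosome assigns one rehabilitation
    method to each pipe section.  `methods` maps a method name to
    (cost, max_grade_handled); infeasible assignments are penalized."""
    rng = random.Random(seed)
    names = list(methods)

    def fitness(plan):
        cost = 0.0
        for grade, m in zip(grades, plan):
            method_cost, max_grade = methods[m]
            cost += method_cost
            if grade > max_grade:          # method too weak: heavy penalty
                cost += 1e6
        return -cost

    pop = [[rng.choice(names) for _ in grades] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]     # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(grades))
            child = a[:cut] + b[cut:]      # one-point crossover
            if rng.random() < 0.2:         # mutation
                child[rng.randrange(len(grades))] = rng.choice(names)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

methods = {"spot repair": (10.0, 2), "lining": (40.0, 4), "replace": (90.0, 5)}
plan = evolve(grades=[1, 3, 5, 2], methods=methods)
```

The penalty term turns the feasibility constraint into part of the objective, the usual trick for constrained GA formulations.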

  9. New hardware and workflows for semi-automated correlative cryo-fluorescence and cryo-electron microscopy/tomography.

    Science.gov (United States)

    Schorb, Martin; Gaechter, Leander; Avinoam, Ori; Sieckmann, Frank; Clarke, Mairi; Bebeacua, Cecilia; Bykov, Yury S; Sonnen, Andreas F-P; Lihl, Reinhard; Briggs, John A G

    2017-02-01

    Correlative light and electron microscopy allows features of interest defined by fluorescence signals to be located in an electron micrograph of the same sample. Rare dynamic events or specific objects can be identified, targeted and imaged by electron microscopy or tomography. To combine it with structural studies using cryo-electron microscopy or tomography, fluorescence microscopy must be performed while maintaining the specimen vitrified at liquid-nitrogen temperatures and in a dry environment during imaging and transfer. Here we present instrumentation, software and an experimental workflow that improves the ease of use, throughput and performance of correlated cryo-fluorescence and cryo-electron microscopy. The new cryo-stage incorporates a specially modified high-numerical aperture objective lens and provides a stable and clean imaging environment. It is combined with a transfer shuttle for contamination-free loading of the specimen. Optimized microscope control software allows automated acquisition of the entire specimen area by cryo-fluorescence microscopy. The software also facilitates direct transfer of the fluorescence image and associated coordinates to the cryo-electron microscope for subsequent fluorescence-guided automated imaging. Here we describe these technological developments and present a detailed workflow, which we applied for automated cryo-electron microscopy and tomography of various specimens. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Protection coordination of the Kennedy Space Center electric distribution network

    Science.gov (United States)

    1976-01-01

    A computer technique is described for visualizing the coordination and protection of any existing system of devices and settings by plotting the tripping characteristics of the involved devices on a common basis. The program determines the optimum settings of a given set of protective devices and configuration, in the sense of the best expected coordinated operation of these devices. Subroutines are given for simulating time-versus-current characteristics of the different relays, circuit breakers, and fuses in the system; coordination index computation; protection checks; plotting; and coordination optimization.
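
    As a rough illustration of what such a coordination check computes, the sketch below evaluates the IEC standard-inverse time-current characteristic for a hypothetical primary/backup relay pair and checks their operating-time margin at a fault current. The relay settings and the 0.3 s margin are invented for illustration; the report's actual subroutines and device models are not reproduced here.

```python
def iec_si_time(I_fault, I_pickup, tms):
    """IEC standard-inverse time-current characteristic."""
    return tms * 0.14 / ((I_fault / I_pickup) ** 0.02 - 1.0)

# Hypothetical settings for a downstream (primary) and upstream (backup) relay.
primary = dict(I_pickup=200.0, tms=0.1)
backup = dict(I_pickup=400.0, tms=0.2)

def coordination_margin(I_fault, margin=0.3):
    """Return (coordinated?, t_primary, t_backup) at a given fault current.

    Coordinated means the backup relay trips at least `margin` seconds
    after the primary, so the primary clears the fault first.
    """
    t_p = iec_si_time(I_fault, **primary)
    t_b = iec_si_time(I_fault, **backup)
    return t_b - t_p >= margin, t_p, t_b

ok, t_p, t_b = coordination_margin(2000.0)
print(f"primary {t_p:.2f}s, backup {t_b:.2f}s, coordinated: {ok}")
```

    Plotting both characteristics over a range of fault currents, as the report describes, amounts to repeating this check along the current axis.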

  11. On-Site School Library Automation: Automation Anywhere with Laptops.

    Science.gov (United States)

    Gunn, Holly; Oxner, June

    2000-01-01

    Four years after the Halifax Regional School Board was formed through amalgamation, over 75% of its school libraries were automated. On-site automation with laptops was a quicker, more efficient way of automating than sending a shelf list to the Technical Services Department. The Eastern Shore School Library Automation Project was a successful…

  12. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-02

    Retrieval-based face annotation is a promising paradigm of mining massive web facial images for automated face annotation. This paper addresses a critical problem of this paradigm, i.e., how to effectively perform annotation by exploiting similar facial images and their weak labels, which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintain comparable results but significantly reduce time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.
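
    A heavily simplified sketch of the retrieval-and-refine idea (not the actual WLRLCC optimization, which jointly learns sparse codes and regularized labels): label scores for a query face are similarity-weighted votes over the retrieved neighbors' weak, possibly noisy label sets. The neighbor similarities and names below are invented.

```python
# Each retrieved neighbor carries (similarity to the query, weak label set).
# One neighbor is deliberately noisy: it carries an extra label.
neighbors = [
    (0.9, {"alice"}),
    (0.8, {"alice", "bob"}),   # noisy weak label
    (0.3, {"bob"}),
]

def refine_labels(neighbors):
    """Similarity-weighted voting: each neighbor spreads its weight over
    every name in its weak label set; scores are normalized by total
    similarity so they are comparable across queries."""
    scores = {}
    total = sum(sim for sim, _ in neighbors)
    for sim, labels in neighbors:
        for name in labels:
            scores[name] = scores.get(name, 0.0) + sim / total
    return scores

scores = refine_labels(neighbors)
print(max(scores, key=scores.get))  # the most supported name
```

    The graph-regularization in the paper plays a similar smoothing role, but over the whole neighbor graph rather than a single weighted vote.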

  13. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  14. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes, often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will, however, only be turned to profit sufficiently and correctly if the application of these equipment techniques is further improved and if its volume is tallied with a definite etc. (orig.)

  15. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  16. Euler's fluid equations: Optimal control vs optimization

    International Nuclear Information System (INIS)

    Holm, Darryl D.

    2009-01-01

    An optimization method used in image-processing (metamorphosis) is found to imply Euler's equations for incompressible flow of an inviscid fluid, without requiring that the Lagrangian particle labels exactly follow the flow lines of the Eulerian velocity vector field. Thus, an optimal control problem and an optimization problem for incompressible ideal fluid flow both yield the same Euler fluid equations, although their Lagrangian parcel dynamics are different. This is a result of the gauge freedom in the definition of the fluid pressure for an incompressible flow, in combination with the symmetry of fluid dynamics under relabeling of their Lagrangian coordinates. Similar ideas are also illustrated for SO(N) rigid body motion.
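
    For reference, the Euler fluid equations that both the optimal control and the optimization formulations recover are, in standard notation for the velocity field u(x,t) and pressure p(x,t) of an incompressible, inviscid fluid:

```latex
% Incompressible Euler equations: momentum balance and the
% divergence-free (incompressibility) constraint.
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\,\mathbf{u} = -\nabla p,
\qquad
\nabla\cdot\mathbf{u} = 0
```

    The gauge freedom the abstract refers to is that p is determined only up to this pair of equations: it acts as a Lagrange multiplier enforcing the incompressibility constraint rather than an independent dynamical field.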

  17. Human-centered automation of testing, surveillance and maintenance

    International Nuclear Information System (INIS)

    Bhatt, S.C.; Sun, B.K.H.

    1991-01-01

    Manual surveillance and testing of instrumentation, control and protection systems at nuclear power plants involves system and human errors which can lead to substantial plant downtime. Frequent manual testing can also contribute significantly to operation and maintenance cost. Automation technology offers potential for prudent applications at the power plant to reduce testing errors and cost. To help address the testing problems and to harness the benefits of automation, input from utilities is obtained on suitable automation approaches. This paper includes lessons from successful past experience at a few plants where islands of automation exist. The results are summarized as a set of specifications for semi-automatic testing. A human-centered automation methodology is proposed, with guidelines for the optimal human/computer division of tasks given. Implementation obstacles to significant changes of testing practices are identified, and methods acceptable to nuclear power plants for addressing these obstacles have been suggested

  18. [Binocular coordination during reading].

    Science.gov (United States)

    Bassou, L; Granié, M; Pugh, A K; Morucci, J P

    1992-01-01

    Is there an effect on binocular coordination during reading of oculomotor imbalance (heterophoria, strabismus and inadequate convergence) and of functional lateral characteristics (eye preference and perceptually privileged visual laterality)? Recordings of the binocular eye-movements of ten-year-old children show that oculomotor imbalances occur most often among children whose left visual perceptual channel is privileged, and that these subjects can present optomotor dissociation and manifest lack of motor coordination. Close binocular motor coordination is far from being the norm in reading. The faster reader displays saccades of differing spatial amplitude and the slower reader an oculomotor hyperactivity, especially during fixations. The recording of binocular movements in reading appears to be an excellent means of diagnosing difficulties related to visual laterality and to problems associated with oculomotor imbalance.

  19. Introduction to Coordination Chemistry

    CERN Document Server

    Lawrance, Geoffrey Alan

    2010-01-01

    Introduction to Coordination Chemistry examines and explains how metals and molecules that bind as ligands interact, and the consequences of this assembly process. This book describes the chemical and physical properties and behavior of the complex assemblies that form, and applications that may arise as a result of these properties. Coordination complexes are an important but often hidden part of our world?even part of us?and what they do is probed in this book. This book distills the essence of this topic for undergraduate students and for research scientists.

  20. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    task (Bahrami et al 2010, Fusaroli et al. 2012) we extend to linguistic coordination dynamical measures of recurrence employed in the analysis of sensorimotor coordination (such as heart-rate (Konvalinka et al 2011), postural sway (Shockley 2005) and eye-movements (Dale, Richardson and Kirkham 2012......). We employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  1. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component dampening and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  2. Recursive Advice for Coordination

    DEFF Research Database (Denmark)

    Terepeta, Michal Tomasz; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Aspect-oriented programming is a programming paradigm that is often praised for the ability to create modular software and separate cross-cutting concerns. Recently aspects have been also considered in the context of coordination languages, offering similar advantages. However, introducing aspect...... systems. Even though primarily used for analysis of recursive programs, we are able to adapt them to fit this new context.

  3. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  4. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  5. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  6. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  7. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry boosted the huge diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and evolution to cassette-based systems were the major issues in automation. ► Impact of regulations on the technological development will also be considered

  8. Automated optical sensing system for biochemical assays

    Science.gov (United States)

    Oroszlan, Peter; Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1994-03-01

    In this paper, we present a new system called FOBIA that was developed and optimized with respect to automated operation of repetitive assay cycles with regenerable bioaffinity sensors. The reliability and precision of the new system is demonstrated by an application in a competitive assay for the detection of the triazine herbicide Atrazine. Using one sensor in more than 300 repetitive cycles, a signal precision better than 5% was achieved.

  9. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  10. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  11. Automated ISMS control auditability

    OpenAIRE

    Suomu, Mikko

    2015-01-01

    This thesis focuses on researching a possible reference model for automated ISMS’s (Information Security Management System) technical control auditability. The main objective was to develop a generic framework for automated compliance status monitoring of the ISO27001:2013 standard which could be re‐used in any ISMS system. The framework was tested with Proof of Concept (PoC) empirical research in a test infrastructure which simulates the framework target deployment environment. To fulfi...

  12. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades has been a time of major changes in marketing. Digitalization has become a permanent part of marketing and at the same time enabled efficient collection of data. Personalization and customization of content are playing a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation more information of the customers is gathered ...

  13. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  14. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
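
    The core job of such a workflow manager can be sketched in a few lines: run tasks in dependency order, skipping ones already completed. This is a conceptual illustration only; the task names and the API are invented and are not Taxi's actual interface.

```python
def run_workflow(tasks, deps):
    """Run tasks in dependency order.

    tasks: name -> callable; deps: name -> list of prerequisite names.
    Assumes the dependency graph is acyclic; each task runs exactly once.
    """
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for d in deps.get(name, []):
            run(d)                      # recurse into prerequisites first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

# Hypothetical two-step lattice workflow: generate a configuration,
# then measure an observable on it.
log = []
tasks = {
    "generate_config": lambda: log.append("gauge config generated"),
    "measure_plaquette": lambda: log.append("plaquette measured"),
}
deps = {"measure_plaquette": ["generate_config"]}
print(run_workflow(tasks, deps))
```

    A production manager like Taxi adds on top of this skeleton the bookkeeping a minimal sketch omits: persisting completion state across runs, dispatching to a batch queue, and retrying failures.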

  15. Coordinate measurement machines as an alignment tool

    International Nuclear Information System (INIS)

    Wand, B.T.

    1991-03-01

    In February of 1990 the Stanford Linear Accelerator Center (SLAC) purchased a LEITZ PM 12-10-6 CMM (Coordinate measurement machine). The machine is shared by the Quality Control Team and the Alignment Team. One of the alignment tasks in positioning beamline components in a particle accelerator is to define the component's magnetic centerline relative to external fiducials. This procedure, called fiducialization, is critical to the overall positioning tolerance of a magnet. It involves the definition of the magnetic center line with respect to the mechanical centerline and the transfer of the mechanical centerline to the external fiducials. To perform the latter a magnet coordinate system has to be established. This means defining an origin and the three rotation angles of the magnet. The datum definition can be done by either optical tooling techniques or with a CMM. As optical tooling measurements are very time consuming, not automated and are prone to errors, it is desirable to use the CMM fiducialization method instead. The establishment of a magnet coordinate system based on the mechanical center and the transfer to external fiducials will be discussed and presented with 2 examples from the Stanford Linear Collider (SLC). 7 figs

  16. Automated Verification of IGRT-based Patient Positioning.

    Science.gov (United States)

    Jiang, Xiaojun; Fox, Tim; Cordova, James S; Schreibmann, Eduard

    2015-11-08

    A system for automated quality assurance of a therapist's image registration in radiotherapy was designed and tested in clinical practice. The approach complements the clinical software's automated registration in terms of algorithm configuration and performance, and constitutes a practical approach for ensuring safe patient setups. Per our convergence analysis, evolutionary algorithms perform better in finding the global optima of the cost function, with discrepancies from a deterministic optimizer seen sporadically.
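
    The contrast the abstract draws can be demonstrated on a toy problem. The cost function below is an invented 1-D double well, not a registration metric: a gradient-style deterministic search started on the wrong side stalls in the local minimum, while a simple evolutionary search tends to find the global one.

```python
import random

def cost(x):
    # Toy multimodal cost (illustration only): a double well whose
    # global minimum sits near x = -1.04, local minimum near x = +0.98.
    return (x * x - 1.0) ** 2 + 0.3 * x

def gradient_descent(x=2.0, lr=0.01, steps=500):
    """Deterministic local search; from x=2 it settles in the right well."""
    for _ in range(steps):
        grad = 4.0 * x ** 3 - 4.0 * x + 0.3
        x -= lr * grad
    return x

def evolutionary_search(pop_size=40, generations=50, seed=0):
    """Elitist evolutionary search with mutation and random immigrants."""
    rng = random.Random(seed)
    pop = [rng.uniform(-3.0, 3.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[:10]                                   # keep the best
        mutants = [rng.gauss(rng.choice(elite), 0.3) for _ in range(25)]
        immigrants = [rng.uniform(-3.0, 3.0) for _ in range(5)]  # explore
        pop = elite + mutants + immigrants
    return min(pop, key=cost)

x_local, x_evo = gradient_descent(), evolutionary_search()
print(cost(x_local), cost(x_evo))
```

    The random immigrants keep the population exploring the full interval, which is what lets the evolutionary search escape the basin the deterministic optimizer is trapped in.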

  17. MultiRefactor: Automated Refactoring To Improve Software Quality

    OpenAIRE

    Mohan, Michael; Greer, Des

    2017-01-01

    In this paper, a new approach is proposed for automated software maintenance. The tool is able to perform 26 different refactorings. It also contains a large selection of metrics to measure the impact of the refactorings on the software and six different search based optimization algorithms to improve the software. This tool contains both mono-objective and multi-objective search techniques for software improvement and is fully automated. The paper describes the various capabilities of the to...

  18. Cassini Tour Atlas Automated Generation

    Science.gov (United States)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly enough to select the optimal series of orbits for science return. The task of manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  19. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    This article discusses the current capabilities of automated processing of image data, using Agisoft PhotoScan software as an example. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or, increasingly often, on UAVs) are used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in a local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM, and a photorealistic solid model of an object. All the aforementioned processing steps are implemented in a single program, in contrast to standard commercial software, which divides the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps at predetermined control parameters. The paper presents practical results of fully automatic generation of orthomosaics both for images obtained by a metric Vexcel camera and for a block of images acquired by a non-metric UAV system.

  20. Towards full automation of accelerators through computer control

    International Nuclear Information System (INIS)

    Gamble, J.; Hemery, J.-Y.; Kemp, D.; Keyser, R.; Koutchouk, J.-P.; Martucci, P.; Tausch, L.; Vos, L.

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The paper describes this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (Auth.)

  1. Towards full automation of accelerators through computer control

    CERN Document Server

    Gamble, J; Kemp, D; Keyser, R; Koutchouk, Jean-Pierre; Martucci, P P; Tausch, Lothar A; Vos, L

    1980-01-01

    The computer control system of the Intersecting Storage Rings (ISR) at CERN has always laid emphasis on two particular operational aspects, the first being the reproducibility of machine conditions and the second that of giving the operators the possibility to work in terms of machine parameters such as the tune. Already certain phases of the operation are optimized by the control system, whilst others are automated with a minimum of manual intervention. The authors describe this present control system with emphasis on the existing automated facilities and the features of the control system which make it possible. It then discusses the steps needed to completely automate the operational procedure of accelerators. (7 refs).

  2. Coordination failure caused by sunspots

    DEFF Research Database (Denmark)

    Beugnot, Julie; Gürgüç, Zeynep; Øvlisen, Frederik Roose

    2012-01-01

    on the efficient equilibrium, we consider sunspots as a potential reason for coordination failure. We conduct an experiment with a three-player 2x2x2 game in which coordination on the efficient equilibrium is easy and should normally occur. In the control session, we find almost perfect coordination on the payoff-dominant equilibrium, but in the sunspot treatment, dis-coordination is frequent. Sunspots lead to significant inefficiency, and we conclude that sunspots can indeed cause coordination failure.

  3. Fast Automated Decoupling at RHIC

    CERN Document Server

    Beebe-Wang, Joanne

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated decoupling application has been developed at RHIC for coupling correction during routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program provides options for automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (Phase Lock Loop), the high-frequency Schottky system, and the tune meter. It also supplies tune and skew quadrupole scans, finds the minimum tune separation, displays the real-time results, and interfaces with the RHIC control system. We summarize the capabilities of the decoupling application...
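
    The skew-quadrupole scan idea can be illustrated with a toy model (not RHIC's actual optics or the application's algorithm): assume the complex coupling coefficient is linear in the two orthogonal skew-quadrupole family strengths, and grid-scan both families for the settings that minimize the observable tune separation. All coefficients below are invented.

```python
import math

# Toy linear coupling model: residual coupling C0 plus a linear response
# of the coupling coefficient to each skew-quadrupole family strength.
C0 = complex(0.012, -0.008)           # residual coupling, invented numbers
RESP = (complex(0.010, 0.0),          # response of C to family 1
        complex(0.0, 0.010))          # response of C to family 2

def tune_separation(k1, k2, delta=0.002):
    """Observable tune separation: delta is the uncoupled tune distance,
    and the coupling term adds in quadrature."""
    c = C0 + k1 * RESP[0] + k2 * RESP[1]
    return math.sqrt(delta ** 2 + abs(c) ** 2)

def decouple(span=2.0, steps=81):
    """Grid-scan both families and return the settings minimizing the
    tune separation (a stand-in for the application's optimal settings)."""
    grid = [span * (2 * i / (steps - 1) - 1) for i in range(steps)]
    return min(((k1, k2) for k1 in grid for k2 in grid),
               key=lambda ks: tune_separation(*ks))

k1, k2 = decouple()
print(k1, k2, tune_separation(k1, k2))
```

    With the orthogonal response assumed here, the two families cancel the real and imaginary parts of the coupling independently, which is why two families suffice for global decoupling.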

  4. Coordination Compounds in Biology

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 4; Issue 6. Coordination Compounds in Biology - The Chemistry of Vitamin B12 and Model Compounds. K Hussian Reddy. General Article Volume 4 Issue 6 June 1999 pp 67-77 ...

  5. Coordination Compounds in Biology

    Indian Academy of Sciences (India)

Coordination Compounds in Biology equatorial ligand, there are two axial ligands in most B12 derivatives. Derivatives of B12. The various derivatives of B12 result most commonly from changes in the axial ligands bound to cobalt. Often it is convenient to draw a greatly abbreviated structure for a B12 molecule using a ...

  6. Polymeric coordination compounds

    Indian Academy of Sciences (India)

    Administrator

    Metal coordination polymers with one- and two-dimensional structures are of current interest due to their possible relevance to material science 1. In continuation of our previous studies 2,3, several new polymeric compounds are reported here. Among the complexes of silver with aminomethyl pyridine (amp) ...

  7. Coordination of mobile labor

    Czech Academy of Sciences Publication Activity Database

    Steiner, Jakub

    2008-01-01

    Roč. 139, č. 1 (2008), s. 25-46 ISSN 0022-0531 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : coordination * general equilibrium * global games Subject RIV: AH - Economics Impact factor: 1.224, year: 2008

  8. Block coordination copolymers

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kyoung Moo; Wong-Foy, Antek G.; Matzger, Adam J.; Benin, Annabelle I.; Willis, Richard R.

    2012-12-04

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  9. Block coordination copolymers

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2014-11-11

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  10. Block coordination copolymers

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kyoung Moo; Wong-Foy, Antek G; Matzger, Adam J; Benin, Annabelle I; Willis, Richard R

    2012-11-13

    The present invention provides compositions of crystalline coordination copolymers wherein multiple organic molecules are assembled to produce porous framework materials with layered or core-shell structures. These materials are synthesized by sequential growth techniques such as the seed growth technique. In addition, the invention provides a simple procedure for controlling functionality.

  11. Reusability of coordination programs

    NARCIS (Netherlands)

    F. Arbab (Farhad); C.L. Blom (Kees); F.J. Burger (Freek); C.T.H. Everaars (Kees)

    1996-01-01

Isolating computation and communication concerns into separate pure computation and pure coordination modules enhances modularity, understandability, and reusability of parallel and/or distributed software. This can be achieved by moving communication primitives (such as SendMessage and

  12. [Civilian-military coordination].

    Science.gov (United States)

    de Montravel, G

    2002-01-01

Current humanitarian emergencies create complex, multidimensional situations that stimulate simultaneous responses from a wide variety of sources including governments, non-governmental organizations (NGO), United Nations agencies, and private individuals. As a result, it has become essential to establish a coherent framework in which each actor can contribute promptly and effectively to the overall effort. This is the role of the United Nations Office for the Coordination of Humanitarian Affairs. Regardless of the circumstances and level of coordination, cooperation and collaboration between humanitarian and military personnel, it is necessary to bear in mind their objectives. The purpose of humanitarian action is to reduce human suffering. The purpose of military intervention is to stop warfare. The author of this article will discuss the three major obstacles to civilian-military coordination (strategic, tactical, and operational). Operations cannot be conducted smoothly and differences cannot be ironed out without mutual respect between the two parties, an explicit definition of their respective duties and responsibilities, a clear understanding of their cultural differences, and the presence of an organization and facilities for coordination and arbitration by a neutral referee.

  13. Launch Control System Software Development System Automation Testing

    Science.gov (United States)

    Hwang, Andrew

    2017-01-01

) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Some issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images in different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and their coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and the coordinates of the text were required for our purpose. The team made a script to parse the information we wanted from the OCR file into a different file that would be used by automation functions within the automated framework. Since a majority of development and testing for the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will help make our automated tests more robust due to the tool's text recognition being highly scalable to different text fonts and text sizes. Soon we will have the whole test system automated, allowing more full-time engineers to work on development projects.
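The parsing step described above, keeping only the text and its coordinates from the OCR tool's metadata-rich output, might look like the following sketch. The tab-separated field layout is a hypothetical stand-in, since the record does not specify the tool's actual file format.

```python
def parse_ocr_records(lines):
    """Keep only the text and bounding-box coordinates from an OCR dump.
    The tab-separated layout (text, x, y, width, height, extra metadata)
    is a hypothetical stand-in for the tool's real output format."""
    results = []
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 5:
            continue  # skip malformed rows
        text = fields[0]
        x, y, w, h = (int(v) for v in fields[1:5])
        results.append({"text": text, "box": (x, y, w, h)})
    return results

rows = [
    "ABORT\t120\t45\t80\t20\tfont=Arial\tconf=0.98",
    "garbled line without coordinates",
]
parsed = parse_ocr_records(rows)
# parsed[0]["text"] -> "ABORT", parsed[0]["box"] -> (120, 45, 80, 20)
```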

  14. The Wood Strengthening and Decorative Automated Production Line ZY-06L-Type Manipulator Motion Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Wu Hao

    2014-03-01

Full Text Available The ZY-06L-type manipulator is an important component of the Wood Surface Layer Strengthening and Decorative Automated Production Line, designed for the handling, distribution and laying of a variety of sheet-metal or semi-finished components in the workshop. The kinematics equations of the manipulator are established and, combined with the position coordinates of the manipulator and the object in the actual work environment, solved to obtain the relationship between the manipulator end-effector position and posture and the joint variables. The engineering software UG NX8.0 is used for modeling and simulation of the manipulator, and the reasonableness of the structural and workflow design is analyzed. This provides a basis and reference for manipulator structure optimization and control system development.
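The kinematic relationship the abstract describes, end-effector position as a function of the joint variables, can be sketched for a simplified two-link planar arm. The ZY-06L's actual geometry and link parameters are not given in the record, so the link lengths below are placeholders.

```python
import math

def end_effector(theta1, theta2, l1=0.5, l2=0.3):
    """Forward kinematics of a two-link planar arm: end-effector
    position from the joint angles. Link lengths l1, l2 (metres) and
    the planar simplification are illustrative assumptions, not the
    ZY-06L's real kinematic parameters."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# fully extended along x: both joints at zero
x, y = end_effector(0.0, 0.0)
# -> approximately (0.8, 0.0)
```

Solving these equations in reverse (inverse kinematics) is what relates a desired laying position back to the joint variables.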

  15. Automated high-dose rate brachytherapy treatment planning for a single-channel vaginal cylinder applicator

    Science.gov (United States)

    Zhou, Yuhong; Klages, Peter; Tan, Jun; Chi, Yujie; Stojadinovic, Strahinja; Yang, Ming; Hrycushko, Brian; Medin, Paul; Pompos, Arnold; Jiang, Steve; Albuquerque, Kevin; Jia, Xun

    2017-06-01

    High dose rate (HDR) brachytherapy treatment planning is conventionally performed manually and/or with aids of preplanned templates. In general, the standard of care would be elevated by conducting an automated process to improve treatment planning efficiency, eliminate human error, and reduce plan quality variations. Thus, our group is developing AutoBrachy, an automated HDR brachytherapy planning suite of modules used to augment a clinical treatment planning system. This paper describes our proof-of-concept module for vaginal cylinder HDR planning that has been fully developed. After a patient CT scan is acquired, the cylinder applicator is automatically segmented using image-processing techniques. The target CTV is generated based on physician-specified treatment depth and length. Locations of the dose calculation point, apex point and vaginal surface point, as well as the central applicator channel coordinates, and the corresponding dwell positions are determined according to their geometric relationship with the applicator and written to a structure file. Dwell times are computed through iterative quadratic optimization techniques. The planning information is then transferred to the treatment planning system through a DICOM-RT interface. The entire process was tested for nine patients. The AutoBrachy cylindrical applicator module was able to generate treatment plans for these cases with clinical grade quality. Computation times varied between 1 and 3 min on an Intel Xeon CPU E3-1226 v3 processor. All geometric components in the automated treatment plans were generated accurately. The applicator channel tip positions agreed with the manually identified positions with submillimeter deviations and the channel orientations between the plans agreed within less than 1 degree. The automatically generated plans obtained clinically acceptable quality.
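The dwell-time step can be illustrated with a minimal quadratic objective under the non-negativity constraint dwell times must satisfy. The dose-rate matrix, target doses, and the projected-gradient solver below are illustrative assumptions, not the AutoBrachy implementation.

```python
def optimize_dwell_times(A, d, iters=2000, lr=0.01):
    """Minimize sum_i (sum_j A[i][j]*t[j] - d[i])^2 subject to t >= 0
    by projected gradient descent -- a sketch of iterative quadratic
    optimization of dwell times. A[i][j] is a made-up dose rate to
    calculation point i per unit dwell time at position j."""
    m, n = len(A), len(A[0])
    t = [0.0] * n
    for _ in range(iters):
        # residual between delivered and prescribed dose
        r = [sum(A[i][j] * t[j] for j in range(n)) - d[i] for i in range(m)]
        for j in range(n):
            g = 2.0 * sum(A[i][j] * r[i] for i in range(m))
            t[j] = max(0.0, t[j] - lr * g)   # project onto t >= 0
    return t

# two dose points, two dwell positions (illustrative numbers only)
A = [[1.0, 0.2],
     [0.2, 1.0]]
d = [5.0, 7.0]
t = optimize_dwell_times(A, d)
# t converges to the exact solution of A t = d, here (3.75, 6.25)
```

A clinical solver would add objectives for organ-at-risk sparing; this sketch only shows the constrained quadratic core.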

16. Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis

    DEFF Research Database (Denmark)

    Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei

    2018-01-01

    control subsystem and blade pitch control subsystem. Then, the pole positions in each control subsystem are adjusted coordinately to evaluate the controller participation and used as the objective of optimization. A two-level parameters-controllers coordinated optimization scheme is proposed and applied...

  17. System automation for a bacterial colony detection and identification instrument via forward scattering

    International Nuclear Information System (INIS)

    Bae, Euiwon; Hirleman, E Daniel; Aroonnual, Amornrat; Bhunia, Arun K; Robinson, J Paul

    2009-01-01

    A system design and automation of a microbiological instrument that locates bacterial colonies and captures the forward-scattering signatures are presented. The proposed instrument integrates three major components: a colony locator, a forward scatterometer and a motion controller. The colony locator utilizes an off-axis light source to illuminate a Petri dish and an IEEE1394 camera to capture the diffusively scattered light to provide the number of bacterial colonies and two-dimensional coordinate information of the bacterial colonies with the help of a segmentation algorithm with region-growing. Then the Petri dish is automatically aligned with the respective centroid coordinate with a trajectory optimization method, such as the Traveling Salesman Algorithm. The forward scatterometer automatically computes the scattered laser beam from a monochromatic image sensor via quadrant intensity balancing and quantitatively determines the centeredness of the forward-scattering pattern. The final scattering signatures are stored to be analyzed to provide rapid identification and classification of the bacterial samples
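The trajectory-optimization step, visiting the colony centroids in an efficient order, can be sketched with a nearest-neighbour heuristic, a common cheap approximation for such travelling-salesman tours. The record does not state which TSP method the instrument actually uses, so this is only an assumed stand-in.

```python
import math

def scan_order(centroids, start=(0.0, 0.0)):
    """Order colony centroids with a greedy nearest-neighbour heuristic:
    from the current stage position, always move to the closest
    unvisited colony (an approximation to the optimal TSP tour)."""
    todo = list(centroids)
    pos, order = start, []
    while todo:
        nxt = min(todo, key=lambda c: math.dist(pos, c))
        todo.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order

colonies = [(5.0, 5.0), (1.0, 1.0), (6.0, 5.0)]
# starting at the origin, the visiting order is (1,1) -> (5,5) -> (6,5)
```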

  18. Coordination chemistry of technetium

    International Nuclear Information System (INIS)

Together with the development of the chemistry of radiopharmaceuticals, the coordination chemistry of technetium has recently progressed. Synthetic procedures for technetium complexes have also been established in various oxidation states. In particular, technetium(V) complexes, which possess a Tc=O or O=Tc=O core, are of interest from the viewpoint of kinetic stability. In the present paper, substitution reaction mechanisms of Tc(V), Tc(IV) and Tc(III) complexes coordinated with a β-diketone as ligand are discussed in connection with the structural chemistry of technetium complexes. The base hydrolysis of halobis(β-diketonato)oxotechnetium(V) and dihalobis(β-diketonato)technetium(IV) complexes liberates the halide ion by the attack of hydroxide ion, followed by the liberation of β-diketones. Technetium is found as pertechnetate in the final product.

  19. Coordinating Work with Groupware

    DEFF Research Database (Denmark)

    Pors, Jens Kaaber; Simonsen, Jesper

    2003-01-01

One important goal of employing groupware is to make possible complex collaboration between geographically distributed groups. This requires a dual transformation of both technology and work practice. The challenge is to reduce the complexity of the coordination work by successfully integrating the protocol stipulating the collaboration and the artefact, in the form of the groupware application, mediating the collaboration. This paper analyses a generic groupware application that was deployed in a large financial organisation in order to support working groups distributed throughout four countries. Using the CSCW framework of coordination mechanisms, we have elicited six general factors influencing the integration of the groupware application in two situations.

  20. Coordinated Voltage Control Scheme for VSC-HVDC Connected Wind Power Plants

    OpenAIRE

    Guo, Yifei; Gao, Houlei; Wu, Qiuwei; Zhao, Haoran; Østergaard, Jacob

    2017-01-01

    This paper proposes a coordinated voltage control scheme based on model predictive control (MPC) for voltage source converter‐based high voltage direct current (VSC‐HVDC) connected wind power plants (WPPs). In the proposed scheme, voltage regulation capabilities of VSC and WTGs are fully utilized and optimally coordinated. Two control modes, namely operation optimization mode and corrective mode, are designed to coordinate voltage control and economic operation of the system. In the first mod...

  1. Coordinating towards a Common Good

    Science.gov (United States)

    Santos, Francisco C.; Pacheco, Jorge M.

    2010-09-01

Throughout their life, humans often engage in collective endeavors ranging from family related issues to global warming. In all cases, the tragedy of the commons threatens the possibility of reaching the optimal solution associated with global cooperation, a scenario predicted by theory and demonstrated by many experiments. Using the toolbox of evolutionary game theory, I will address two important aspects of evolutionary dynamics that have been neglected so far in the context of public goods games and evolution of cooperation. On one hand, the fact that often there is a threshold above which a public good is reached [1, 2]. On the other hand, the fact that individuals often participate in several games, related to their social context and pattern of social ties, defined by a social network [3, 4, 5]. In the first case, the existence of a threshold above which collective action is materialized dictates a rich pattern of evolutionary dynamics where the direction of natural selection can be inverted compared to standard expectations. Scenarios of defector dominance, pure coordination or coexistence may arise simultaneously. Both finite and infinite population models are analyzed. In networked games, cooperation blooms whenever the act of contributing is more important than the effort contributed. In particular, the heterogeneous nature of social networks naturally induces a symmetry breaking of the dilemmas of cooperation, as contributions made by cooperators may become contingent on the social context in which the individual is embedded. This diversity in context provides an advantage to cooperators, which is particularly strong when both wealth and social ties follow a power-law distribution, providing clues on the self-organization of social communities. Finally, in both situations, it can be shown that individuals no longer play a defection dominance dilemma, but effectively engage in a general N-person coordination game. Even if locally defection may seem
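The threshold public goods game the abstract describes can be made concrete with a minimal payoff function: the benefit is produced only when at least M of the players cooperate, and each cooperator pays a cost. All parameter values below are illustrative, not taken from the cited models.

```python
def payoff(is_cooperator, k, M=3, b=1.0, c=0.25):
    """Payoff in an N-person threshold public goods game: the group
    benefit b is produced only if the number of cooperators k reaches
    the threshold M, and each cooperator pays cost c regardless.
    (M, b, c are illustrative assumptions.)"""
    benefit = b if k >= M else 0.0
    return benefit - (c if is_cooperator else 0.0)

# with the threshold just met (k = M = 3), a defector free-rides for b,
# a cooperator nets b - c, and below the threshold cooperation is pure loss
free_rider = payoff(False, 3)    # 1.0
contributor = payoff(True, 3)    # 0.75
wasted = payoff(True, 2)         # -0.25
```

The jump in payoff at k = M is what turns the interaction from a defection-dominance dilemma into the N-person coordination game the abstract mentions.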

  2. Improving Project Manufacturing Coordination

    Directory of Open Access Journals (Sweden)

    Korpivaara Ville

    2014-09-01

Full Text Available The objective of this research is to develop firms' project manufacturing coordination. The development will be made by centralizing the manufacturing information flows in one system. To be able to centralize information, a deep user need assessment is required. After user needs have been identified, the existing system will be developed to match these needs. The theoretical background is achieved through exploring the literature of project manufacturing, development project success factors, and different frameworks and tools for development project execution. The focus of this research lies more in customer need assessment than in the system's technical expertise. To ensure a deep understanding of customer needs, this study is executed by the action research method. As a result of this research, the information system for project manufacturing coordination was developed to respond to the revealed needs of the stakeholders. The new system improves the quality of the manufacturing information, eliminates waste in manufacturing coordination processes and offers better visibility into the project manufacturing. Hence it provides a solid base for the further development of project manufacturing.

  3. Universal mechatronics coordinator

    Science.gov (United States)

    Muir, Patrick F.

    1999-11-01

Mechatronic systems incorporate multiple actuators and sensors which must be properly coordinated to achieve the desired system functionality. Many mechatronic systems are designed as one-of-a-kind custom projects without consideration for facilitating future systems or alterations and extensions to the current system. Thus, subsequent changes to the system are slow, difficult, and costly. It has become apparent that manufacturing processes, and thus the mechatronics which embody them, need to be agile in order to more quickly and easily respond to changing customer demands or market pressures. To achieve agility, both the hardware and software of the system need to be designed such that the creation of new systems and the alteration and extension of current systems is fast and easy. This paper describes the design of a Universal Mechatronics Coordinator (UMC) which facilitates agile setup and changeover of coordination software for mechatronic systems. The UMC is capable of sequencing continuous and discrete actions that are programmed as stimulus-response pairs, as state machines, or a combination of the two. It facilitates the modular, reusable programming of continuous actions such as servo control algorithms, data collection code, and safety checking routines; and discrete actions such as reporting achieved states and turning on/off binary devices. The UMC has been applied to the control of a z-theta assembly robot for the Minifactory project and is applicable to a spectrum of widely differing mechatronic systems.

  4. Coordinating Shared Activities

    Science.gov (United States)

    Clement, Bradley

    2004-01-01

    Shared Activity Coordination (ShAC) is a computer program for planning and scheduling the activities of an autonomous team of interacting spacecraft and exploratory robots. ShAC could also be adapted to such terrestrial uses as helping multiple factory managers work toward competing goals while sharing such common resources as floor space, raw materials, and transports. ShAC iteratively invokes the Continuous Activity Scheduling Planning Execution and Replanning (CASPER) program to replan and propagate changes to other planning programs in an effort to resolve conflicts. A domain-expert specifies which activities and parameters thereof are shared and reports the expected conditions and effects of these activities on the environment. By specifying these conditions and effects differently for each planning program, the domain-expert subprogram defines roles that each spacecraft plays in a coordinated activity. The domain-expert subprogram also specifies which planning program has scheduling control over each shared activity. ShAC enables sharing of information, consensus over the scheduling of collaborative activities, and distributed conflict resolution. As the other planning programs incorporate new goals and alter their schedules in the changing environment, ShAC continually coordinates to respond to unexpected events.

  5. Global coordination: weighted voting

    Directory of Open Access Journals (Sweden)

    Jan-Erik Lane

    2014-03-01

Full Text Available In order to halt the depletion of global ecological capital, a number of different kinds of meetings between the governments of the countries of the world have been scheduled. The need for global coordination of environmental policies has become ever more obvious, supported by more and more evidence of the running down of ecological capital. But there are no formal or binding arrangements in sight, as global environmental coordination suffers from high transaction costs (qualitative voting). The CO2 equivalent emissions, resulting in global warming, are driven by the unstoppable economic expansion in the global market economy, employing mainly fossil fuel generated energy, although at the same time lifting sharply the GDP per capita of several emerging countries. Only global environmental coordination on the successful model of the World Bank and the IMF (quantitative voting) can stem the rising emissions numbers and stop further environmental degradation. However, the system of weighted voting in the WB and the IMF must be reformed by reducing the excessive voting power disparities, for instance by reducing all member country votes by the cube root expression.
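The cube-root reform mentioned at the end can be illustrated directly: take each member's vote total to the power 1/3 and renormalize, which compresses the disparities between large and small members. The vote totals below are made up for illustration.

```python
def reweight(votes):
    """Rescale weighted-voting shares by the cube root, the reform the
    article suggests for reducing voting-power disparities in bodies
    like the World Bank and the IMF."""
    roots = {k: v ** (1 / 3) for k, v in votes.items()}
    total = sum(roots.values())
    return {k: r / total for k, r in roots.items()}

votes = {"A": 1000, "B": 125, "C": 8}   # illustrative vote totals
shares = reweight(votes)
# raw shares are roughly 88%, 11%, 1%; after the cube root the ratio
# becomes 10 : 5 : 2, i.e. about 59%, 29%, 12%
```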

  6. Incremental learning for automated knowledge capture

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Basilico, Justin Derrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davis, Warren Leon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dixon, Kevin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Brian S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Nathaniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wendt, Jeremy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-12-01

People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, requires models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.

  7. Automating the CMS DAQ

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  8. Control and automation systems

    International Nuclear Information System (INIS)

    Schmidt, R.; Zillich, H.

    1986-01-01

    A survey is given of the development of control and automation systems for energy uses. General remarks about control and automation schemes are followed by a description of modern process control systems along with process control processes as such. After discussing the particular process control requirements of nuclear power plants the paper deals with the reliability and availability of process control systems and refers to computerized simulation processes. The subsequent paragraphs are dedicated to descriptions of the operating floor, ergonomic conditions, existing systems, flue gas desulfurization systems, the electromagnetic influences on digital circuits as well as of light wave uses. (HAG) [de

  9. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  10. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for future distribution automation (DA) systems in Finland. The general factors which affect the automation needs are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next, a computer model for DA feasibility analysis is presented, and some computation results are given. Finally, the proposed automation scheme is concluded from these results.

  11. Altering user' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased; suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased; suggesting it is possible to alter automation acceptance.

  12. Effects of an Automated Maintenance Management System on organizational communication

    International Nuclear Information System (INIS)

    Bauman, M.B.; VanCott, H.P.

    1988-01-01

    The primary purpose of the project was to evaluate the effectiveness of two techniques for improving organizational communication: (1) an Automated Maintenance Management System (AMMS) and (2) Interdepartmental Coordination Meetings. Additional objectives concerned the preparation of functional requirements for an AMMS, and training modules to improve group communication skills. Four nuclear power plants participated in the evaluation. Two plants installed AMMSs, one plant instituted interdepartmental job coordination meetings, and the fourth plant served as a control for the evaluation. Questionnaires and interviews were used to collect evaluative data. The evaluation focused on five communication or information criteria: timeliness, redundancy, withholding or gatekeeping, feedback, and accuracy/amount

  13. Symmetric two-coordinate photodiode

    Directory of Open Access Journals (Sweden)

    Dobrovolskiy Yu. G.

    2008-12-01

Full Text Available The two-coordinate photodiode, based on the longitudinal photoeffect, is developed and explored; it provides coordinate characteristics that are symmetric in steepness and longitudinal resistance to great exactness. It was shown that the best form of the coordinate characteristic is observed when the optical probe scans the central part of the photosensitive element. Ways of improving the steepness and linearity of the coordinate characteristic were analyzed.

  14. Idaho: Library Automation and Connectivity.

    Science.gov (United States)

    Bolles, Charles

    1996-01-01

    Provides an overview of the development of cooperative library automation and connectivity in Idaho, including telecommunications capacity, library networks, the Internet, and the role of the state library. Information on six shared automation systems in Idaho is included. (LRW)

  15. Work Coordination Engine

    Science.gov (United States)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing to the SMDB persistent store. The software leverages the use of a relational database, Java Messaging System (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
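The dispatch-and-reschedule behaviour described above can be sketched with an in-memory queue standing in for the SMDB/JMS machinery. All names and the failure-handling policy below are illustrative assumptions, not the actual WCE code.

```python
import queue

class Dispatcher:
    """Minimal sketch of the WCE dispatch pattern: de-queue work orders,
    hand each to a registered worker, and fall through to the next
    worker (or re-queue the order) when a worker fails."""

    def __init__(self, workers):
        self.workers = workers          # callables: payload -> result
        self.orders = queue.Queue()
        self.results = {}

    def submit(self, order_id, payload):
        self.orders.put((order_id, payload))

    def run(self):
        while not self.orders.empty():
            order_id, payload = self.orders.get()
            for worker in self.workers:
                try:
                    self.results[order_id] = worker(payload)
                    break               # order completed
                except Exception:
                    continue            # try the next registered worker
            else:
                # every worker failed: reschedule and stop this pass
                self.orders.put((order_id, payload))
                break

def flaky(payload):
    raise RuntimeError("server down")   # simulates a failing worker

d = Dispatcher([flaky, lambda p: p.upper()])
d.submit("wo-1", "calibrate antenna")
d.run()
# d.results["wo-1"] -> "CALIBRATE ANTENNA": the first worker failed,
# so the order was handed to the second
```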

  16. Coordinating a Two-Echelon Supply Chain under Carbon Tax

    Directory of Open Access Journals (Sweden)

    Wei Yu

    2017-12-01

Full Text Available In this paper, we study the impact of a carbon tax on carbon emissions and the retail price in a two-echelon supply chain consisting of a manufacturer and a retailer. Specifically, by adopting two types of contracts, i.e., the modified wholesale price contract (MW) and the modified cost-sharing contract (MS), supply chain coordination is achieved, which improves supply chain efficiency. Our study shows that: (1) as the carbon tax increases, both the optimal emission reduction level and the optimal retail price first increase and then remain unchanged; (2) neither MW nor MS benefits the manufacturer after supply chain coordination; and (3) to coordinate the supply chain effectively, we propose an innovative supply chain contract that integrates the firms' optimal decisions under MW or MS with a two-part tariff contract (TPT) and a fixed fee the retailer can pay to ensure a win–win solution.

  17. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  18. Automated data model evaluation

    International Nuclear Information System (INIS)

    Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana

    2012-01-01

The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the model-correctness analysis process, and its relation to ontology tools, are presented. Key words: Database modeling, Data model correctness, Evaluation

  19. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob detection. The proposed system detects any human head appearing in the side mirrors. The detected head is then tracked and recorded for further action.
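The background-subtraction step this abstract mentions can be illustrated with a minimal per-pixel difference mask. The function name and threshold below are assumptions for illustration; the actual system also applies colour balancing and chain-code shape detection, which are not shown.

```python
def moving_mask(frame, background, threshold=25):
    """Flag pixels whose grey value differs from a static background
    model by more than `threshold` (hypothetical helper, not from the
    paper; frames are lists of lists of grey values)."""
    return [[int(abs(p - b) > threshold) for p, b in zip(fr, br)]
            for fr, br in zip(frame, background)]
```

Pixels flagged 1 form the foreground blobs that later shape-detection stages would classify.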

  20. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  1. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  2. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

computer that can switch between predefined settings. In some cases the computer can be remotely controlled over the internet, so that the status of the home can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classics within home automation, additional functionality has emerged...

  3. Automated Water Extraction Index

    DEFF Research Database (Denmark)

    Feyisa, Gudina Legese; Meilby, Henrik; Fensholt, Rasmus

    2014-01-01

    of various sorts of environmental noise and at the same time offers a stable threshold value. Thus we introduced a new Automated Water Extraction Index (AWEI) improving classification accuracy in areas that include shadow and dark surfaces that other classification methods often fail to classify correctly...

  4. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought of a knowledgeable reader would probably be: why this old topic again? What is new to discuss? Yet everyone agrees that test automation today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks that facilitate testing applications developed with various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it; the change in people's perspective and knowledge of automation has altered the terrain. This article reflects the author's views and experience on how the original myths have been transformed into new versions, and how those versions arose; it also offers his thoughts on the new generation of myths.

  5. Multi-energy Coordinated Evaluation for Energy Internet

    Science.gov (United States)

    Jia, Dongqiang; Sun, Jian; Wang, Cunping; Hong, Xiao; Ma, Xiufan; Xiong, Wenting; Shen, Yaqi

    2017-05-01

This paper reviews the current research status of multi-energy coordinated evaluation for the energy Internet. Taking into consideration the coordinated optimization of wind, solar, and other energy sources, 17 evaluation indexes, such as the substitution coefficient of cooling, heat, and power, the ratio of wind and solar energy, and the energy storage ratio, were designed across five aspects: acceptance of renewable energy, benefits of complementary energy substitution, peak-valley difference, degree of equipment utilization, and user needs. The article also attaches importance to the economic and social benefits of coordinating multiple energy sources. Ultimately, a comprehensive multi-energy coordination evaluation index system for the regional energy Internet was put forward covering four aspects: safe operation, coordination and optimization, economic benefits, and social benefits, and a comprehensive evaluation model was established. This model combines the optimal combination weighting method based on moment estimation with the TOPSIS evaluation analysis method, so both the subjective and objective weights of the indexes are considered and coordinated evaluation of multiple energy sources is realized. Finally, the completeness of the index system and the validity of the evaluation method are verified by a case analysis.
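The TOPSIS step of such an evaluation model can be sketched as follows. The weights are taken as given here, whereas the paper derives them with a combination weighting method based on moment estimation; the function name and example data are illustrative only.

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives (rows) against weighted criteria (columns);
    benefit[j] is True where larger values are better."""
    m, n = len(matrix), len(weights)
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # closeness coefficient: 1.0 = ideal alternative, 0.0 = worst
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]
```

An alternative that dominates on every criterion scores 1.0; one dominated on every criterion scores 0.0.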

  6. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  7. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly, and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed, and processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive, or standard, tasks have been developed both for manual (crew) processing and for automated machine processing.

  8. Optimization and Optimal Control

    CERN Document Server

    Chinchuluun, Altannar; Enkhbat, Rentsen; Tseveendorj, Ider

    2010-01-01

    During the last four decades there has been a remarkable development in optimization and optimal control. Due to its wide variety of applications, many scientists and researchers have paid attention to fields of optimization and optimal control. A huge number of new theoretical, algorithmic, and computational results have been observed in the last few years. This book gives the latest advances, and due to the rapid development of these fields, there are no other recent publications on the same topics. Key features: Provides a collection of selected contributions giving a state-of-the-art accou

  9. Stepwise multi-criteria optimization for robotic radiosurgery

    International Nuclear Information System (INIS)

    Schlaefer, A.; Schweikard, A.

    2008-01-01

Achieving good conformality and a steep dose gradient around the target volume remains a key aspect of radiosurgery. Clearly, this involves a trade-off between target coverage, conformality of the dose distribution, and sparing of critical structures. Yet image guidance and robotic beam placement have extended highly conformal dose delivery to extracranial and moving targets. The multi-criteria nature of the optimization problem therefore becomes even more apparent, as multiple conflicting clinical goals need to be considered jointly to obtain an optimal treatment plan. Typically, planning for robotic radiosurgery is based on constrained optimization, namely linear programming. An extension of that approach is presented, such that each of the clinical goals can be addressed separately and in any sequential order. For a set of common clinical goals, the mapping to a mathematical objective and a corresponding constraint is defined. The trade-off among the clinical goals is explored by modifying the constraints and optimizing a simple objective, while retaining feasibility of the solution. Moreover, it becomes immediately obvious whether a desired goal can be achieved and where a trade-off is possible. No importance factors or predefined prioritizations of clinical goals are necessary. The presented framework forms the basis for interactive and automated planning procedures. It is demonstrated for a sample case that the linear programming formulation is suitable for finding a clinically optimal treatment, and that the optimization steps can be performed quickly enough to establish that a Pareto-efficient solution has been found. Furthermore, it is demonstrated how the stepwise approach is preferable to modifying importance factors.
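The sequential handling of prioritized clinical goals can be pictured with a toy lexicographic selection over an enumerated candidate set. The paper itself works with a linear programming formulation, so this sketch (with invented names and data) only mirrors the stepwise idea of optimizing one goal at a time and freezing the achieved value as a constraint, not the actual solver.

```python
def stepwise_optimize(candidates, goals, tol=1e-9):
    """Lexicographic goal handling: optimize each goal in order of
    priority, then freeze the value achieved as a constraint for the
    next step.  `goals` is a list of (score_fn, maximize) pairs, most
    important first; `candidates` is a pre-enumerated feasible set."""
    feasible = list(candidates)
    for score, maximize in goals:
        values = [score(c) for c in feasible]
        best = max(values) if maximize else min(values)
        # retain only plans that achieve the best value for this goal
        feasible = [c for c in feasible if abs(score(c) - best) <= tol]
    return feasible
```

For example, with plans represented as (coverage, organ-at-risk dose) tuples, maximizing coverage first and then minimizing dose leaves only the plans that are best on both counts in that order.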

  10. Need for coordinated programs to improve global health by optimizing salt and iodine intake Necesidad de programas coordinados para mejorar la salud a escala mundial mediante la optimización de la ingesta de sal y yodo

    Directory of Open Access Journals (Sweden)

    Norm R. C. Campbell

    2012-10-01

Full Text Available High dietary salt is a major cause of increased blood pressure, the leading risk for death worldwide. The World Health Organization (WHO) has recommended that salt intake be less than 5 g/day, a goal that only a small proportion of people achieve. Iodine deficiency can cause cognitive and motor impairment and, if severe, hypothyroidism with serious mental and growth retardation. More than 2 billion people worldwide are at risk of iodine deficiency. Preventing iodine deficiency by using salt fortified with iodine is a major global public health success. Programs to reduce dietary salt are technically compatible with programs to prevent iodine deficiency through salt fortification. However, for populations to fully benefit from optimum intake of salt and iodine, the programs must be integrated. This review summarizes the scientific basis for salt reduction and iodine fortification programs, the compatibility of the programs, and the steps that need to be taken by the WHO, national governments, and nongovernmental organizations to ensure that populations fully benefit from optimal intake of salt and iodine. Specifically, expert groups must be convened to help countries implement integrated programs and context-specific case studies of successfully integrated programs; lessons learned need to be compiled and disseminated. Integrated surveillance programs will be more efficient and will enhance current efforts to optimize intake of iodine and salt. For populations to fully benefit, governments need to place a high priority on integrating these two important public health programs.

  11. Client application for automated management training system of NPP personnel

    International Nuclear Information System (INIS)

    Pribysh, P.I.; Poplavskij, I.A.; Karpej, A.L.

    2016-01-01

This paper describes the client side of an automated management training system. The system will improve the speed of organizing the training plan and its quality, reduce the time needed to collect the required documentation, and facilitate the analysis of results. (authors)

  12. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    . That has left the industry in constant pursuit of possibilities for integration of the tool within the Building Information Modelling environment so that the potential provided by the latter can be harvested and the processed can be optimized. This paper presents a solution for automated data extraction...

  13. Coordinating Group report

    International Nuclear Information System (INIS)

    1994-01-01

In December 1992, western governors and four federal agencies established a Federal Advisory Committee to Develop On-site Innovative Technologies for Environmental Restoration and Waste Management (the DOIT Committee). The purpose of the Committee is to advise the federal government on ways to improve waste cleanup technology development and the cleanup of federal sites in the West. The Committee directed in January 1993 that information be collected from a wide range of potential stakeholders and that innovative technology candidate projects be identified, organized, set in motion, and evaluated to test new partnerships, regulatory approaches, and technologies that will lead to improved site cleanup. Five working groups were organized, one to develop broad project selection and evaluation criteria and four to focus on specific contaminant problems. A Coordinating Group, composed of working group spokesmen and federal and state representatives, was set up to plan and organize the routine functioning of these working groups. The working groups were charged with defining particular contaminant problems; identifying shortcomings in technology development, stakeholder involvement, regulatory review, and commercialization which impede the resolution of these problems; and identifying candidate sites or technologies which could serve as regional innovative demonstration projects to test new approaches to overcome the shortcomings. This report from the Coordinating Group to the DOIT Committee highlights the key findings and opportunities uncovered by these fact-finding working groups. It provides a basis from which recommendations from the DOIT Committee to the federal government can be made. It also includes observations from two public roundtables, one on commercialization and another on regulatory and institutional barriers impeding technology development and cleanup

  14. Controlling computers and apparatus of the automation subsystems of the ''Del'fin'' facility

    International Nuclear Information System (INIS)

    Allin, A.P.; Belen'kii, Y.M.; Borzyak, Y.V.

    1983-01-01

    The architecture of the controlling computer assembly (CCA) of the ''Del'fin'' facility, the apparatus, and the software of the system are considered. The apparatus of the ''supply'' automation system controls the energy supply to the amplifier module on the basis of a capacitor bank with 2.5-mJ energy. The elemental base of the ''adjustment'' automation subsystem is developed. It includes the mounts for the mirrors actuated by stepper motors (SM), the drivers of the stepper motors, the optical heads with coordinate-sensitive receivers, logic blocks, and other elements. The trends in the development of CCA and of automation subsystems are considered

  15. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  16. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  17. Automated Assembly Center (AAC)

    Science.gov (United States)

    Stauffer, Robert J.

    1993-01-01

    The objectives of this project are as follows: to integrate advanced assembly and assembly support technology under a comprehensive architecture; to implement automated assembly technologies in the production of high-visibility DOD weapon systems; and to document the improved cost, quality, and lead time. This will enhance the production of DOD weapon systems by utilizing the latest commercially available technologies combined into a flexible system that will be able to readily incorporate new technologies as they emerge. Automated assembly encompasses the following areas: product data, process planning, information management policies and framework, three schema architecture, open systems communications, intelligent robots, flexible multi-ability end effectors, knowledge-based/expert systems, intelligent workstations, intelligent sensor systems, and PDES/PDDI data standards.

  18. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still require human intervention, improvements were focussed on an interactive processing section (data input and correcting operation) which necessitates a vast amount of work. As a result, human intervention was eliminated, the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved. It was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  19. Automated fingerprint identification system

    International Nuclear Information System (INIS)

    Bukhari, U.A.; Sheikh, N.M.; Khan, U.I.; Mahmood, N.; Aslam, M.

    2002-01-01

In this paper we present selected stages of an automated fingerprint identification system. The software for the system is developed employing algorithms for two-tone conversion, thinning, feature extraction, and matching. Taking FBI standards into account, it has been ensured that no details of the image are lost in the comparison process. We have deployed a general parallel thinning algorithm for specialized images such as fingerprints, modifying the original algorithm after a series of experiments and selecting the variant that gave the best results. We also propose an application-based approach for designing automated fingerprint identification systems that keeps system requirements in view. We show that by using our system, the precision and efficiency of current fingerprint matching techniques are increased. (author)
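As an illustration of the kind of general parallel thinning pass the authors started from, here is the classic Zhang–Suen algorithm on a 0/1 raster. This is a textbook sketch, not the authors' modified variant; border pixels are assumed to be background.

```python
def zhang_suen_thin(img):
    """Parallel thinning (Zhang–Suen) on a 0/1 raster given as a list of
    lists.  Mutates and returns `img`; border pixels stay background."""
    h, w = len(img), len(img[0])

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel above
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_clear = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if not img[y][x]:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)                       # non-zero neighbours
                    # number of 0->1 transitions around the circle
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:
                        c = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        c = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and c:
                        to_clear.append((y, x))
            # delete in parallel, after the whole sub-iteration is scanned
            for y, x in to_clear:
                img[y][x] = 0
                changed = True
    return img
```

The two sub-iterations delete opposite boundary pixels simultaneously, so a thick stroke erodes from both sides down to a one-pixel-wide skeleton.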

  20. Automated breeder fuel fabrication

    International Nuclear Information System (INIS)

    Goldmann, L.H.; Frederickson, J.R.

    1983-01-01

The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in: reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures

  1. [The study of medical supplies automation replenishment algorithm in hospital on medical supplies supplying chain].

    Science.gov (United States)

    Sheng, Xi

    2012-07-01

The thesis aims to study an automated replenishment algorithm for the hospital link of the medical supplies supply chain. The mathematical model and algorithm for automated replenishment of medical supplies are designed with reference to practical hospital data, on the basis of inventory theory, a greedy algorithm, and a partition algorithm. The automated replenishment algorithm is shown to calculate medical supplies distribution amounts automatically and to optimize the distribution scheme. It can be concluded that a model and algorithm grounded in inventory theory, if applied in the medical supplies circulation field, could provide theoretical and technological support for realizing automated replenishment in hospitals along the medical supplies supply chain.
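A textbook building block from the inventory theory the thesis applies is the economic order quantity. The sketch below is illustrative only: it shows the basic EOQ and reorder-point formulas and omits the greedy and partition components of the actual algorithm.

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic order quantity: the order size minimizing the sum of
    ordering and holding costs in the basic inventory model."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock=0):
    """Trigger replenishment when stock falls to the expected demand
    over the supplier lead time plus a safety stock buffer."""
    return daily_demand * lead_time_days + safety_stock
```

For instance, an item consumed 1200 units/year with a 50-unit ordering cost and a 6-unit annual holding cost has an EOQ of about 141 units.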

  2. Learning from Automation Surprises and "Going Sour" Accidents: Progress on Human-Centered Automation

    Science.gov (United States)

    Woods, David D.; Sarter, Nadine B.

    1998-01-01

Advances in technology and new levels of automation on commercial jet transports have had many effects. There have been positive effects from both an economic and a safety point of view. The technology changes on the flight deck have also had reverberating effects on many other aspects of the aviation system and on different aspects of human performance. Operational experience, research investigations, incidents, and occasionally accidents have shown that new and sometimes surprising problems have arisen as well. What are these problems with cockpit automation, and what should we learn from them? Do they represent over-automation or human error? Or is there a third possibility: that they represent coordination breakdowns between operators and the automation? Are the problems just a series of small independent glitches revealed by specific accidents or near misses? Do these glitches represent a few small areas where there are cracks to be patched in what is otherwise a record of outstanding designs and systems? Or do these problems provide us with evidence about deeper factors that we need to address if we are to maintain and improve aviation safety in a changing world? How do the reverberations of technology change on the flight deck provide insight into generic issues about developing human-centered technologies and systems (Winograd and Woods, 1997)? Based on a series of investigations of pilot interaction with cockpit automation (Sarter and Woods, 1992; 1994; 1995; 1997a; 1997b), supplemented by surveys, operational experience, and incident data from other studies (e.g., Degani et al., 1995; Eldredge et al., 1991; Tenney et al., 1995; Wiener, 1989), we too have found that the problems that surround crew interaction with automation are more than a series of individual glitches. These difficulties are symptoms that indicate deeper patterns and phenomena concerning human-machine cooperation and paths towards disaster. In addition, we find the same kinds of

  3. Automated Instrumentation System Verification.

    Science.gov (United States)

    1983-04-01

AFWL-TR-82-137. Abstract (continued): ...contain... automatic measurement should arise. II. TRADITIONAL PROCEDURES: The necessity to measure data... measurement (Ref. 8). Finally, when the necessity for automation was recognized and funds were provided, the effort described in this report was started.

  4. Cavendish Balance Automation

    Science.gov (United States)

    Thompson, Bryan

    2000-01-01

This is the final report for a project carried out to modify a manual commercial Cavendish balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf, manually operated Cavendish balance to allow automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine if the strength of gravitational fields can be reduced. A Cavendish balance was chosen because it is a fairly simple piece of equipment for measuring gravity, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) all the components necessary to hold and automate the Cavendish balance in a cryostat were designed; engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) software was written to take the data collected from the Cavendish balance and reduce it to give a value for the gravitational constant; (4) the components of the system were assembled and fitted to a cryostat, and the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling, was assembled; and (5) the system was operated for a number of periods, and data were collected and reduced to give an average value for the gravitational constant.

  5. Coordinated application of electrical auxiliaries equipment

    Energy Technology Data Exchange (ETDEWEB)

    Appiarius, J.C.; Krug, R.O.

    1977-01-01

    As the power requirements of auxiliaries continue to increase, both in total and as a percentage of generating unit capacity, coordinated application of auxiliaries system components demands more sophisticated approaches to system design. Among the techniques developed to assist system designers is a method which separates the auxiliaries system into modules for conceptual design, with analyses which consider design criteria, equipment capability, and expected operating constraints. This and other techniques, intended to optimize application of metalclad switchgear and auxiliaries transformers, can provide the designer with tools for proper selection and application of equipment which can contribute significantly to overall system performance.

  6. Automating with SIMATIC S7-400 inside TIA portal configuring, programming and testing with STEP 7 Professional

    CERN Document Server

    Berger, Hans

    2014-01-01

    This book presents a comprehensive description of the configuration of devices and network for the S7-400 components inside the engineering framework TIA Portal. You learn how to formulate and test a control program with the programming languages LAD, FBD, STL, and SCL. The book is rounded off by configuring the distributed I/O with PROFIBUS DP and PROFINET IO using SIMATIC S7-400 and data exchange via Industrial Ethernet. SIMATIC is the globally established automation system for implementing industrial controllers for machines, production plants and processes. SIMATIC S7-400 is the most powerful automation system within SIMATIC. This process controller is ideal for data-intensive tasks that are especially typical for the process industry. With superb communication capability and integrated interfaces it is optimized for larger tasks such as the coordination of entire systems. Open-loop and closed-loop control tasks are formulated with the STEP 7 Professional V11 engineering software in the field-proven progr...

  7. Automated scheduler improvements and generalizations for the Automated Planet Finder

    Science.gov (United States)

    Holden, Bradford P.; Burt, Jennifer A.; Deich, William T. S.

    2016-07-01

    The Automated Planet Finder (APF) was originally designed as a single-purpose facility to search for exoplanets. The APF, however, has become a general-use observatory that is used by astronomers the world over. We describe the improvements to our operations software that both optimize finding planets with known periods and support a much broader community of astronomers with a variety of interests and requirements. These include a variety of observing modes beyond the originally envisioned fixed target lists, such as time-dependent priorities to meet the needs of rapidly varying targets, and improved tools for simulating observing cadence for the planet-hunting teams. We discuss the underlying software for the APF, illustrating why its simplicity of use allows users to write software that focuses on scientific productivity. Because of this simplicity, we can then develop scheduling software, which is easily integrated into the APF operations suite. We test these new scheduling modes using a nightly simulator which uses historical weather and seeing data. After discussing this new simulation tool, we measure how well the methods work after a 36-month simulated campaign to follow up transiting targets. We find that the data yield of each of the tested schemes is similar. Therefore, we can focus on the best potential scientific return with little concern about the impact on the number or duration of observations.

  8. Autonomy, Automation, and Systems

    Science.gov (United States)

    Turner, Philip R.

    1987-02-01

    Aerospace industry interest in autonomy and automation, given fresh impetus by the national goal of establishing a Space Station, is becoming a major item of research and technology development. The promise of new technology arising from research in Artificial Intelligence (AI) has focused much attention on its potential in autonomy and automation. These technologies can improve performance in autonomous control functions that involve planning, scheduling, and fault diagnosis of complex systems. There are, however, many aspects of system and subsystem design in an autonomous system that impact AI applications, but do not directly involve AI technology. Development of a system control architecture, establishment of an operating system within the design, providing command and sensory data collection features appropriate to automated operation, and the use of design analysis tools to support system engineering are specific examples of major design issues. Aspects such as these must also receive attention and technology development support if we are to implement complex autonomous systems within the realistic limitations of mass, power, cost, and available flight-qualified technology that are all-important to a flight project.

  9. Synthesis Study on Transitions in Signal Infrastructure and Control Algorithms for Connected and Automated Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, H. M. Abdul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Hong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Young, Stan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperling, Joshua [National Renewable Energy Lab. (NREL), Golden, CO (United States); Beck, John [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    Documenting the existing state of practice is an initial step in developing future control infrastructure to be co-deployed for a heterogeneous mix of connected and automated vehicles and human drivers while leveraging benefits to safety, congestion, and energy. With advances in information technology and extensive deployment of connected and automated vehicle technology anticipated over the coming decades, cities globally are making efforts to plan and prepare for these transitions. CAVs not only offer opportunities to improve transportation systems through enhanced safety and efficient operations of vehicles; there are also significant needs in terms of exploring how best to leverage vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-everything (V2X) technology. Both the Connected Vehicle (CV) and Connected and Automated Vehicle (CAV) paradigms feature bi-directional connectivity and share similar applications in terms of signal control algorithms and infrastructure implementation. The discussion in our synthesis study assumes the CAV/CV context, where connectivity exists with or without automated vehicles. Our synthesis study explores the current state of signal control algorithms and infrastructure, reports the completed and newly proposed CV/CAV deployment studies regarding signal control schemes, reviews the deployment costs for CAV/AV signal infrastructure, and concludes with a discussion on opportunities, such as detector-free signal control schemes and dynamic performance management for intersections, and challenges, such as dependency on market adaptation and the need to build a fault-tolerant signal system deployment in a CAV/CV environment. The study will serve as an initial critical assessment of existing signal control infrastructure (devices, control instruments, and firmware) and control schemes (actuated, adaptive, and coordinated-green wave). Also, the report will help to identify the future needs for the signal

  10. Automation of a Versatile Crane (the LSMS) for Lunar Outpost Construction, Maintenance and Inspection

    Science.gov (United States)

    Doggett, William R.; Roithmayr, Carlos M.; Dorsey, John T.; Jones, Thomas C.; Shen, Haijun; Seywald, Hans; King, Bruce D.; Mikulas, Martin M., Jr.

    2009-01-01

    , greatly expanding the operational versatility of the LSMS. This paper develops the equations describing the forward and inverse relations between LSMS joint angles and Cartesian coordinates of the LSMS tip. These equations allow a variety of schemes to be used to optimize LSMS maneuvers. One such scheme is described in detail that eliminates undesirable swinging of the payload at the conclusion of a maneuver, even when the payload is suspended from a passive rigid link. Swinging is undesirable when performing precision maneuvers, such as aligning an object for mating or positioning a camera. Use of the equations described here enables automated control of the LSMS, greatly improving its operational versatility.
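    The forward/inverse joint-angle-to-tip mapping this record describes can be illustrated with a planar two-link sketch. Link lengths, the planar simplification, and the elbow-down branch are illustrative assumptions, not the actual LSMS geometry:

```python
import math

def forward_kinematics(theta1, theta2, l1=2.0, l2=1.5):
    """Planar two-link forward kinematics: joint angles -> tip (x, y).
    l1, l2 are hypothetical link lengths, not LSMS dimensions."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=2.0, l2=1.5):
    """Closed-form inverse mapping: tip (x, y) -> joint angles,
    taking the elbow-down solution branch."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for round-off
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

    A trajectory planner built on such a pair of mappings can command joint angles from desired Cartesian tip paths, which is the basis of the anti-swing maneuver scheme the abstract mentions.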

  11. Optimal Design of Gradient Materials and Bi-Level Optimization of Topology Using Targets (BOTT)

    Science.gov (United States)

    Garland, Anthony

    The objective of this research is to understand the fundamental relationships necessary to develop a method to optimize both the topology and the internal gradient material distribution of a single object while meeting constraints and conflicting objectives. Functionally gradient material (FGM) objects possess continuously varying material properties throughout the object, and they allow an engineer to tailor individual regions of an object to have specific mechanical properties by locally modifying the internal material composition. A variety of techniques exists for topology optimization, and several methods exist for FGM optimization, but combining the two together is difficult. Understanding the relationship between topology and material gradient optimization enables the selection of an appropriate model and the development of algorithms, which allow engineers to design high-performance parts that better meet design objectives than optimized homogeneous material objects. For this research effort, topology optimization means finding the optimal connected structure with an optimal shape. FGM optimization means finding the optimal macroscopic material properties within an object. Tailoring the material constitutive matrix as a function of position results in gradient properties. Once the target macroscopic properties are known, a mesostructure or a particular material nanostructure can be found which gives the target material properties at each macroscopic point. This research demonstrates that topology and gradient materials can both be optimized together for a single part. The algorithms use a discretized model of the domain and gradient-based optimization algorithms. In addition, when considering two conflicting objectives the algorithms in this research generate clear 'features' within a single part. This tailoring of material properties within different areas of a single part (automated design of 'features') using computational design tools is a novel benefit

  12. Developmental coordination disorder - literature review

    OpenAIRE

    Kosová, Blanka

    2015-01-01

    Title: Developmental coordination disorder - literature review. Objectives: The theoretical part discusses developmental coordination disorder, its diagnosis, investigation, etiology, and classification across the different ages of the child. The practical part compares studies relating to atypical development of motor skills in children with developmental coordination disorder. The abbreviations CKP and DCD are both used in the thesis, depending on the source from which the material was drawn. DCD is the E...

  13. Synchronization matters for motor coordination.

    Science.gov (United States)

    Pesce Ibarra, Luigi S

    2018-03-01

    Using electroencephalography and electromyography recordings from healthy participants during a vision-dependent bimanual coordination task, de Vries and colleagues showed that functional synchronization is important in motor coordination. The authors reported that higher coordination correlated positively with intermuscular synchrony, but correlated negatively with corticomuscular synchrony. They proposed that these two diverse motor systems operate differently depending on task demands. Similar experimental paradigms could identify motor mechanisms in patients with neurological disorders to design novel rehabilitation strategies.

  14. Pull Promotions and Channel Coordination

    OpenAIRE

    Eitan Gerstner; James D. Hess

    1995-01-01

    This paper recommends that manufacturers consider a pull price promotion as a coordination device in an independent channel of distribution. Uncoordinated decisions of both manufacturer and retailer to charge high prices can break down the effort to expand the market, resulting in losses to the channel as a whole. We show that manufacturers can enhance channel price coordination by designing pull price discounts that target price-conscious consumers. The increased price coordination improves ...

  15. Coordination Analysis Using Global Structural Constraints and Alignment-based Local Features

    Science.gov (United States)

    Hara, Kazuo; Shimbo, Masashi; Matsumoto, Yuji

    We propose a hybrid approach to coordinate structure analysis that combines a simple grammar to ensure consistent global structure of coordinations in a sentence, and features based on sequence alignment to capture local symmetry of conjuncts. The weight of the alignment-based features, which in turn determines the score of coordinate structures, is optimized by perceptron training on a given corpus. A bottom-up chart parsing algorithm efficiently finds the best-scoring structure, taking both nested and non-overlapping flat coordinations into account. We demonstrate that our approach outperforms existing parsers in coordination scope detection on the Genia corpus.
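    The perceptron training step described here can be sketched as a standard structured-perceptron update. The dict-of-features representation and feature names below are illustrative; the actual system scores coordinate structures with alignment-based features inside a chart parser:

```python
def score(weights, feats):
    """Linear score of a candidate structure under the current weights."""
    return sum(weights.get(f, 0.0) * v for f, v in feats.items())

def perceptron_update(weights, gold_feats, pred_feats, lr=1.0):
    """One structured-perceptron step: when the best-scoring structure
    differs from the gold one, move weights toward the gold features
    and away from the predicted features."""
    if gold_feats != pred_feats:
        for f, v in gold_feats.items():
            weights[f] = weights.get(f, 0.0) + lr * v
        for f, v in pred_feats.items():
            weights[f] = weights.get(f, 0.0) - lr * v
    return weights
```

    Iterating this update over a corpus drives the parser to rank gold coordination scopes above competing ones.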

  16. FAST AUTOMATED DECOUPLING AT RHIC

    International Nuclear Information System (INIS)

    BEEBE-WANG, J.J.

    2005-01-01

    Coupling correction is essential for the operational performance of RHIC. The independence of the transverse degrees of freedom makes diagnostics and tune control easier, and it is advantageous to operate an accelerator close to the coupling resonance to minimize nearby nonlinear sidebands. An automated coupling correction application, iDQmini, has been developed for RHIC routine operations. The application decouples RHIC globally by minimizing the tune separation through finding the optimal settings of two orthogonal skew quadrupole families. The program iDQmini provides options of automatic, semi-automatic and manual decoupling operations. It accesses tune information from all RHIC tune measurement systems: the PLL (phase lock loop), the high frequency Schottky system and the tune meter. It also supplies tune and skew quadrupole scans, finds the minimum tune separation, displays the real-time results, and interfaces with the RHIC control system. We summarize the capabilities of the coupling correction application iDQmini, and discuss the operational protections incorporated in the program.
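    The core decoupling idea, scanning two skew-quadrupole family settings and keeping the pair that minimizes the measured tune separation, can be sketched as follows. The quadratic response surface is a hypothetical stand-in for the real machine measurement, not RHIC optics:

```python
import itertools

def tune_separation(k1, k2):
    """Toy model of measured tune split vs. two skew-quad settings.
    Hypothetical surface with its minimum at k1=0.3, k2=-0.2."""
    return (0.0001 + (k1 - 0.3) ** 2 + (k2 + 0.2) ** 2) ** 0.5

def decouple(measure, k_range, steps=41):
    """Grid-scan both skew-quad families over k_range and return the
    setting pair minimizing the measured tune separation."""
    lo, hi = k_range
    grid = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    return min(itertools.product(grid, grid),
               key=lambda ks: measure(*ks))
```

    A production application like iDQmini would of course iterate with live tune measurements and protect against stepping the magnets outside safe limits.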

  17. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    will respond to this form of automation for CPP. (4) Evaluate what type of DR shifting and shedding strategies can be automated. (5) Explore how automation of control strategies can increase participation rates and DR saving levels with CPP. (6) Identify optimal demand response control strategies. (7) Determine occupant and tenant response.

  18. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade-induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005-2010. In particular, we establish a causal effect where firms that have specialized in product types for which the Chinese exports to the world market have risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation.

  19. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has long been believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior- and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  20. A fully-automated software pipeline for integrating breast density and parenchymal texture analysis for digital mammograms: parameter optimization in a case-control breast cancer risk assessment study

    Science.gov (United States)

    Zheng, Yuanjie; Wang, Yan; Keller, Brad M.; Conant, Emily; Gee, James C.; Kontos, Despina

    2013-02-01

    Estimating a woman's risk of breast cancer is becoming increasingly important in clinical practice. Mammographic density, estimated as the percent of dense (PD) tissue area within the breast, has been shown to be a strong risk factor. Studies also support a relationship between mammographic texture and breast cancer risk. We have developed a fully-automated software pipeline for computerized analysis of digital mammography parenchymal patterns by quantitatively measuring both breast density and texture properties. Our pipeline combines advanced computer algorithms of pattern recognition, computer vision, and machine learning and offers a standardized tool for breast cancer risk assessment studies. Different from many existing methods performing parenchymal texture analysis within specific breast subregions, our pipeline extracts texture descriptors for points on a spatial regular lattice and from a surrounding window of each lattice point, to characterize the local mammographic appearance throughout the whole breast. To demonstrate the utility of our pipeline, and optimize its parameters, we perform a case-control study by retrospectively analyzing a total of 472 digital mammography studies. Specifically, we investigate the window size, which is a lattice-related parameter, and compare the performance of texture features to that of breast PD in classifying case-control status. Our results suggest that different window sizes may be optimal for raw (12.7 mm2) versus vendor post-processed images (6.3 mm2). We also show that the combination of PD and texture features outperforms PD alone. The improvement is significant (p=0.03) when raw images and a window size of 12.7 mm2 are used, having an ROC AUC of 0.66. The combination of PD and our texture features computed from post-processed images with a window size of 6.3 mm2 achieves an ROC AUC of 0.75.
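    The ROC AUC figure used to compare these classifiers is equivalent to the Mann-Whitney statistic: the fraction of case/control pairs that the risk score ranks correctly. A minimal sketch (the score lists are placeholders, not the study's data):

```python
def roc_auc(case_scores, control_scores):
    """ROC AUC via pairwise ranking: count case/control pairs where the
    case scores higher; ties contribute one half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))
```

    An AUC of 0.5 means the score ranks cases no better than chance; 1.0 means perfect separation, which puts the reported 0.66 and 0.75 values in context.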

  1. Flexoelectricity via coordinate transformations

    Science.gov (United States)

    Stengel, Massimiliano

    2014-03-01

    Flexoelectricity describes the electric polarization that is linearly induced by a strain gradient, and is being intensely investigated as a tantalizing new route to converting mechanical stimulation into electrical signals and vice versa. While several breakthrough experiments have been reported in the past few years, progress on the theoretical front has been comparatively slow, especially in the context of first-principles electronic-structure theory. The main difficulty with calculating the flexoelectric response of a material is the inherent breakdown of translational periodicity that a strain gradient entails, which at first sight questions the very applicability of traditional plane-wave pseudopotential methods. In this talk I will show how these obstacles can be overcome by combining density-functional perturbation theory with generalized coordinate transformations, gaining access to the full microscopic response (in terms of electronic charge density, polarization and atomic displacements) of a crystal or nanostructure to an arbitrary deformation field. As a practical demonstration, I will present results on the full flexoelectric response of a SrTiO3 film, including atomic relaxations and surface effects.

  2. Automated Robotic Liquid Handling Assembly of Modular DNA Devices.

    Science.gov (United States)

    Ortiz, Luis; Pavan, Marilene; McCarthy, Lloyd; Timmons, Joshua; Densmore, Douglas M

    2017-12-01

    Recent advances in modular DNA assembly techniques have enabled synthetic biologists to test significantly more of the available "design space" represented by "devices" created as combinations of individual genetic components. However, manual assembly of such large numbers of devices is time-intensive, error-prone, and costly. The increasing sophistication and scale of synthetic biology research necessitates an efficient, reproducible way to accommodate large-scale, complex, and high-throughput device construction. Here, a DNA assembly protocol using the Type-IIS restriction endonuclease based Modular Cloning (MoClo) technique is automated on two liquid-handling robotic platforms. Automated liquid-handling robots require careful, often times tedious optimization of pipetting parameters for liquids of different viscosities (e.g. enzymes, DNA, water, buffers), as well as explicit programming to ensure correct aspiration and dispensing of DNA parts and reagents. This makes manual script writing for complex assemblies just as problematic as manual DNA assembly, and necessitates a software tool that can automate script generation. To this end, we have developed a web-based software tool, http://mocloassembly.com, for generating combinatorial DNA device libraries from basic DNA parts uploaded as Genbank files. We provide access to the tool, and an export file from our liquid handler software which includes optimized liquid classes, labware parameters, and deck layout. All DNA parts used are available through Addgene, and their digital maps can be accessed via the Boston University BDC ICE Registry. Together, these elements provide a foundation for other organizations to automate modular cloning experiments and similar protocols. The automated DNA assembly workflow presented here enables the repeatable, automated, high-throughput production of DNA devices, and reduces the risk of human error arising from repetitive manual pipetting. Sequencing data show the automated DNA
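    The combinatorial library generation at the heart of this workflow amounts to taking a Cartesian product over positional part bins. A minimal sketch, with hypothetical part names standing in for real registry entries:

```python
import itertools

def enumerate_devices(part_bins):
    """Expand positional part bins into every combinatorial device,
    choosing one part per bin (MoClo-style position order)."""
    return [tuple(combo) for combo in itertools.product(*part_bins)]

# Hypothetical 4-position library: 2 promoters x 1 RBS x 2 CDSs x 1 terminator
bins = [("P1", "P2"), ("R1",), ("GFP", "RFP"), ("T1",)]
devices = enumerate_devices(bins)  # 2 * 1 * 2 * 1 = 4 candidate devices
```

    A script-generation tool like the one described would then translate each device tuple into aspirate/dispense steps for the liquid handler.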

  3. Drug policy coordination: identifying and assessing dimensions of coordination.

    Science.gov (United States)

    Hughes, Caitlin Elizabeth; Ritter, Alison; Mabbitt, Nicholas

    2013-05-01

    Coordination has been recognised as a critical ingredient for successful drug policy governance. Yet what coordination means and how we assess the processes, outputs and outcomes of drug policy coordination is seldom defined. In this article we explore the utility of internationally recognised principles of good governance for examining aspects of drug policy coordination. We describe the development of an assessment tool, and pilot it in one context that has been both praised and criticised for its approach to drug policy coordination: Australia. Eight good governance principles of the United Nations Economic and Social Commission for Asia and the Pacific (which specify the need for policy processes to be participatory, responsive, equitable, etc.) were adapted to drug policy coordination. A pilot survey was created to enable assessment of their perceived importance and perceived application and administered to 36 stakeholders from peak Australian advisory bodies. The instrument was shown to have high internal reliability and high face validity. Application to the Australian context suggested that the eight principles differed in importance, and that the most important principles were 'accountability' and 'participation'. Application also revealed perceived strengths and weaknesses in coordination, most notably, an apparent need to increase 'accountability' for stakeholder actions. The instrument requires further assessment of reliability and validity. Yet, at least within the Australian context, it starts to unpack normative statements about coordination to show perceptions of what coordination is, areas where improvement may be warranted and the degree of contestation amongst different players. Further application of the good governance lens within this and other contexts will progress the assessment of a fundamental yet neglected policy process and foster a more nuanced consideration of the possibilities for coordination in the drug policy "soup".

  4. An Approach to Office Automation

    OpenAIRE

    Ischenko, A.N.; Tumeo, M.A.

    1983-01-01

    In recent years, the increasing scale of production and degree of specialization within firms has led to a significant growth in the amount of information needed for their successful management. As a result, the use of computer systems (office automation) has become increasingly common. However, no manuals or set automation procedures exist to help organizations design and implement an efficient and effective office automation system. The goals of this paper are to outline some important...

  5. Embedded system for building automation

    OpenAIRE

    Rolih, Andrej

    2014-01-01

    Home automation is a fast-developing field of computer science and electronics. Companies are offering many different products for home automation, ranging from complete systems for building management and control to simple smart lights that can be connected to the internet. These products offer the user greater living comfort and lower expenses by reducing energy usage. This thesis shows the development of a simple home automation system that focuses mainly on the enhance...

  6. Action coordination and resource allocation against user profiles

    DEFF Research Database (Denmark)

    Valente, Pedro Ricardo da Nova

    on context-aware information extraction, and the second system, evolving from the first, relating end-users' social interaction with resource optimization (coordination). Lastly, various problems are discussed from the perspective of the Resident and the Care Staff, as the principal service provider, inside

  7. Carbene→N Coordination Bonds in Drugs: A Quantum Chemical ...

    Indian Academy of Sciences (India)

    Abstract. Coordination chemistry of bonds between main group elements and electron donating ligands as ... electronic nature of the compounds so as to establish ... a transition state. Gibbs free energy was calculated for all the optimized structures. Natural Bond Orbital (NBO) analysis was also carried out to estimate.

  8. A Care Coordination Program for Substance-Exposed Newborns

    Science.gov (United States)

    Twomey, Jean E.; Caldwell, Donna; Soave, Rosemary; Fontaine, Lynne Andreozzi; Lester, Barry M.

    2011-01-01

    The Vulnerable Infants Program of Rhode Island (VIP-RI) was established as a care coordination program to promote permanency for substance-exposed newborns in the child welfare system. Goals of VIP-RI were to optimize parents' opportunities for reunification and increase the efficacy of social service systems involved with families affected by…

  9. COORDINATION OF THE WORK OF BUSES IN CITY ROUTES

    Directory of Open Access Journals (Sweden)

    Fuad DASHDAMIROV

    2013-12-01

    Full Text Available The paper studied the work of bus routes passing through a shared street. An optimality criterion was chosen for developing appropriate models of effective bus operation on the section. The paper proposes a new model for estimating the time costs incurred by passengers at bus stops. A technique was developed to coordinate the buses running on the combined section of the route.

  10. MVMO-based approach for optimal placement and tuning of ...

    African Journals Online (AJOL)

    DR OKE

    Variance Mapping Optimization (MVMO-S) to solve the multi-scenario formulation of the optimal placement and coordinated tuning of power system supplementary damping controllers (POCDCs). The effectiveness of the approach is evaluated ...

  11. TimeNET Optimization Environment

    Directory of Open Access Journals (Sweden)

    Christoph Bodenstein

    2015-12-01

    Full Text Available In this paper a novel tool for simulation-based optimization and design-space exploration of Stochastic Colored Petri nets (SCPN is introduced. The working title of this tool is TimeNET Optimization Environment (TOE. Targeted users of this tool are people modeling complex systems with SCPNs in TimeNET who want to find parameter sets that are optimal for a certain performance measure (fitness function). It allows users to create and simulate sets of SCPNs and to run different optimization algorithms based on parameter variation. The development of this tool was motivated by the need to automate and speed up tests of heuristic optimization algorithms to be applied for SCPN optimization. A result caching mechanism is used to avoid recalculations.
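    The result-caching mechanism mentioned at the end of this record is, in essence, memoization of an expensive simulation call keyed by the parameter set. A minimal sketch of the idea (function and key names are illustrative, not the TOE API):

```python
def cached_fitness(simulate):
    """Wrap an expensive simulation so that repeated parameter sets are
    answered from a cache instead of being re-simulated."""
    cache = {}

    def fitness(params):
        # Dict parameters are normalized into a hashable, order-free key.
        key = tuple(sorted(params.items()))
        if key not in cache:
            cache[key] = simulate(params)
        return cache[key]

    fitness.cache = cache  # exposed for inspection/clearing
    return fitness
```

    Optimization heuristics that revisit candidate points (e.g. population-based searches) benefit most, since every revisit costs a dictionary lookup rather than a full simulation run.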

  12. Health Care Reform, Care Coordination, and Transformational Leadership.

    Science.gov (United States)

    Steaban, Robin Lea

    2016-01-01

    This article is meant to spur debate on the role of the professional nurse in care coordination, as well as the role of nursing leaders in defining and leading toward a future state. This work highlights the opportunity and benefits associated with transformation of professional nursing practice in response to the mandates of the Affordable Care Act of 2010. An understanding of core concepts and the work of care coordination are used to propose a model of care coordination based on the population health pyramid. This maximizes the roles of nurses across the continuum as transformational leaders in the patient/family and nursing relationship. The author explores the role of the nurse in a transactional versus transformational relationship with patients, leading to actualization of the nurse in care coordination. Focusing on the role of the nurse leader, the challenges and necessary actions for optimization of the professional nurse role are explored, using principles of transformational leadership.

  13. Embedded Systems Design: Optimization Challenges

    DEFF Research Database (Denmark)

    Pop, Paul

    2005-01-01

    -to-market, and reduce development and manufacturing costs. In this paper, the author introduces several embedded systems design problems, and shows how they can be formulated as optimization problems. Solving such challenging design optimization problems is the key to the success of the embedded systems design... of designing such systems is becoming increasingly important and difficult at the same time. New automated design optimization techniques are needed, which are able to: successfully manage the complexity of embedded systems, meet the constraints imposed by the application domain, shorten the time...

  14. Directional Overcurrent Relays Coordination Problems in Distributed Generation Systems

    Directory of Open Access Journals (Sweden)

    Jakub Ehrenberger

    2017-09-01

    Full Text Available This paper proposes a new approach to distributed generation system protection coordination based on directional overcurrent protections with inverse-time characteristics. The key question of protection coordination is the determination of correct values of all inverse-time characteristic coefficients. The coefficients must be correctly chosen considering the sufficiently short tripping times and the sufficiently long selectivity times. In the paper a new approach to protection coordination is designed, in which not only some, but all the required types of short-circuit contributions are taken into account. In radial systems, if the pickup currents are correctly chosen, protection coordination for maximum contributions is enough to ensure selectivity times for all the required short-circuit types. In distributed generation systems, due to different contributions flowing through the primary and selective protections, coordination for maximum contributions is not enough; all the short-circuit types must be taken into account, and the protection coordination becomes a complex problem. A possible solution to the problem, based on an appropriately designed optimization, has been proposed in the paper. By repeating a simple optimization considering only one short-circuit type, protection coordination considering all the required short-circuit types has been achieved. To show the importance of considering all the types of short-circuit contributions, setting optimizations with one (the highest) and all the types of short-circuit contributions have been performed. Finally, selectivity time values are explored throughout the entire protected section, and both settings are compared.
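    The coordination constraint this record optimizes can be illustrated with the IEC standard-inverse characteristic, t = TMS * A / ((I/Ip)^B - 1) with A = 0.14 and B = 0.02. The pickup currents, TMS values and the 0.3 s margin below are illustrative, not the paper's settings:

```python
def trip_time(tms, i_fault, i_pickup, a=0.14, b=0.02):
    """IEC standard-inverse relay characteristic:
    t = TMS * A / ((I/Ip)^B - 1), for I above pickup."""
    return tms * a / ((i_fault / i_pickup) ** b - 1.0)

def is_selective(primary, backup, i_fault, cti=0.3):
    """Check one coordination pair: the backup relay (tms, pickup) must
    trip at least the coordination time interval (CTI) after the
    primary for this fault-current contribution."""
    t_primary = trip_time(primary[0], i_fault, primary[1])
    t_backup = trip_time(backup[0], i_fault, backup[1])
    return t_backup - t_primary >= cti
```

    The paper's point is that in a distributed generation system this check must hold for every short-circuit type's contribution through each relay pair, not just the maximum one, which is what turns coordination into an optimization over all contributions.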

  15. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems

  16. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  17. Multi-net optimization of VLSI interconnect

    CERN Document Server

    Moiseev, Konstantin; Wimer, Shmuel

    2015-01-01

    This book covers layout design and layout migration methodologies for optimizing multi-net wire structures in advanced VLSI interconnects. Scaling-dependent models for interconnect power, interconnect delay and crosstalk noise are covered in depth, and several design optimization problems are addressed, such as minimization of interconnect power under delay constraints, or design for minimal delay in wire bundles within a given routing area. A handy reference or a guide for design methodologies and layout automation techniques, this book provides a foundation for physical design challenges of interconnect in advanced integrated circuits.  • Describes the evolution of interconnect scaling and provides new techniques for layout migration and optimization, focusing on multi-net optimization; • Presents research results that provide a level of design optimization which does not exist in commercially-available design automation software tools; • Includes mathematical properties and conditions for optimal...

  18. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
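    The classical permanent-random-number device that this abstract builds on can be sketched in a few lines. The population size, inclusion probabilities, and seed below are invented for illustration; the abstract's actual contribution (coordinating fixed-size Conditional Poisson samples) is more involved than this.

    ```python
    import random

    # Sketch of positive sample coordination with permanent random numbers
    # (PRNs): each unit keeps one random number across sampling occasions,
    # and a Poisson sample includes the unit iff its PRN falls below its
    # inclusion probability on that occasion.

    random.seed(42)
    N = 1000
    prn = [random.random() for _ in range(N)]   # one permanent number per unit

    def poisson_sample(probs):
        return {i for i in range(N) if prn[i] < probs[i]}

    p1 = [0.10] * N                             # occasion 1: pi_i = 0.10
    p2 = [0.12] * N                             # occasion 2: pi_i = 0.12

    s1, s2 = poisson_sample(p1), poisson_sample(p2)
    # With shared PRNs the overlap is maximal: s1 is a subset of s2
    # whenever p1[i] <= p2[i] for every unit.
    print(len(s1), len(s2), len(s1 & s2))
    ```

    The downside mentioned in the abstract is also visible here: `len(s1)` is random rather than fixed, which is what Conditional Poisson sampling corrects.
    
    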

  19. Coordinate-Free Rotation Operator.

    Science.gov (United States)

    Leubner, C.

    1979-01-01

    Suggests the use of a coordinate-free rotation operator for the teaching of rotations in Euclidean three space because of its twofold didactic advantage. Illustrates the potentialities of the coordinate-free rotation operator approach by a number of examples. (Author/GA)

  20. Coordinated Transportation: Problems and Promise?

    Science.gov (United States)

    Fickes, Michael

    1998-01-01

    Examines the legal, administrative, and logistical barriers that have prevented the wide acceptance of coordinating community and school transportation services and why these barriers may be breaking down. Two examples of successful implementation of coordinated transportation are examined: employing a single system to serve all transportation…

  1. Introduction to coordinated linear systems

    NARCIS (Netherlands)

    Kempker, P.L.

    2014-01-01

    This chapter serves as an introduction to the concepts of coordinated linear systems, in formal as well as intuitive terms. The concept of a coordinated linear system is introduced and formulated, and some basic properties are derived, providing both a motivation and a formal basis for the following

  2. Transformation optics in orthogonal coordinates

    OpenAIRE

    Chen, Huanyang

    2008-01-01

    The author proposes the methodology of transformation optics in orthogonal coordinates to obtain the material parameters of the transformation media from the mapping in orthogonal coordinates. Several examples are given to show the applications of such a methodology by using the full-wave simulations.

  3. Bare coordination: the semantic shift

    NARCIS (Netherlands)

    de Swart, Henriette|info:eu-repo/dai/nl/074764187; Le Bruyn, Bert|info:eu-repo/dai/nl/30484912X

    2014-01-01

    This paper develops an analysis of the syntax-semantics interface of two types of split coordination structures. In the first type, two bare singular count nouns appear as arguments in a coordinated structure, as in bride and groom were happy. We call this the N&N construction. In the second type,

  4. Researcher perspectives on competencies of return-to-work coordinators.

    Science.gov (United States)

    Gardner, Bethany T; Pransky, Glenn; Shaw, William S; Hong, Qua Nha; Loisel, Patrick

    2010-01-01

    Return-to-work (RTW) coordination programs are successful in reducing long-term work disability, but research reports have not adequately described the role and competencies of the RTW coordinator. This study was conducted to clarify the impact of RTW coordinators, and competencies (knowledge, skills, and attitudes) required to achieve optimal RTW outcomes in injured workers. Studies involving RTW coordination for injured workers were identified through literature review. Semi-structured interviews were conducted with 12 principal investigators to obtain detailed information about the RTW coordinator role and competencies not included in published articles. Interview results were synthesized into principal conceptual groups by affinity mapping. All investigators strongly endorsed the role of RTW coordinator as key to the program's success. Affinity mapping identified 10 groups of essential competencies: (1) individual traits/qualities, (2) relevant knowledge base, (3) RTW focus and attitude, (4) organizational/administrative skills, (5) assessment skills, (6) communication skills, (7) interpersonal relationship skills, (8) conflict resolution skills, (9) problem-solving skills, and (10) RTW facilitation skills. Specific consensus competencies were identified within each affinity group. Most investigators endorsed similar competencies, although there was some variation by setting or scope of RTW intervention. RTW coordinators are essential contributors in RTW facilitation programs. This study identified specific competencies required to achieve success. More emphasis on mentorship and observation will be required to develop and evaluate necessary skills in this area.

  5. Fossil power plant automation

    International Nuclear Information System (INIS)

    Divakaruni, S.M.; Touchton, G.

    1991-01-01

    This paper elaborates on issues facing the utilities industry and seeks to address how new computer-based control and automation technologies resulting from recent microprocessor evolution, can improve fossil plant operations and maintenance. This in turn can assist utilities to emerge stronger from the challenges ahead. Many presentations at the first ISA/EPRI co-sponsored conference are targeted towards improving the use of computer and control systems in the fossil and nuclear power plants and we believe this to be the right forum to share our ideas

  6. Increased coordination in public transport – which mechanisms are available?

    DEFF Research Database (Denmark)

    Sørensen, Claus Hedegaard; Longva, Frode

    2011-01-01

    and suggested to increase coordination between core stakeholders within passenger railway services and bus services. Four distinctive mechanisms of coordination are suggested, namely organisational coordination, contractual coordination, partnership coordination and discursive coordination. Each coordination...

  7. Archimedes' principle in general coordinates

    International Nuclear Information System (INIS)

    Ridgely, Charles T

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is applied in Schwarzschild coordinates and in rotating coordinates. Using Schwarzschild coordinates for the case of a spherical mass suspended within a perfect fluid leads to the familiar expression of Archimedes' principle. Using rotating coordinates produces an expression for a centrifugal buoyancy force that agrees with accepted theory. It is then argued that Archimedes' principle ought to be applicable to non-gravitational phenomena, as well. Conservation of the energy-momentum tensor is then applied to electromagnetic phenomena. It is shown that a charged body submerged in a charged medium experiences a buoyancy force in accordance with an electromagnetic analogue of Archimedes' principle.
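    The familiar flat-space limit that the abstract says the Schwarzschild calculation reproduces can be stated compactly (the notation here is mine, not the paper's):

    ```latex
    % Buoyancy as a surface integral of pressure over the submerged body,
    % reduced by the divergence theorem to the weight of displaced fluid,
    % with z measured upward:
    \mathbf{F}_b = -\oint_S p\, d\mathbf{A}
                 = -\int_V \nabla p\, dV
                 = \rho_{\text{fluid}}\, V g\, \hat{\mathbf{z}},
    \qquad \nabla p = -\rho_{\text{fluid}}\, g\, \hat{\mathbf{z}} .
    ```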

  8. Archimedes' principle in general coordinates

    Science.gov (United States)

    Ridgely, Charles T.

    2010-05-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is applied in Schwarzschild coordinates and in rotating coordinates. Using Schwarzschild coordinates for the case of a spherical mass suspended within a perfect fluid leads to the familiar expression of Archimedes' principle. Using rotating coordinates produces an expression for a centrifugal buoyancy force that agrees with accepted theory. It is then argued that Archimedes' principle ought to be applicable to non-gravitational phenomena, as well. Conservation of the energy-momentum tensor is then applied to electromagnetic phenomena. It is shown that a charged body submerged in a charged medium experiences a buoyancy force in accordance with an electromagnetic analogue of Archimedes' principle.

  9. Expected Improvements in Work Truck Efficiency Through Connectivity and Automation

    Energy Technology Data Exchange (ETDEWEB)

    Walkowicz, Kevin A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-03-12

    This presentation focuses on the potential impact of connected and automated technologies on commercial vehicle operations. It includes topics such as the U.S. Department of Energy's Energy Efficient Mobility Systems (EEMS) program and the Systems and Modeling for Accelerated Research in Transportation (SMART) Mobility Initiative. It also describes National Renewable Energy Laboratory (NREL) research findings pertaining to the potential energy impacts of connectivity and automation and stresses the need for integration and optimization to take advantage of the benefits offered by these transformative technologies while mitigating the potential negative consequences.

  10. 23rd International Conference on Flexible Automation & Intelligent Manufacturing

    CERN Document Server

    2013-01-01

    The proceedings includes the set of revised papers from the 23rd International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2013). This conference aims to provide an international forum for the exchange of leading edge scientific knowledge and industrial experience regarding the development and integration of the various aspects of Flexible Automation and Intelligent Manufacturing Systems covering the complete life-cycle of a company’s Products and Processes. Contents will include topics such as: Product, Process and Factory Integrated Design, Manufacturing Technology and Intelligent Systems, Manufacturing Operations Management and Optimization and Manufacturing Networks and MicroFactories.

  11. Numerical construction of the p(fold) (committor) reaction coordinate for a Markov process.

    Science.gov (United States)

    Krivov, Sergei V

    2011-10-06

    To simplify the description of a complex multidimensional dynamical process, one often projects it onto a single reaction coordinate. In protein folding studies, the folding probability p(fold) is an optimal reaction coordinate which preserves many important properties of the dynamics. The construction of the coordinate is difficult. Here, an efficient numerical approach to construct the p(fold) reaction coordinate for a Markov process (satisfying the detailed balance) is described. The coordinate is obtained by optimizing parameters of a chosen functional form to make a generalized cut-based free energy profile the highest. The approach is illustrated by constructing the p(fold) reaction coordinate for the equilibrium folding simulation of FIP35 protein reported by Shaw et al. (Science 2010, 330, 341-346). © 2011 American Chemical Society
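    For a discrete Markov chain the committor (p(fold)) can be computed directly from its boundary-value equations, which gives a feel for the object the abstract optimizes. The 5-state random walk below is a toy stand-in of my own, not the paper's protein-folding data or its cut-based profile method.

    ```python
    # Sketch: the committor q of a small Markov chain, obtained by fixed-point
    # iteration on q_i = sum_j P[i][j] * q_j with boundary values q(A) = 0 and
    # q(B) = 1. States 0 and 4 are absorbing ("unfolded" / "folded").

    P = [
        [1.0, 0.0, 0.0, 0.0, 0.0],
        [0.5, 0.0, 0.5, 0.0, 0.0],
        [0.0, 0.5, 0.0, 0.5, 0.0],
        [0.0, 0.0, 0.5, 0.0, 0.5],
        [0.0, 0.0, 0.0, 0.0, 1.0],
    ]
    A, B = {0}, {4}

    def committor(P, A, B, sweeps=10_000):
        n = len(P)
        q = [0.0 if i in A else 1.0 if i in B else 0.5 for i in range(n)]
        for _ in range(sweeps):                 # Gauss-Seidel sweeps
            for i in range(n):
                if i not in A and i not in B:
                    q[i] = sum(P[i][j] * q[j] for j in range(n))
        return q

    q = committor(P, A, B)
    # For this symmetric walk the exact committor is q_i = i/4.
    print(q)
    ```

    On a chain this small a direct linear solve would do equally well; the iteration is shown only because it mirrors the "optimize a functional form" flavor of the abstract's approach.
    
    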

  12. Enterprise Coordination on the Internet

    Directory of Open Access Journals (Sweden)

    Charles Petrie

    2011-02-01

    Full Text Available Enterprises are now connected internally and externally to other Enterprises via the Internet in ways that are increasingly difficult to manage, especially as these interconnections become more dynamic. Current methods of coordinating the effects of change as they propagate through these networks of connections are not likely to scale. What is needed is a new paradigm for how the Internet supports such coordination. Indeed, the Internet should and could provide fundamental coordination functions that are missing today. In this paper, we describe how such a “Coordinated Internet” would work (this paper is an expanded version of [1]). The key functionality of a Coordinated Internet would be that the Internet actively watches what people do (analogous to search completion on desktops today), correlates these activities, and actively notifies people when and how their current tasks affect and are affected by the activities of other people. This would be accomplished by standard coordination functions implemented as a common Internet layer that can be used as a utility by more specialized applications. Such a Coordinated Internet would revolutionize enterprise management, for all enterprises, large and small, corporate and personal. For example, static workflows would become obsolete for all but the most routine processes. Some solutions provide existence proofs of such a coordination substrate, such as the Redux solution in concurrent engineering, which we describe herein. However, foundational research remains to be done in the new field of Coordination Engineering in order to reach the goal of a future Internet in which coordination functions are fundamental.

  13. Learning optimal embedded cascades.

    Science.gov (United States)

    Saberian, Mohammad Javad; Vasconcelos, Nuno

    2012-10-01

    The problem of automatic and optimal design of embedded object detector cascades is considered. Two main challenges are identified: optimization of the cascade configuration and optimization of individual cascade stages, so as to achieve the best tradeoff between classification accuracy and speed, under a detection rate constraint. Two novel boosting algorithms are proposed to address these problems. The first, RCBoost, formulates boosting as a constrained optimization problem which is solved with a barrier penalty method. The constraint is the target detection rate, which is met at all iterations of the boosting process. This enables the design of embedded cascades of known configuration without extensive cross validation or heuristics. The second, ECBoost, searches over cascade configurations to achieve the optimal tradeoff between classification risk and speed. The two algorithms are combined into an overall boosting procedure, RCECBoost, which optimizes both the cascade configuration and its stages under a detection rate constraint, in a fully automated manner. Extensive experiments in face, car, pedestrian, and panda detection show that the resulting detectors achieve an accuracy versus speed tradeoff superior to those of previous methods.

  14. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  15. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
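    The STD-to-code pipeline described here can be sketched with a transition table driving an event-dispatched state program. The states, events, and actions below are invented; the actual Los Alamos toolchain emits C from the SNL intermediate language rather than anything like this.

    ```python
    # Sketch: a state transition table (the sort of thing a CASE tool would
    # export in an intermediate state notation) executed as an event-driven
    # state program.

    TRANSITIONS = {
        # (state, event): (next_state, action)
        ("idle",    "start"): ("running", "power_on"),
        ("running", "fault"): ("halted",  "alarm"),
        ("running", "stop"):  ("idle",    "power_off"),
        ("halted",  "reset"): ("idle",    "clear_alarm"),
    }

    class StateProgram:
        """Execution is driven by external events, so several state
        programs could share one host processor, as in the abstract."""

        def __init__(self, initial="idle"):
            self.state = initial
            self.log = []

        def dispatch(self, event):
            key = (self.state, event)
            if key in TRANSITIONS:              # unknown events are ignored
                self.state, action = TRANSITIONS[key]
                self.log.append(action)

    sp = StateProgram()
    for ev in ["start", "fault", "reset", "start", "stop"]:
        sp.dispatch(ev)
    print(sp.state, sp.log)   # ends back in "idle" after the event sequence
    ```

    Because the table is data rather than code, it can be regenerated from the diagram on every design change, which is the point of keeping design and code tightly integrated.
    
    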

  16. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  17. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; hide

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has process-tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver- related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  18. Adaptive protection coordination scheme for distribution network with distributed generation using ABC

    Directory of Open Access Journals (Sweden)

    A.M. Ibrahim

    2016-09-01

    Full Text Available This paper presents an adaptive protection coordination scheme for optimal coordination of DOCRs in interconnected power networks with the impact of DG; the coordination technique used is the Artificial Bee Colony (ABC) algorithm. The scheme adapts to system changes; new relay settings are obtained as the generation level or system topology changes. The developed adaptive scheme is applied to the IEEE 30-bus test system for both single- and multi-DG existence, and results are shown and discussed.
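    A minimal Artificial Bee Colony loop, of the kind this abstract applies to relay settings, looks roughly as follows. The objective here is a deliberate stand-in (a simple sum of squares over two bounded "settings"); the paper's real objective would score DOCR operating times and coordination constraints, and every constant below is my own choice.

    ```python
    import random

    # Sketch of the Artificial Bee Colony (ABC) metaheuristic: food sources
    # are candidate solutions, bees propose neighbour moves, and stale
    # sources are abandoned by scouts.

    random.seed(1)
    LO, HI, DIM, FOODS, LIMIT, CYCLES = 0.05, 1.0, 2, 10, 20, 500

    def objective(x):
        # Stand-in cost; a real scheme would evaluate relay operating times.
        return sum(v * v for v in x)

    def neighbour(x, flock):
        """Perturb one coordinate toward/away from a random partner."""
        k = random.randrange(DIM)
        partner = random.choice(flock)
        phi = random.uniform(-1, 1)
        y = list(x)
        y[k] = min(HI, max(LO, x[k] + phi * (x[k] - partner[k])))
        return y

    foods = [[random.uniform(LO, HI) for _ in range(DIM)] for _ in range(FOODS)]
    trials = [0] * FOODS
    for _ in range(CYCLES):
        for i in range(FOODS):                  # employed/onlooker phases, merged
            cand = neighbour(foods[i], foods)
            if objective(cand) < objective(foods[i]):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > LIMIT:               # scout phase: abandon stale source
                foods[i] = [random.uniform(LO, HI) for _ in range(DIM)]
                trials[i] = 0

    best = min(foods, key=objective)
    print("best settings:", best, "cost:", objective(best))
    ```

    The adaptivity in the abstract comes from re-running such an optimizer with updated fault data whenever generation or topology changes, not from the optimizer itself.
    
    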

  19. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    Science.gov (United States)

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.

  20. The Value of Information in Automated Negotiation: A Decision Model for Eliciting User Preferences

    NARCIS (Netherlands)

    T. Baarslag (Tim); M. Kaisers (Michael)

    2017-01-01

    Consider an agent that can autonomously negotiate and coordinate with others in our stead, to reach outcomes and agreements in our interest. Such automated negotiation agents are already common practice in areas such as high frequency trading, and are now finding applications in domains