WorldWideScience

Sample records for automated parameter optimization

  1. Automated Optimization of Walking Parameters for the Nao Humanoid Robot

    NARCIS (Netherlands)

    N. Girardi; C. Kooijman; A.J. Wiggers; A. Visser

    2013-01-01

    This paper describes a framework for optimizing walking parameters for a Nao humanoid robot. In this case an omnidirectional walk is learned. The parameters are learned in simulation with an evolutionary approach. The best performance was obtained for a combination of a low mutation rate and a high crossover rate.

  2. Automated Optimization of Walking Parameters for the Nao Humanoid Robot

    OpenAIRE

    Girardi, N.; Kooijman, C.; Wiggers, A.J.; de Visser, A.

    2013-01-01

    This paper describes a framework for optimizing walking parameters for a Nao humanoid robot. In this case an omnidirectional walk is learned. The parameters are learned in simulation with an evolutionary approach. The best performance was obtained for a combination of a low mutation rate and a high crossover rate.
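
    The mutation-rate / crossover-rate trade-off the abstract reports can be illustrated with a toy evolutionary loop. This is a minimal sketch, not the paper's framework: the fitness function and the four "gait parameters" below are invented stand-ins for the simulated Nao walk evaluation.

```python
import random

random.seed(0)

# Hypothetical ideal gait parameters; fitness peaks when a candidate is near
# them. The real framework instead scored candidate gaits in a simulator.
IDEAL = [0.3, 0.7, 0.5, 0.2]

def fitness(params):
    return -sum((p - t) ** 2 for p, t in zip(params, IDEAL))

def evolve(pop_size=30, generations=40, mutation_rate=0.05, crossover_rate=0.9):
    pop = [[random.random() for _ in IDEAL] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # selection: keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            if random.random() < crossover_rate:    # uniform crossover
                child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            else:
                child = a[:]
            child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate else g
                     for g in child]                # per-gene Gaussian mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

    Raising `crossover_rate` while keeping `mutation_rate` low, as in the abstract's best configuration, biases the search toward recombining good candidates rather than random perturbation.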

  3. apsis - Framework for Automated Optimization of Machine Learning Hyper Parameters

    OpenAIRE

    Diehl, Frederik; Jauch, Andreas

    2015-01-01

    The apsis toolkit presented in this paper provides a flexible framework for hyperparameter optimization and includes both random search and a Bayesian optimizer. It is implemented in Python and its architecture features adaptability to any desired machine learning code. It can easily be used with common Python ML frameworks such as scikit-learn. Published under the MIT License, other researchers are encouraged to check out the code, contribute, or raise suggestions. The code can be ...
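
    The random-search half of such a toolkit reduces to drawing hyperparameters from per-parameter distributions and keeping the best score. The sketch below is generic, not the apsis API (which the record does not quote); the `model_score` objective is a hypothetical stand-in for a cross-validated model score.

```python
import random

random.seed(1)

def model_score(learning_rate, n_estimators):
    # Hypothetical response surface standing in for cross-validated accuracy;
    # it peaks near learning_rate=0.1, n_estimators=200.
    return 1.0 - (learning_rate - 0.1) ** 2 - ((n_estimators - 200) / 1000.0) ** 2

search_space = {
    "learning_rate": lambda: 10 ** random.uniform(-3, 0),   # log-uniform draw
    "n_estimators": lambda: random.randint(10, 500),
}

def random_search(n_trials=50):
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: draw() for name, draw in search_space.items()}
        score = model_score(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

params, score = random_search()
```

    A Bayesian optimizer would replace the independent random draws with draws guided by a surrogate model of past trials.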

  4. Sequential Model-Based Parameter Optimization: an Experimental Investigation of Automated and Interactive Approaches

    Science.gov (United States)

    Hutter, Frank; Bartz-Beielstein, Thomas; Hoos, Holger H.; Leyton-Brown, Kevin; Murphy, Kevin P.

    This work experimentally investigates model-based approaches for optimizing the performance of parameterized randomized algorithms. Such approaches build a response surface model and use this model for finding good parameter settings of the given algorithm. We evaluated two methods from the literature that are based on Gaussian process models: sequential parameter optimization (SPO) (Bartz-Beielstein et al. 2005) and sequential Kriging optimization (SKO) (Huang et al. 2006). SPO performed better "out-of-the-box," whereas SKO was competitive when response values were log transformed. We then investigated key design decisions within the SPO paradigm, characterizing the performance consequences of each. Based on these findings, we propose a new version of SPO, dubbed SPO+, which extends SPO with a novel intensification procedure and a log-transformed objective function. In a domain for which performance results for other (model-free) parameter optimization approaches are available, we demonstrate that SPO+ achieves state-of-the-art performance. Finally, we compare this automated parameter tuning approach to an interactive, manual process that makes use of classical
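
    The sequential model-based loop described here can be sketched in a few lines: evaluate some parameter settings, fit a cheap response-surface model, evaluate at the model's predicted optimum, and refit. A quadratic surrogate stands in for the Gaussian process models used by SPO/SKO, and the noisy `runtime` function is a toy.

```python
import random

random.seed(2)

def runtime(x):
    # Toy stand-in for the target algorithm's noisy runtime at setting x;
    # its true minimum is at x = 0.6.
    return (x - 0.6) ** 2 + 0.05 * random.random()

def parabola_vertex(p1, p2, p3):
    # Exact quadratic through three (x, y) points; returns the vertex x.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3 ** 2 * (y1 - y2) + x2 ** 2 * (y3 - y1) + x1 ** 2 * (y2 - y3)) / denom
    return -b / (2 * a) if a > 0 else random.random()

pts = [(x, runtime(x)) for x in (0.0, 0.5, 1.0)]   # initial design
for _ in range(10):
    # Fit the surrogate to the three best distinct settings seen so far,
    # then evaluate at its predicted optimum (clamped to [0, 1]).
    distinct = {}
    for x, y in sorted(pts, key=lambda p: p[1]):
        distinct.setdefault(round(x, 6), (x, y))
    best3 = list(distinct.values())[:3]
    x_next = min(max(parabola_vertex(*best3), 0.0), 1.0)
    pts.append((x_next, runtime(x_next)))

best_x = min(pts, key=lambda p: p[1])[0]
```

    SPO+'s log-transformed objective would apply `log` to the response values before fitting, which damps the influence of rare very slow runs.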

  5. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way
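
    The 'if-then rules' driving an FIS can be illustrated with a minimal two-rule inference that maps a constraint-violation score to a weight adjustment. Everything here is invented for illustration (triangular memberships, Sugeno-style weighted-average defuzzification); it is not the paper's ANFIS, which learns such rules from planner data.

```python
def tri(x, a, b, c):
    # Triangular membership function with feet at a and c, peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_weight(violation):
    # Rule 1: IF violation is LOW  THEN adjustment is SMALL (output 0.1)
    # Rule 2: IF violation is HIGH THEN adjustment is LARGE (output 0.9)
    low = tri(violation, -0.5, 0.0, 1.0)
    high = tri(violation, 0.0, 1.0, 1.5)
    # Weighted-average (Sugeno-style) defuzzification.
    return (low * 0.1 + high * 0.9) / (low + high)

small = adjust_weight(0.1)   # mild violation -> small adjustment
large = adjust_weight(0.9)   # severe violation -> large adjustment
```

    ANFIS training amounts to tuning the membership-function parameters and rule outputs above so the input-output behavior matches the recorded planner decisions.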

  6. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior... of two IGBT modules rated at 1.7 kV / 1 kA and 1.7 kV / 1.4 kA...

  7. Using the ARTMO toolbox for automated retrieval of biophysical parameters through radiative transfer model inversion: Optimizing LUT-based inversion

    Science.gov (United States)

    Verrelst, J.; Rivera, J. P.; Leonenko, G.; Alonso, L.; Moreno, J.

    2012-04-01

    Radiative transfer (RT) modeling plays a key role in earth observation (EO) because it is needed to design EO instruments and to develop and test inversion algorithms. Inversion of an RT model is considered a successful approach for the retrieval of biophysical parameters because it is physically based and generally applicable. To the broader community, however, the approach is seen as laborious because of its many processing steps, and expert knowledge is required for precise model parameterization. We have recently developed a radiative transfer toolbox, ARTMO (Automated Radiative Transfer Models Operator), with the purpose of providing in a graphical user interface (GUI) the essential models and tools required for terrestrial EO applications such as model inversion. In short, the toolbox allows the user: i) to choose between various plant leaf and canopy RT models (e.g. models from the PROSPECT and SAIL family, FLIGHT), ii) to choose between spectral band settings of various air- and space-borne sensors or to define custom sensor settings, iii) to simulate a massive amount of spectra based on a look-up table (LUT) approach and store them in a relational database, iv) to plot spectra of multiple models and compare them with measured spectra, and finally, v) to run model inversion against optical imagery given several cost options and accuracy estimates. In this work ARTMO was used to tackle some well-known problems related to model inversion. According to the Hadamard conditions, mathematical models of physical phenomena are invertible if the solution of the inverse problem exists, is unique, and depends continuously on the data. This assumption is not always met because of the large number of unknowns, and different strategies have been proposed to overcome this problem. Several of these strategies have been implemented in ARTMO and were analyzed here to optimize inversion performance. Data came from the SPARC-2003 dataset
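
    A LUT-based inversion of the kind described reduces to: simulate spectra over a parameter grid, then return the grid entry that minimizes a cost against the measurement. The sketch below uses a made-up two-band forward model in place of a PROSPECT/SAIL-type simulation, with an RMSE cost.

```python
import itertools
import math

def toy_rt_model(lai, chlorophyll):
    # Hypothetical two-band forward model standing in for PROSPECT/SAIL.
    return (0.5 * math.exp(-0.3 * lai), 0.02 + 0.001 * chlorophyll + 0.05 * lai)

# Build the look-up table over a parameter grid and store simulated spectra.
lut = {(lai, chl): toy_rt_model(lai, chl)
       for lai, chl in itertools.product([0.5, 1, 2, 4], [20, 40, 60])}

def invert(measured):
    # RMSE cost between the measured and each simulated spectrum;
    # the best-matching LUT entry gives the parameter estimate.
    def rmse(sim):
        return math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, sim)) / len(measured))
    return min(lut, key=lambda params: rmse(lut[params]))

estimate = invert(toy_rt_model(2, 40))   # a known case should be recovered
```

    The ill-posedness the abstract mentions shows up when several LUT entries produce nearly identical spectra; the regularization strategies implemented in ARTMO (e.g. alternative cost functions, multiple best solutions) address exactly that.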

  8. Optimization of hydrocyclone work parameters

    OpenAIRE

    Golomeova, Mirjana; Krstev, Boris; Golomeov, Blagoj

    2003-01-01

    The paper presents the procedure for optimizing laboratory hydrocyclone operation by applying dispersion analysis and experiment planning with a Graeco-Latin square. The application of this method makes possible a significant reduction in the number of tests and close optimization of the whole process. Tests were carried out with a D-100 mm hydrocyclone. The optimization parameters are as follows: solids content in the pulp, underflow diameter, overflow diameter and inlet pressure. The influence of optimi...
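
    A Graeco-Latin square pairs two orthogonal Latin squares so that every (level, level) combination occurs exactly once, which is what cuts a full factorial down to n^2 runs. A 5x5 construction sketch follows; the mapping of the four hydrocyclone factors onto rows, columns and the two "alphabets" is illustrative.

```python
n = 5   # five levels per factor

# Two orthogonal Latin squares of order 5: L1[i][j] = i + j, L2[i][j] = i + 2j (mod 5).
latin = [[(i + j) % n for j in range(n)] for i in range(n)]
greek = [[(i + 2 * j) % n for j in range(n)] for i in range(n)]

# Superimposing them gives the Graeco-Latin square: every (latin, greek)
# level pair occurs exactly once across the n*n runs.
pairs = [(latin[i][j], greek[i][j]) for i in range(n) for j in range(n)]
```

    With rows, columns and the two squares carrying one factor each, 25 runs cover five levels of four factors instead of the 625 runs of a full factorial.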

  9. Multivariate optimization of ILC parameters

    CERN Document Server

    Bazarov, Ivan V

    2005-01-01

    We present results of a multiobjective optimization of the International Linear Collider (ILC) which seeks to maximize luminosity at each given total cost of the linac (capital and operating costs of cryomodules, refrigeration and RF). Evolutionary algorithms allow quick exploration of optimal sets of parameters in a complicated system such as the ILC in the presence of realistic constraints, as well as investigation of various what-if scenarios in potential performance. The parameters we varied included the accelerating gradient and Q of the cavities (in a coupled manner following a realistic Q vs. E curve), the number of particles per bunch, the bunch length, the number of bunches in the train, etc. We find an optimum which decreases (relative to the TDR baseline) the total linac cost by 22 % and the capital cost by 25 % at the same luminosity of 3·10^38
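
    The core multiobjective step — keeping only non-dominated cost/luminosity trade-offs — can be sketched as a Pareto filter. The candidate numbers below are invented, not actual ILC parameters.

```python
# Each candidate is one machine configuration evaluated by the simulator.
candidates = [
    {"cost": 10, "luminosity": 2.0},
    {"cost": 8,  "luminosity": 2.1},   # cheaper AND brighter than the first
    {"cost": 12, "luminosity": 3.0},
    {"cost": 12, "luminosity": 2.5},   # same cost, dimmer than the previous
]

def dominates(a, b):
    # a dominates b if it is no worse on both objectives and better on one.
    return (a["cost"] <= b["cost"] and a["luminosity"] >= b["luminosity"]
            and (a["cost"] < b["cost"] or a["luminosity"] > b["luminosity"]))

pareto = [c for c in candidates
          if not any(dominates(other, c) for other in candidates if other is not c)]
```

    An evolutionary multiobjective optimizer repeatedly generates candidates and keeps (a bounded version of) this non-dominated set, yielding the luminosity-per-cost frontier the abstract describes.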

  10. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  11. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  12. Automated global optimization of commercial SAGD operations

    International Nuclear Information System (INIS)

    The economic optimization of steam assisted gravity drainage (SAGD) operations has largely been conducted through the use of simulations to identify optimal steam use approaches. In this study, the cumulative steam to oil ratio (CSOR) was optimized by altering the steam injection pressure throughout the evolution of the process in a detailed, 3-D reservoir model. A generic Athabasca simulation model was used along with a thermal reservoir simulator which used a corner-point grid. A line heater was specified in the grid cells containing the well bores to mimic steam circulation. During heating, the injection and production locations were allowed to produce reservoir fluids from the reservoir to relieve the pressure associated with the thermal expansion of oil sand. After steam circulation, the well bores were switched to SAGD operation. At the producer well, the operating constraint imposed a maximum difference of 5 degrees C between the saturation temperature corresponding to the fluid pressure and the temperature in the wellbore. At the injection well, the steam injection pressure was specified by the optimizer. A response surface was constructed by fitting the parameter sets and corresponding cost functions to a biquadratic function. After the minimum of the cost function was determined, a new set of parameters was selected to complete the iterations. Results indicated that optimization of SAGD is feasible with complex and detailed reservoir models by using parallel calculations. The general trend determined by the optimization algorithm developed in this research indicated that before the steam chamber contacts the overburden, the operating pressure should be relatively high. After contact is made, the injection pressure should be lowered to reduce heat losses. 17 refs., 1 tab., 5 figs
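
    The optimizer's pressure-schedule finding can be caricatured as a two-variable search over early and late injection pressures against a toy CSOR model. All numbers below are hypothetical; the real study evaluated each candidate with a full thermal reservoir simulation.

```python
import itertools

def csor(p_early, p_late):
    # Hypothetical cumulative steam-to-oil ratio: penalizes low pressure
    # before the chamber reaches the overburden (slow chamber growth) and
    # high pressure afterwards (heat losses to the overburden).
    return 2.5 + (3.0 - p_early) ** 2 + (p_late - 1.5) ** 2

grid = [1.0, 1.5, 2.0, 2.5, 3.0]   # candidate injection pressures, MPa
best = min(itertools.product(grid, grid), key=lambda s: csor(*s))
```

    The grid search here plays the role of the paper's response-surface iteration; both recover the "high early, low late" pressure schedule.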

  13. Applications of Intelligent Evolutionary Algorithms in Optimal Automation System Design

    OpenAIRE

    Tung-Kuan Liu; Jyh-Horng Chou

    2011-01-01

    This paper proposes an intelligent evolutionary algorithm that can be applied in the design of optimal automation systems, and employs a multimodal six-bar mechanism optimization design, job shop production scheduling for the fishing equipment industry, and dynamic real-time production scheduling system design cases to show how the technique developed in this paper is highly effective at resolving optimal automation system design problems. Major breakthroughs in artificial intelligence contin...

  14. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  15. Spray automated balancing of rotors - How process parameters influence performance

    Science.gov (United States)

    Smalley, A. J.; Baldwin, R. M.; Fleming, D. P.; Yuhas, J. S.

    1989-01-01

    This paper addresses the application of spray-automated balancing of rotors, and the influence that various operating parameters will have on balancing performance. Spray-automated balancing uses the fuel-air repetitive explosion process to imbed short, discrete bursts of high velocity, high temperature powder into a rotating part at an angle selected to reduce unbalance of the part. The shortness of the burst, the delay in firing of the gun, the speed of the disk and the variability in speed all influence the accuracy and effectiveness of the automated balancing process. The paper evaluates this influence by developing an analytical framework and supplementing the analysis with empirical data obtained while firing the gun at a rotating disk. Encouraging results are obtained, and it is shown that the process should perform satisfactorily over a wide range of operating parameters. Further experimental results demonstrate the ability of the method to reduce vibration levels induced by mass unbalance in a rotating disk.

  16. DRAM BASED PARAMETER DATABASE OPTIMIZATION

    OpenAIRE

    Marcinkevicius, Tadas

    2012-01-01

    This thesis suggests an improved parameter database implementation for one of Ericsson products. The parameter database is used during the initialization of the system as well as during the later operation. The database size is constantly growing because the parameter database is intended to be used with different hardware configurations. When a new technology platform is released, multiple revisions with additional features and functionalities are later created, resulting in introduction of ...

  17. Evaluation of GCC optimization parameters

    Directory of Open Access Journals (Sweden)

    Rodrigo D. Escobar

    2012-12-01

    Compile-time optimization of code can result in significant performance gains. The amount of these gains varies widely depending upon the code being optimized, the hardware being compiled for, the specific performance increase attempted (e.g. speed, throughput, memory utilization, etc.) and the compiler used. We used the latest version of the SPEC CPU 2006 benchmark suite to help gain an understanding of possible performance improvements using GCC (GNU Compiler Collection) options, focusing mainly on speed gains made possible by tuning the compiler with the standard compiler optimization levels as well as a specific compiler option for the hardware processor. We compared the best standardized tuning options obtained for a Core i7 processor to the same relative options used on a Pentium 4 to determine whether the GNU project has improved its performance tuning capabilities for specific hardware over time.

  18. Parameters Optimization of Synergetic Recognition Approach

    Institute of Scientific and Technical Information of China (English)

    GAOJun; DONGHuoming; SHAOJing; ZHAOJing

    2005-01-01

    Synergetic pattern recognition is a novel and effective pattern recognition method with advantages in image recognition. Research has shown that the attention parameter λ and the parameters B and C directly influence the recognition results, but there is no general theory for controlling these parameters in the recognition process. In this paper we analyze these parameters abstractly and propose a novel parameter optimization method based on the simulated annealing (SA) algorithm. The SA algorithm has good optimization performance and is used to search for the globally optimal values of these parameters. Theoretical analysis and experimental results both show that the proposed parameter optimization method is effective: it can fully improve the performance of the synergetic recognition approach, and the algorithm is simple and fast to implement.
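
    A minimal simulated-annealing loop over the three parameters looks as follows. The error function and its "ideal" parameter values are invented stand-ins; in the paper the objective would be recognition performance.

```python
import math
import random

random.seed(3)

def error(params):
    # Toy objective: squared distance to hypothetical ideal parameter values.
    ideal = {"lam": 1.0, "B": 0.5, "C": 2.0}
    return sum((params[k] - ideal[k]) ** 2 for k in params)

def anneal(start, temp=1.0, cooling=0.95, steps=400):
    current, best = dict(start), dict(start)
    for _ in range(steps):
        candidate = {k: v + random.gauss(0, 0.1) for k, v in current.items()}
        delta = error(candidate) - error(current)
        # Always accept improvements; accept some uphill moves while hot,
        # which is what lets SA escape local optima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if error(current) < error(best):
                best = dict(current)
        temp *= cooling
    return best

best = anneal({"lam": 0.0, "B": 0.0, "C": 0.0})
```

    The cooling schedule (`temp *= cooling`) gradually turns the exploratory walk into a greedy local search around the best region found.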

  19. Video Superresolution via Parameter-Optimized Particle Swarm Optimization

    OpenAIRE

    2014-01-01

    Video superresolution (VSR) aims to reconstruct a high-resolution video sequence from a low-resolution sequence. We propose a novel particle swarm optimization algorithm named parameter-optimized multiple swarms PSO (POMS-PSO). We assessed the optimization performance of POMS-PSO on four standard benchmark functions. To reconstruct high-resolution video, we build an imaging degradation model. Viewed in terms of optimization, VSR is converted to an optimization computation problem. And we take POMS...
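
    For reference, a plain global-best PSO (standard PSO, not the POMS-PSO variant this paper proposes) on a toy objective:

```python
import random

random.seed(4)

def sphere(x):
    # Toy objective; a VSR application would instead score a reconstruction.
    return sum(v * v for v in x)

def pso(dim=3, n_particles=12, iters=60, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    gbest = min(pbest, key=sphere)               # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso()
```

    Multi-swarm variants such as POMS-PSO run several such swarms with tuned parameters (w, c1, c2) to reduce premature convergence on a single basin.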

  20. QUADRATIC OPTIMIZATION METHOD AND ITS APPLICATION ON OPTIMIZING MECHANISM PARAMETER

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yun; CHEN Jianneng; YU Yaxin; YU Gaohong; ZHU Jianping

    2006-01-01

    In order for the designed mechanism to meet the kinematic requirements with optimal dynamic behavior, a quadratic optimization method is proposed based on the different characteristics of kinematic and dynamic optimization. This method comprises two optimization steps, kinematic optimization followed by dynamic optimization, and uses the results of the kinematic optimization as the constraint equations for the dynamic optimization. The method is applied, with notable results, to the parameter optimization of the transplanting mechanism with elliptic planetary gears of a high-speed rice seedling transplanter. The parameter spectrum that meets the kinematic requirements is obtained through visualized human-computer interaction in the kinematic optimization, and the optimal parameters are then obtained with an improved genetic algorithm in the dynamic optimization. In the dynamic optimization, the objective function is the optimal dynamic behavior and the constraint equations come from the results of the kinematic optimization. This method is suitable for multi-objective optimization in which both kinematic and dynamic performances act as objective functions.

  1. Optimal Parameters Multicomponent Mixtures Extruding

    Directory of Open Access Journals (Sweden)

    Ramil F. Sagitov

    2013-01-01

    Experimental research on extruding multicomponent mixtures from production wastes is carried out, and a unit for the production of composites from different types of waste is presented. Having analyzed the dependence of the energy requirements of extruding multicomponent mixtures on die length and component content at three values of the angular rate of screw rotation, we obtained the energy requirements at the optimal die length, angular speed and percentage of binding additives.

  2. Optimization Tools For Automated Vehicle Systems

    OpenAIRE

    Shiller, Zvi

    1995-01-01

    This work focuses on computing time-optimal maneuvers which might be used to develop strategies for emergency maneuvers and to establish the vehicle's performance envelope. The problem of emergency maneuvers is addressed in the context of time-optimal control. Time-optimal trajectories are computed along specified paths for a nonlinear vehicle model which considers both lateral and longitudinal motions.

  3. Optimization of submerged vane parameters

    Indian Academy of Sciences (India)

    H SHARMA; B JAIN; Z AHMAD

    2016-03-01

    Submerged vanes are airfoils which are generally placed at a certain angle with respect to the flow direction in a channel to induce artificial circulations downstream. By virtue of these artificially generated circulations, submerged vanes have been used to protect river banks against erosion, to control the shifting of rivers, to avoid blocking of lateral intakes with sediment deposition, etc. Odgaard and his associates experimentally obtained the optimum vane sizes and recommended them for vane design. This paper is an attempt to review and validate the findings of Odgaard and his associates by utilizing computational fluid dynamics and experiments as tools, in which the vane-generated vorticity downstream was maximized in order to obtain optimum vane parameters for single and multiple vane arrays.

  4. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

    This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterpriser networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state-of-the-art of managing firewalls systematically in both research and application domains. Chapters explore set-theory, managing firewall configuration globally and consistently, access control list with encryption, and authentication such as IPSec policies. The author

  5. Automated beam steering using optimal control

    International Nuclear Information System (INIS)

    We present a steering algorithm which, with the aid of a model, allows the user to specify beam behavior throughout a beamline, rather than just at specified beam position monitor (BPM) locations. The model is used primarily to compute the values of the beam phase vectors from BPM measurements, and to define cost functions that describe the steering objectives. The steering problem is formulated as a constrained optimization problem; however, by applying optimal control theory we can reduce it to an unconstrained optimization whose dimension is the number of control signals.
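
    The reduction described — steering as an unconstrained quadratic minimization over the control signals — can be sketched for two correctors and two BPMs. The response matrix and readings below are made-up numbers, and the linear BPM-response model is an illustrative simplification of the paper's phase-vector formulation.

```python
# Response of each BPM (mm) to each corrector kick (mrad); made-up numbers.
R = [[2.0, 0.5],
     [1.0, 1.5]]
baseline = [3.0, -1.0]   # BPM readings with correctors off (mm)

# Predicted reading = baseline + R @ kicks. The unconstrained optimum of the
# quadratic cost |baseline + R @ kicks|^2 solves R @ kicks = -baseline;
# Cramer's rule handles the 2x2 case.
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
rhs = [-baseline[0], -baseline[1]]
kicks = [(rhs[0] * R[1][1] - rhs[1] * R[0][1]) / det,
         (R[0][0] * rhs[1] - R[1][0] * rhs[0]) / det]

residual = [baseline[i] + R[i][0] * kicks[0] + R[i][1] * kicks[1] for i in range(2)]
```

    With more BPMs than correctors the same cost is minimized in the least-squares sense; the dimension of the search stays equal to the number of control signals, as the abstract notes.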

  6. Optimization of parameters of heat exchangers vehicles

    OpenAIRE

    Andrei MELEKHIN; Aleksandr MELEKHIN

    2014-01-01

    The relevance of the topic is due to problems of resource economy in the heating systems of vehicles. To solve this problem we have developed an integrated research method which allows optimization of the parameters of vehicle heat exchangers. This method solves a multicriteria optimization problem with a nonlinear optimization program on the basis of software, with the introduction of an array of temperatures obtained using thermography. The authors have d...

  7. Cosmological parameter estimation using Particle Swarm Optimization

    International Nuclear Information System (INIS)

    Constraining the parameters of a theoretical model from observational data is an important exercise in cosmology. There are many theoretically motivated models which demand a greater number of cosmological parameters than the standard model of cosmology uses, and these make the problem of parameter estimation challenging. It is common practice to employ the Bayesian formalism for parameter estimation, for which, in general, the likelihood surface is probed. For the standard cosmological model with six parameters, the likelihood surface is quite smooth and does not have local maxima, and sampling-based methods like the Markov Chain Monte Carlo (MCMC) method are quite successful. However, when there are a large number of parameters or the likelihood surface is not smooth, other methods may be more effective. In this paper, we demonstrate the application of another method inspired by artificial intelligence, called Particle Swarm Optimization (PSO), for estimating cosmological parameters from Cosmic Microwave Background (CMB) data taken from the WMAP satellite

  8. Mixed integer evolution strategies for parameter optimization.

    Science.gov (United States)

    Li, Rui; Emmerich, Michael T M; Eggermont, Jeroen; Bäck, Thomas; Schütz, M; Dijkstra, J; Reiber, J H C

    2013-01-01

    Evolution strategies (ESs) are powerful probabilistic search and optimization algorithms gleaned from biological evolution theory. They have been successfully applied to a wide range of real world applications. The modern ESs are mainly designed for solving continuous parameter optimization problems. Their ability to adapt the parameters of the multivariate normal distribution used for mutation during the optimization run makes them well suited for this domain. In this article we describe and study mixed integer evolution strategies (MIES), which are natural extensions of ES for mixed integer optimization problems. MIES can deal with parameter vectors consisting not only of continuous variables but also with nominal discrete and integer variables. Following the design principles of the canonical evolution strategies, they use specialized mutation operators tailored for the aforementioned mixed parameter classes. For each type of variable, the choice of mutation operators is governed by a natural metric for this variable type, maximal entropy, and symmetry considerations. All distributions used for mutation can be controlled in their shape by means of scaling parameters, allowing self-adaptation to be implemented. After introducing and motivating the conceptual design of the MIES, we study the optimality of the self-adaptation of step sizes and mutation rates on a generalized (weighted) sphere model. Moreover, we prove global convergence of the MIES on a very general class of problems. The remainder of the article is devoted to performance studies on artificial landscapes (barrier functions and mixed integer NK landscapes), and a case study in the optimization of medical image analysis systems. In addition, we show that with proper constraint handling techniques, MIES can also be applied to classical mixed integer nonlinear programming problems. PMID:22122384
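
    The per-type mutation operators the abstract describes can be sketched for one individual holding a continuous, an integer, and a nominal gene. Domains, rates and step sizes below are invented for illustration; a full MIES would also self-adapt these via strategy parameters.

```python
import random

random.seed(5)

def geometric_step(p=0.5):
    # Geometrically distributed step count: small steps most likely,
    # larger ones increasingly rare.
    k = 0
    while random.random() < p:
        k += 1
    return k

def mutate(ind, sigma=0.3, nominal_rate=0.3,
           nominal_domain=("linear", "rbf", "poly")):
    real, integer, nominal = ind
    real += random.gauss(0, sigma)                  # continuous: Gaussian
    integer += geometric_step() - geometric_step()  # integer: symmetric +/- step
    if random.random() < nominal_rate:              # nominal: uniform resample
        nominal = random.choice([v for v in nominal_domain if v != nominal])
    return (real, integer, nominal)

child = mutate((1.0, 10, "rbf"))
```

    Each operator respects the natural metric of its variable type: Euclidean for reals, integer distance for integers, and plain equality for nominal values, matching the design principles stated in the abstract.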

  9. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  10. Automated inference procedure for the determination of cell growth parameters

    Science.gov (United States)

    Harris, Edouard A.; Koh, Eun Jee; Moffat, Jason; McMillen, David R.

    2016-01-01

    The growth rate and carrying capacity of a cell population are key to the characterization of the population's viability and to the quantification of its responses to perturbations such as drug treatments. Accurate estimation of these parameters necessitates careful analysis. Here, we present a rigorous mathematical approach for the robust analysis of cell count data, in which all the experimental stages of the cell counting process are investigated in detail with the machinery of Bayesian probability theory. We advance a flexible theoretical framework that permits accurate estimates of the growth parameters of cell populations and of the logical correlations between them. Moreover, our approach naturally produces an objective metric of avoidable experimental error, which may be tracked over time in a laboratory to detect instrumentation failures or lapses in protocol. We apply our method to the analysis of cell count data in the context of a logistic growth model by means of a user-friendly computer program that automates this analysis, and present some samples of its output. Finally, we note that a traditional least squares fit can provide misleading estimates of parameter values, because it ignores available information with regard to the way in which the data have actually been collected.
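
    The logistic model in question is N(t) = K / (1 + (K/N0 - 1) e^(-rt)). As a minimal sketch of parameter recovery, a coarse grid search fits (r, K) to noise-free synthetic counts; the paper's Bayesian machinery additionally propagates the counting-process uncertainty that this sketch (and a plain least-squares fit) ignores.

```python
import math

def logistic(t, n0, r, k):
    # Logistic growth: N(t) = K / (1 + (K/N0 - 1) * exp(-r t)).
    return k / (1 + (k / n0 - 1) * math.exp(-r * t))

N0, R_TRUE, K_TRUE = 50, 0.8, 1000
data = [(t, logistic(t, N0, R_TRUE, K_TRUE)) for t in range(12)]  # synthetic counts

def sse(r, k):
    # Sum of squared errors between observed counts and the model.
    return sum((n - logistic(t, N0, r, k)) ** 2 for t, n in data)

# Coarse grid search over growth rate r and carrying capacity K.
best = min(((r / 10, k) for r in range(1, 21) for k in range(500, 1501, 50)),
           key=lambda rk: sse(*rk))
```

    On real, noisy counts the flat SSE landscape around (r, K) is exactly where the abstract warns that least squares can mislead, motivating the Bayesian treatment.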

  11. Optimization of parameters of heat exchangers vehicles

    Directory of Open Access Journals (Sweden)

    Andrei MELEKHIN

    2014-09-01

    Full Text Available The topic is relevant to solving resource-economy problems in the heating systems of vehicles. To solve this problem we have developed an integrated research method that allows the parameters of vehicle heat exchangers to be optimized. The method solves a multicriteria optimization problem with nonlinear-optimization software, using an array of temperatures obtained by thermography as input. The authors have developed a mathematical model of the heat-exchange process on the heat-exchange surfaces of the apparatus, solved the multicriteria optimization problem, and checked the model's adequacy against an experimental stand with visualization of the thermal fields. The work establishes an optimal range of controlled parameters influencing heat exchange with minimal metal consumption and maximal heat output of the finned heat exchanger, derives the regularities of the heat-exchange process together with generalizing dependencies for the temperature distribution over the heat-release surface of vehicle heat exchangers, and demonstrates convergence between results calculated from the theoretical dependencies and the solutions of the mathematical model.

  12. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  13. Analysis, Model Parameter Extraction and Optimization of Planar Inductors Using MATLAB

    OpenAIRE

    Gadjeva, Elissaveta; Durev, Vladislav; Hristov, Marin

    2010-01-01

    The extended possibilities of the general-purpose software MATLAB for modeling, simulation and optimization can be successfully used in RF microelectronic circuit design. Based on description of the device models, various optimization problems can be solved. Automated model parameter extraction procedure for on-chip wide-band spiral inductor model has been developed and realized in the MATLAB environment. The obtained results for the simulated two-port Y- and S-parameters of the spiral induct...

  14. A fully-automated software pipeline for integrating breast density and parenchymal texture analysis for digital mammograms: parameter optimization in a case-control breast cancer risk assessment study

    Science.gov (United States)

    Zheng, Yuanjie; Wang, Yan; Keller, Brad M.; Conant, Emily; Gee, James C.; Kontos, Despina

    2013-02-01

    Estimating a woman's risk of breast cancer is becoming increasingly important in clinical practice. Mammographic density, estimated as the percent of dense (PD) tissue area within the breast, has been shown to be a strong risk factor. Studies also support a relationship between mammographic texture and breast cancer risk. We have developed a fully-automated software pipeline for computerized analysis of digital mammography parenchymal patterns by quantitatively measuring both breast density and texture properties. Our pipeline combines advanced computer algorithms of pattern recognition, computer vision, and machine learning and offers a standardized tool for breast cancer risk assessment studies. Unlike many existing methods that perform parenchymal texture analysis within specific breast subregions, our pipeline extracts texture descriptors for points on a regular spatial lattice and from a window surrounding each lattice point, to characterize the local mammographic appearance throughout the whole breast. To demonstrate the utility of our pipeline, and optimize its parameters, we perform a case-control study by retrospectively analyzing a total of 472 digital mammography studies. Specifically, we investigate the window size, which is a lattice-related parameter, and compare the performance of texture features to that of breast PD in classifying case-control status. Our results suggest that different window sizes may be optimal for raw (12.7 mm²) versus vendor post-processed images (6.3 mm²). We also show that the combination of PD and texture features outperforms PD alone. The improvement is significant (p=0.03) when raw images and a window size of 12.7 mm² are used, having an ROC AUC of 0.66. The combination of PD and our texture features computed from post-processed images with a window size of 6.3 mm² achieves an ROC AUC of 0.75.

  15. An automated workflow for enhancing microbial bioprocess optimization on a novel microbioreactor platform

    Directory of Open Access Journals (Sweden)

    Rohe Peter

    2012-10-01

    Full Text Available Abstract Background High-throughput methods are widely used for strain screening, effectively resulting in binary information regarding high or low productivity. Nevertheless, achieving quantitative and scalable parameters for fast bioprocess development is much more challenging, especially for heterologous protein production. Here, the nature of the foreign protein makes it impossible to predict, e.g., the best expression construct, secretion signal peptide, inductor concentration, induction time, temperature and substrate feed rate in fed-batch operation, to name only a few. Therefore, a high number of systematic experiments is necessary to elucidate the best conditions for heterologous expression of each new protein of interest. Results To increase the throughput in bioprocess development, we used a microtiter-plate-based cultivation system (Biolector), fully integrated into a liquid-handling platform enclosed in a laminar airflow housing. This automated cultivation platform was used for optimization of the secretory production of a cutinase from Fusarium solani pisi with Corynebacterium glutamicum. The online monitoring of biomass, dissolved oxygen and pH in each of the microtiter plate wells makes it possible to trigger sampling or dosing events with the pipetting robot, allowing reliable selection of the best-performing cutinase producers. In addition, further automated methods such as media optimization and induction profiling were developed and validated. All biological and bioprocess parameters were optimized exclusively at microtiter plate scale and scaled perfectly to 1 L and 20 L stirred tank bioreactor scale. Conclusions The optimization of heterologous protein expression in microbial systems currently requires extensive testing of biological and bioprocess engineering parameters. 
This can be efficiently boosted by using a microtiter plate cultivation setup embedded into a liquid-handling system, providing more throughput

  16. Automated process parameters tuning for an injection moulding machine with soft computing

    Institute of Scientific and Technical Information of China (English)

    Peng ZHAO; Jian-zhong FU; Hua-min ZHOU; Shu-biao CUI

    2011-01-01

    In injection moulding production, the tuning of process parameters is a challenging job that relies heavily on the experience of skilled operators. In this paper, taking into consideration operator assessment during moulding trials, a novel intelligent model for automated tuning of process parameters is proposed. It combines case-based reasoning (CBR), empirical model (EM), and fuzzy logic (FL) methods. CBR and EM are used to imitate the recall and intuitive thinking of skilled operators, respectively, while FL is adopted to simulate their optimization reasoning. First, CBR is used to set up the initial process parameters. If CBR fails, EM is employed to calculate the initial parameters. Next, a moulding trial is performed using the initial parameters. Then FL is adopted to optimize these parameters and correct defects repeatedly until the moulded part is found to be satisfactory. Based on the above methodologies, intelligent software was developed and embedded in the controller of an injection moulding machine. Experimental results show that the intelligent software can be effectively used in practical production, and it greatly reduces the dependence on the experience of the operators.
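The CBR -> EM -> FL control flow described above can be sketched in a few lines. Everything below (the case base, the empirical model, the single "short shot" correction rule, and the parameter values) is a hypothetical toy stand-in for the paper's system, intended only to show the sequencing of the three methods.

```python
# Hypothetical sketch of the CBR -> EM -> FL tuning loop; all values are toys.
case_base = {"PP-thin-wall": {"melt_temp": 220, "inject_pressure": 80}}

def cbr_lookup(part):                      # recall: reuse a stored case
    return case_base.get(part)

def empirical_model(part):                 # intuition: compute a starting point
    return {"melt_temp": 200, "inject_pressure": 70}

def fuzzy_correct(params, defect):        # optimization: correct the observed defect
    if defect == "short_shot":             # under-filled part -> raise pressure
        params["inject_pressure"] += 5
    return params

def tune(part, trial):
    case = cbr_lookup(part)                # CBR first; fall back to EM on a miss
    params = dict(case) if case else empirical_model(part)
    for _ in range(10):                    # iterate moulding trials
        defect = trial(params)
        if defect is None:                 # part satisfactory -> done
            return params
        params = fuzzy_correct(params, defect)
    return params

# A trial that reports short shots until pressure reaches 90.
result = tune("PP-thin-wall",
              lambda p: None if p["inject_pressure"] >= 90 else "short_shot")
print(result)  # {'melt_temp': 220, 'inject_pressure': 90}
```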

  17. Video Superresolution via Parameter-Optimized Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Yunyi Yan

    2014-01-01

    Full Text Available Video superresolution (VSR) aims to reconstruct a high-resolution video sequence from a low-resolution sequence. We propose a novel particle swarm optimization algorithm named parameter-optimized multiple swarms PSO (POMS-PSO). We assessed the optimization performance of POMS-PSO on four standard benchmark functions. To reconstruct high-resolution video, we build an imaging degradation model, and VSR is then cast as an optimization problem. We take POMS-PSO as the optimization method for the VSR problem, avoiding the poor results, low accuracy, and large computational cost of other VSR algorithms. The proposed VSR method does not require exact movement estimation and does not need the computation of movement vectors. In terms of peak signal-to-noise ratio (PSNR), sharpness, and entropy, the proposed POMS-PSO-based VSR method showed better objective performance. Beyond these objective measures, experimental results also showed that the proposed method can reconstruct a high-resolution video sequence with better subjective quality.
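One of the objective measures cited, peak signal-to-noise ratio, is straightforward to compute from the mean squared error between a reference and a reconstructed frame. A minimal sketch with made-up pixel values (the sequences below are illustrative, not from the paper):

```python
import math

def psnr(ref, out, peak=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, out)) / len(ref)
    if mse == 0:
        return float("inf")            # identical signals
    return 10.0 * math.log10(peak ** 2 / mse)

ref = [52, 55, 61, 59, 79, 61, 76, 61]
out = [54, 55, 60, 59, 78, 62, 76, 60]
print(round(psnr(ref, out), 2))  # ~48.13 dB (MSE is exactly 1.0 here)
```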

  18. SAE2.py : a python script to automate parameter studies using SCREAMER with application to magnetic switching on Z.

    Energy Technology Data Exchange (ETDEWEB)

    Orndorff-Plunkett, Franklin

    2011-05-01

    The SCREAMER simulation code is widely used at Sandia National Laboratories for designing and simulating pulsed power accelerator experiments on super power accelerators. A preliminary parameter study of Z with a magnetic switching retrofit illustrates the utility of the automating script for optimizing pulsed power designs. SCREAMER is a circuit based code commonly used in pulsed-power design and requires numerous iterations to find optimal configurations. System optimization using simulations like SCREAMER is by nature inefficient and incomplete when done manually. This is especially the case when the system has many interactive elements whose emergent effects may be unforeseeable and complicated. For increased completeness, efficiency and robustness, investigators should probe a suitably confined parameter space using deterministic, genetic, cultural, ant-colony algorithms or other computational intelligence methods. I have developed SAE2 - a user-friendly, deterministic script that automates the search for optima of pulsed-power designs with SCREAMER. This manual demonstrates how to make input decks for SAE2 and optimize any pulsed-power design that can be modeled using SCREAMER. Application of SAE2 to magnetic switching on model of a potential Z refurbishment illustrates the power of SAE2. With respect to the manual optimization, the automated optimization resulted in 5% greater peak current (10% greater energy) and a 25% increase in safety factor for the most highly stressed element.
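The deterministic sweep that a driver like SAE2 performs can be sketched generically. The `run_simulation` figure of merit and the parameter names below are invented placeholders, not SCREAMER quantities; a real driver would write an input deck per candidate and parse the simulator's output instead of calling a toy function.

```python
from itertools import product

# Hypothetical stand-in for a SCREAMER run: score one candidate configuration.
# The quadratic "figure of merit" peaks at (12, 40) purely for illustration.
def run_simulation(inductance_nH, switch_time_ns):
    return -((inductance_nH - 12) ** 2 + (switch_time_ns - 40) ** 2)

grid = {
    "inductance_nH": [8, 10, 12, 14],
    "switch_time_ns": [30, 40, 50],
}

# Deterministic exhaustive sweep: evaluate every combination, keep the best.
names = list(grid)
best = max(product(*grid.values()), key=lambda vals: run_simulation(*vals))
print(dict(zip(names, best)))  # {'inductance_nH': 12, 'switch_time_ns': 40}
```

The same driver skeleton accommodates the genetic or ant-colony searches the abstract suggests by replacing the exhaustive `product` loop with a smarter candidate generator.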

  19. SAE2.py: a python script to automate parameter studies using SCREAMER with application to magnetic switching on Z

    International Nuclear Information System (INIS)

    The SCREAMER simulation code is widely used at Sandia National Laboratories for designing and simulating pulsed power accelerator experiments on super power accelerators. A preliminary parameter study of Z with a magnetic switching retrofit illustrates the utility of the automating script for optimizing pulsed power designs. SCREAMER is a circuit based code commonly used in pulsed-power design and requires numerous iterations to find optimal configurations. System optimization using simulations like SCREAMER is by nature inefficient and incomplete when done manually. This is especially the case when the system has many interactive elements whose emergent effects may be unforeseeable and complicated. For increased completeness, efficiency and robustness, investigators should probe a suitably confined parameter space using deterministic, genetic, cultural, ant-colony algorithms or other computational intelligence methods. I have developed SAE2 - a user-friendly, deterministic script that automates the search for optima of pulsed-power designs with SCREAMER. This manual demonstrates how to make input decks for SAE2 and optimize any pulsed-power design that can be modeled using SCREAMER. Application of SAE2 to magnetic switching on model of a potential Z refurbishment illustrates the power of SAE2. With respect to the manual optimization, the automated optimization resulted in 5% greater peak current (10% greater energy) and a 25% increase in safety factor for the most highly stressed element.

  20. An automated system for measuring parameters of nematode sinusoidal movement

    Directory of Open Access Journals (Sweden)

    Stirbl Robert C

    2005-02-01

    Full Text Available Abstract Background Nematode sinusoidal movement has been used as a phenotype in many studies of C. elegans development, behavior and physiology. A thorough understanding of the ways in which genes control these aspects of biology depends, in part, on the accuracy of phenotypic analysis. While worms that move poorly are relatively easy to describe, description of hyperactive movement and movement modulation presents more of a challenge. An enhanced capability to analyze all the complexities of nematode movement will thus help our understanding of how genes control behavior. Results We have developed a user-friendly system to analyze nematode movement in an automated and quantitative manner. In this system nematodes are automatically recognized and a computer-controlled microscope stage ensures that the nematode is kept within the camera field of view while video images from the camera are stored on videotape. In a second step, the images from the videotapes are processed to recognize the worm and to extract its changing position and posture over time. From this information, a variety of movement parameters are calculated. These parameters include the velocity of the worm's centroid, the velocity of the worm along its track, the extent and frequency of body bending, the amplitude and wavelength of the sinusoidal movement, and the propagation of the contraction wave along the body. The length of the worm is also determined and used to normalize the amplitude and wavelength measurements. To demonstrate the utility of this system, we report here a comparison of movement parameters for a small set of mutants affecting the Go/Gq mediated signaling network that controls acetylcholine release at the neuromuscular junction. 
The system allows comparison of distinct genotypes that affect movement similarly (activation of Gq-alpha versus loss of Go-alpha function), as well as of different mutant alleles at a single locus (null and dominant negative alleles).

  1. Autonomous space systems control incorporating automated maneuvers strategies in the presence of parameters uncertainties.

    Science.gov (United States)

    Mazinan, A H; Shakhesi, S

    2016-05-01

    The research deals with autonomous space systems incorporating new automated maneuver strategies in the presence of parameter uncertainties. The main aim of the investigation is to realize high-resolution, small-amplitude orbital maneuvers via the first control strategy, and subsequently large-amplitude orbital maneuvers via the second control strategy. A trajectory optimization provides the three-axis reference commands that allow the aforementioned overactuated autonomous space system to transfer from its initial orbit to the final one, in finite burn, while uncertainties in key system parameters such as the thrust vector, the center of gravity, and the moments of inertia are taken into consideration. The performance of the strategies is finally verified through a series of experiments and a number of benchmarks. PMID:26895709

  2. RTLS entry load relief parameter optimization

    Science.gov (United States)

    Crull, T. J.

    1975-01-01

    The results of a study of a candidate load relief control law for use during the pullup phase of Return-to-Launch-Site (RTLS) abort entries are presented. The control law parameters and cycle time which optimized performance of the normal load factor limiting phase (load relief phase) of an RTLS entry are examined. A set of control law gains, a smoothing parameter, and a normal force coefficient curve fit are established which resulted in good load relief performance considering the possible aerodynamic coefficient uncertainties defined. Also, the examination of various guidance cycle times revealed improved load relief performance with decreasing cycle time. A 0.5-second cycle provided smooth and adequate load relief in the presence of all the aerodynamic uncertainties examined.

  3. Optimal deadlock avoidance Petri net supervisors for automated manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Keyi XING; Feng TIAN; Xiaojun YANG

    2007-01-01

    Deadlock avoidance problems are investigated for automated manufacturing systems with flexible routings. Based on the Petri net models of the systems, this paper proposes, for the first time, the concept of perfect maximal resource-transition circuits and their saturated states. The concept facilitates the development of system liveness characterization and deadlock avoidance Petri net supervisors. Deadlock is characterized as some perfect maximal resource-transition circuits reaching their saturated states. For a large class of manufacturing systems, which do not contain center resources, the optimal deadlock avoidance Petri net supervisors are presented. For a general manufacturing system, a method is proposed for reducing the system Petri net model so that the reduced model does not contain center resources and, hence, has an optimal deadlock avoidance Petri net supervisor. The controlled reduced Petri net model can then be used as the liveness supervisor of the system.

  4. Automated assay optimization with integrated statistics and smart robotics.

    Science.gov (United States)

    Taylor, P B; Stewart, F P; Dunnington, D J; Quinn, S T; Schulz, C K; Vaidya, K S; Kurali, E; Lane, T R; Xiong, W C; Sherrill, T P; Snider, J S; Terpstra, N D; Hertzberg, R P

    2000-08-01

    The transition from manual to robotic high throughput screening (HTS) in the last few years has made it feasible to screen hundreds of thousands of chemical entities against a biological target in less than a month. This rate of HTS has increased the visibility of bottlenecks, one of which is assay optimization. In many organizations, experimental methods are generated by therapeutic teams associated with specific targets and passed on to the HTS group. The resulting assays frequently need to be further optimized to withstand the rigors and time frames inherent in robotic handling. Issues such as protein aggregation, ligand instability, and cellular viability are common variables in the optimization process. The availability of robotics capable of performing rapid random access tasks has made it possible to design optimization experiments that would be either very difficult or impossible for a person to carry out. Our approach to reducing the assay optimization bottleneck has been to unify the highly specific fields of statistics, biochemistry, and robotics. The product of these endeavors is a process we have named automated assay optimization (AAO). This has enabled us to determine final optimized assay conditions, which are often a composite of variables that we would not have arrived at by examining each variable independently. We have applied this approach to both radioligand binding and enzymatic assays and have realized benefits in both time and performance that we would not have predicted a priori. The fully developed AAO process encompasses the ability to download information to a robot and have liquid handling methods automatically created. This evolution in smart robotics has proven to be an invaluable tool for maintaining high-quality data in the context of increasing HTS demands. PMID:10992042

  5. GA based CNC turning center exploitation process parameters optimization

    OpenAIRE

    Z. Car; Barisic, B.; M. Ikonic

    2009-01-01

    This paper presents machining parameters (turning process) optimization based on the use of artificial intelligence. To obtain greater efficiency and productivity of the machine tool, optimal cutting parameters have to be obtained. In order to find optimal cutting parameters, the genetic algorithm (GA) has been used as an optimal solution finder. Optimization has to yield minimum machining time and minimum production cost, while considering technological and material constraints.

  6. GA based CNC turning center exploitation process parameters optimization

    Directory of Open Access Journals (Sweden)

    Z. Car

    2009-01-01

    Full Text Available This paper presents machining parameters (turning process) optimization based on the use of artificial intelligence. To obtain greater efficiency and productivity of the machine tool, optimal cutting parameters have to be obtained. In order to find optimal cutting parameters, the genetic algorithm (GA) has been used as an optimal solution finder. Optimization has to yield minimum machining time and minimum production cost, while considering technological and material constraints.
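A minimal GA of the kind described, minimizing a toy machining-cost model over cutting speed and feed, might look like the sketch below. The cost function, parameter bounds, and GA settings are all illustrative inventions, not the paper's model.

```python
import random

random.seed(1)

# Toy machining-cost model (illustrative): cost falls with cutting speed v and
# feed f, but rises again through tool-wear and surface-quality penalty terms.
def cost(v, f):
    return 1000.0 / (v * f) + 0.002 * v ** 2 + 50 * f ** 2

BOUNDS = {"v": (50.0, 300.0), "f": (0.05, 0.5)}

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def evolve(pop_size=30, generations=60):
    pop = [(random.uniform(*BOUNDS["v"]), random.uniform(*BOUNDS["f"]))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: cost(*p))          # rank by fitness
        elite = pop[: pop_size // 2]              # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            (v1, f1), (v2, f2) = random.sample(elite, 2)
            # Averaging crossover plus Gaussian mutation, clipped to bounds.
            v = clip((v1 + v2) / 2 + random.gauss(0, 5), *BOUNDS["v"])
            f = clip((f1 + f2) / 2 + random.gauss(0, 0.01), *BOUNDS["f"])
            children.append((v, f))
        pop = elite + children
    return min(pop, key=lambda p: cost(*p))

v_opt, f_opt = evolve()
print(round(cost(v_opt, f_opt), 2))
```

A production version would replace `cost` with the constrained machining-time and production-cost objectives the abstract mentions.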

  7. Parameter optimization model in electrical discharge machining process

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Electrical discharge machining (EDM) is at present still an experience-driven process, wherein selected parameters are often far from the optimum, and selecting optimization parameters is costly and time consuming. In this paper, artificial neural network (ANN) and genetic algorithm (GA) methods are used together to establish the parameter optimization model. An ANN model using the Levenberg-Marquardt algorithm has been set up to represent the relationship between material removal rate (MRR) and input parameters, and GA is used to optimize the parameters. The model is shown to be effective, and MRR is improved using the optimized machining parameters.

  8. Applications of the Automated SMAC Modal Parameter Extraction Package

    International Nuclear Information System (INIS)

    An algorithm known as SMAC (Synthesize Modes And Correlate), based on principles of modal filtering, has been in development for a few years. The new capabilities of the automated version are demonstrated on test data from a complex shell/payload system. Examples of extractions from impact and shaker data are shown. The automated algorithm extracts 30 to 50 modes in the bandwidth from each column of the frequency response function matrix. Examples of the synthesized Mode Indicator Functions (MIFs) compared with the actual MIFs show the accuracy of the technique. A data set for one input and 170 accelerometer outputs can typically be reduced in an hour. Application to a test with some complex modes is also demonstrated

  9. Optimizing wireless LAN for longwall coal mine automation

    Energy Technology Data Exchange (ETDEWEB)

    Hargrave, C.O.; Ralston, J.C.; Hainsworth, D.W. [Exploration & Mining Commonwealth Science & Industrial Research Organisation, Pullenvale, Qld. (Australia)

    2007-01-15

    A significant development in underground longwall coal mining automation has been achieved with the successful implementation of wireless LAN (WLAN) technology for communication on a longwall shearer. Wireless Fidelity (Wi-Fi) was selected to meet the bandwidth requirements of the underground data network, and several configurations were installed on operating longwalls to evaluate their performance. Although these efforts demonstrated the feasibility of using WLAN technology in longwall operation, it was clear that new research and development was required in order to establish optimal full-face coverage. By undertaking an accurate characterization of the target environment, it has been possible to achieve great improvements in WLAN performance over a nominal Wi-Fi installation. This paper discusses the impact of Fresnel zone obstructions and multipath effects on radio frequency propagation and reports an optimal antenna and system configuration. Many of the lessons learned in the longwall case are immediately applicable to other underground mining operations, particularly wherever there is a high degree of obstruction from mining equipment.

  10. MOS PARAMETER EXTRACTION AND OPTIMIZATION WITH GENETIC ALGORITHM

    OpenAIRE

    BAŞAK, M.Emin; KUNTMAN, Ayten; Kuntman, Hakan

    2010-01-01

    Extracting an optimal set of parameter values for a MOS device is of great importance in contemporary technology and is a complex problem. Traditional methods of parameter extraction can produce far-from-optimal solutions because of the presence of local optima in the solution space. Genetic algorithms are well suited to finding near-optimal solutions in irregular parameter spaces. In this study, we have applied a genetic algorithm to the problem of device model parameter extraction and are able to pr...

  11. Uncertainties in the Item Parameter Estimates and Robust Automated Test Assembly

    Science.gov (United States)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; de Jong, Martijn G.

    2013-01-01

    Item response theory parameters have to be estimated, and because of the estimation process, they do have uncertainty in them. In most large-scale testing programs, the parameters are stored in item banks, and automated test assembly algorithms are applied to assemble operational test forms. These algorithms treat item parameters as fixed values,…

  12. High dimensional real parameter optimization with teaching learning based optimization

    OpenAIRE

    Anima Naik; Suresh Chandra Satapathy; K. Parvathi

    2012-01-01

    In this paper, a new optimization technique known as Teaching–Learning-Based Optimization (TLBO) is implemented for solving high dimensional function optimization problems. Although several other approaches address this issue, their computational cost grows in handling high dimensional problems. In this work we simulate TLBO for high dimensional benchmark function optimizations and compare its results with very widely used alternate techniques like Differential Evolution...

  13. Automated Portfolio Optimization Based on a New Test for Structural Breaks

    Directory of Open Access Journals (Sweden)

    Tobias Berens

    2014-04-01

    Full Text Available We present a completely automated optimization strategy which combines the classical Markowitz mean-variance portfolio theory with a recently proposed test for structural breaks in covariance matrices. With respect to equity portfolios, global minimum-variance optimizations, which rely solely on the covariance matrix, have yielded considerable results in previous studies. However, financial assets cannot be assumed to have a constant covariance matrix over longer periods of time. Hence, we estimate the covariance matrix of the assets by respecting potential change points. The resulting approach resolves the issue of determining a sample for parameter estimation. Moreover, we investigate whether this approach is also appropriate for timing the reoptimizations. Finally, we apply the approach to two datasets and compare the results to relevant benchmark techniques by means of an out-of-sample study. It is shown that the new approach outperforms equally weighted portfolios and plain minimum-variance portfolios on average.
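The global minimum-variance building block used here has a closed form, w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A two-asset sketch with a hand-inverted 2x2 covariance matrix (the covariance numbers are toys, not from the study):

```python
# Global minimum-variance weights for two assets: w = inv(Σ)·1 / (1ᵀ·inv(Σ)·1).
def gmv_weights_2assets(cov):
    (a, b), (_, d) = cov                      # symmetric 2x2 covariance matrix
    det = a * d - b * b
    inv = [[d / det, -b / det], [-b / det, a / det]]
    raw = [sum(row) for row in inv]           # inv(Σ) · 1
    total = sum(raw)                          # 1ᵀ · inv(Σ) · 1
    return [x / total for x in raw]           # normalize so weights sum to 1

cov = [[0.04, 0.01], [0.01, 0.09]]            # toy variances and covariance
w = gmv_weights_2assets(cov)
print([round(x, 4) for x in w])  # [0.7273, 0.2727]
```

The paper's contribution sits upstream of this formula: the structural-break test decides which sample window feeds the covariance estimate.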

  14. High dimensional real parameter optimization with teaching learning based optimization

    Directory of Open Access Journals (Sweden)

    Anima Naik

    2012-10-01

    Full Text Available In this paper, a new optimization technique known as Teaching–Learning-Based Optimization (TLBO) is implemented for solving high dimensional function optimization problems. Although several other approaches address this issue, their computational cost grows in handling high dimensional problems. In this work we simulate TLBO for high dimensional benchmark function optimizations and compare its results with the very widely used alternate techniques Differential Evolution (DE) and Particle Swarm Optimization (PSO). Results clearly reveal that TLBO is able to address the computational cost issue for all simulated functions up to large dimensions compared to the other two techniques.
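The two TLBO phases (teacher and learner) can be sketched as follows. This is a simplified illustration on the sphere benchmark with arbitrary population size, iteration count, and bounds, not the paper's implementation.

```python
import random

random.seed(7)

def sphere(x):                       # benchmark objective: sum of squares
    return sum(v * v for v in x)

def tlbo(f, dim=10, pop_size=20, iters=50, lo=-5.0, hi=5.0):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        # Teacher phase: move each learner toward the best solution, away from
        # the class mean, scaled by a random teaching factor TF in {1, 2}.
        teacher = min(pop, key=f)
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i, x in enumerate(pop):
            tf = random.choice([1, 2])
            cand = [x[d] + random.random() * (teacher[d] - tf * mean[d])
                    for d in range(dim)]
            if f(cand) < f(x):       # greedy acceptance
                pop[i] = cand
        # Learner phase: each learner moves toward a better random partner,
        # or away from a worse one.
        for i, x in enumerate(pop):
            j = random.randrange(pop_size)
            if j == i:
                continue
            sign = 1.0 if f(x) < f(pop[j]) else -1.0
            cand = [x[d] + sign * random.random() * (x[d] - pop[j][d])
                    for d in range(dim)]
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

best = tlbo(sphere)
print(sphere(best))
```

Note TLBO's selling point, visible in the sketch: apart from population size and iteration count, it has no algorithm-specific tuning parameters, unlike DE's scale factor or PSO's inertia weights.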

  15. A Discrete Particle Swarm Optimization to Estimate Parameters in Vision Tasks

    Directory of Open Access Journals (Sweden)

    Benchikhi Loubna

    2016-01-01

    Full Text Available The majority of manufacturers demand increasingly powerful vision systems for quality control. To obtain good outcomes, the installation requires effort in tuning the vision system, for both hardware and software. As time and accuracy are important, practitioners aim to automate the optimization of parameter adjustment, at least in image processing. This paper suggests an approach based on discrete particle swarm optimization (DPSO) that automates software settings and provides optimal parameters for industrial vision applications. Novel update functions for our DPSO formulation are suggested. The proposed method is applied to real examples of quality control to validate its feasibility and efficiency, showing that the new DPSO model furnishes promising results.

  16. Optimization of selected RFID system parameters

    Directory of Open Access Journals (Sweden)

    Peter Vestenicky

    2004-01-01

    Full Text Available This paper describes a procedure for maximizing the read range of an RFID transponder. This is done by optimizing the magnetic field intensity at the transponder's location and by optimizing the coupling factor between the antenna and transponder coils. The results of this paper can be used for RFID with an inductive loop, i.e. systems working in the near electromagnetic field.

  17. Optimizing RF gun cavity geometry within an automated injector design system

    Energy Technology Data Exchange (ETDEWEB)

    Alicia Hofler, Pavel Evtushenko

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and in a relatively short time identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.

  18. The structure of optimal parameters for image restoration problems

    OpenAIRE

    de los Reyes, J. C.; Schönlieb, C. B.; Valkonen, T.

    2015-01-01

    We study the qualitative properties of optimal regularisation parameters in variational models for image restoration. The parameters are solutions of bilevel optimisation problems with the image restoration problem as constraint. A general type of regulariser is considered, which encompasses total variation (TV), total generalized variation (TGV) and infimal-convolution total variation (ICTV). We prove that under certain conditions on the given data optimal parameters derived by bilevel optim...

  19. A critical analysis of parameter adaptation in ant colony optimization

    OpenAIRE

    PELLEGRINI, Paola; Stützle, Thomas; Birattari, Mauro

    2012-01-01

    Applying parameter adaptation means operating on parameters of an algorithm while it is tackling an instance. For ant colony optimization, several parameter adaptation methods have been proposed. In the literature, these methods have been shown to improve the quality of the results achieved in some particular contexts. In particular, they proved to be successful when applied to novel ant colony optimization algorithms for tackling problems that are not a classical testbed for optimization alg...

  20. Automated Modal Parameter Estimation of Civil Engineering Structures

    OpenAIRE

    Andersen, Palle; Brincker, Rune; Goursat, Maurice; Mevel, Laurent

    2007-01-01

    In this paper the problem of automatic modal parameter extraction for ambient-excited civil engineering structures is considered. Two different approaches for obtaining the modal parameters automatically are presented: the Frequency Domain Decomposition (FDD) technique and a correlation-driven Stochastic Subspace Identification (SSI) technique. Finally, the techniques are demonstrated on real data.

  1. Automated Modal Parameter Estimation of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Goursat, Maurice;

    In this paper the problem of automatic modal parameter extraction for ambient-excited civil engineering structures is considered. Two different approaches for obtaining the modal parameters automatically are presented: the Frequency Domain Decomposition (FDD) technique and a correlation-driven Stochastic Subspace Identification (SSI) technique. Finally, the techniques are demonstrated on real data...

  2. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, and a negative effect defined as out-of-the-loop (OOTL). Thus, before introducing automation into the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method for finding an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate with the concept of the level of ostracism. The cognitive automation rate expresses the reduction in human cognitive load, while the level of ostracism expresses the difficulty of obtaining information from the automation system and the increased uncertainty of the human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by experiment, and the automation rate is estimated by the suggested estimation method. This approach is expected to yield a proportion of automation that avoids the OOTL problem while achieving maximum efficacy.

  3. A New Approach for Parameter Optimization in Land Surface Model

    Institute of Scientific and Technical Information of China (English)

    LI Hongqi; GUO Weidong; SUN Guodong; ZHANG Yaocun; FU Congbin

    2011-01-01

    In this study, a new parameter optimization method was used to investigate the expansion of conditional nonlinear optimal perturbation (CNOP) in a land surface model (LSM), using long-term enhanced field observations at Tongyu station in Jilin Province, China, combined with a sophisticated LSM (the common land model, CoLM). Tongyu station is a reference site of the international Coordinated Energy and Water Cycle Observations Project (CEOP) that has studied semiarid regions that have undergone desertification, salination, and degradation since the late 1960s. Three key land-surface parameters, namely soil color, proportion of sand or clay in soil, and leaf-area index, were chosen as the parameters to be optimized. Our study comprised three experiments: the first performed a single-parameter optimization, while the second and third performed triple- and six-parameter optimizations, respectively. Notable improvements in simulating sensible heat flux (SH), latent heat flux (LH), soil temperature (TS), and moisture (MS) at shallow layers were achieved using the optimized parameters. The multiple-parameter optimization experiments performed better than the single-parameter experiment. All results demonstrate that the CNOP method can be used to optimize expanded parameters in an LSM. Moreover, its clear mathematical meaning, simple design structure, and rapid computability give this method great potential for further application to parameter optimization in LSMs.

  4. Optimization of maintenance system parameters for transport aircraft

    Directory of Open Access Journals (Sweden)

    І.І. Ліннік

    2006-01-01

    Full Text Available The algorithm of unconditional and conditional optimization of Markov models of transport-airplane maintenance systems, used in improving their programs of technical operation, is considered.

  5. Parameter Optimization Based on GA and HFSS

    Institute of Scientific and Technical Information of China (English)

    SUN Shu-hui; WANG Bing-zhong

    2005-01-01

    A new project based on the genetic algorithm (GA) and high frequency simulation software (HFSS) is proposed to optimize microwave passive devices effectively. The project is realized as a general program, named the optimization program, compiled in Matlab and the macro language of HFSS, which together provide a fast and effective way to accomplish the task. Two examples are used to show the project's feasibility.

  6. The Robustness Optimization of Parameter Estimation in Chaotic Control Systems

    Directory of Open Access Journals (Sweden)

    Zhen Xu

    2014-10-01

    Full Text Available The standard particle swarm optimization algorithm suffers from poor adaptability and weak robustness in the parameter estimation model of chaotic control systems. In light of this, this paper puts forward a new estimation model based on an improved particle swarm optimization algorithm. It first constrains the search space of the population with Tent and Logistic double mapping to regulate the initialized population, optimizes the fitness value with an evolutionary state identification strategy so as to avoid premature convergence, optimizes the inertia weight with a nonlinear decrease strategy to reach better global and local optimal solutions, and then optimizes the iteration of the particle swarm algorithm with the hybridization concept from genetic algorithms. Finally, the model is applied to parameter estimation for chaotic system control. Simulation results show that the proposed parameter estimation model achieves higher accuracy, better anti-noise ability and stronger robustness than the model based on standard particle swarm optimization.
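
Two of the ingredients the abstract lists can be sketched compactly: a Tent plus Logistic "double mapping" to spread the initial population, and a nonlinear decrease of the inertia weight over the run. The specific map forms and the quadratic decrease schedule are common choices assumed here for illustration, not necessarily the paper's exact formulas:

```python
def tent(z):
    # Tent map over (0, 1).
    return 2.0 * z if z < 0.5 else 2.0 * (1.0 - z)

def logistic(z):
    # Logistic map over (0, 1).
    return 4.0 * z * (1.0 - z)

def chaotic_population(n, dim, lo, hi, seed=0.37):
    # Chain the two maps to generate a chaotic sequence, then rescale each
    # value into the search interval [lo, hi] to place the particles.
    z, pop = seed, []
    for _ in range(n):
        particle = []
        for _ in range(dim):
            z = logistic(tent(z))
            particle.append(lo + (hi - lo) * z)
        pop.append(particle)
    return pop

def inertia(t, t_max, w_max=0.9, w_min=0.4):
    # Nonlinear (quadratic) decrease of the inertia weight: broad exploration
    # early in the run, tighter local search toward the end.
    return w_min + (w_max - w_min) * (1.0 - t / t_max) ** 2
```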

  7. Automated Estimation of the Orbital Parameters of Jupiter's Moons

    Science.gov (United States)

    Western, Emma; Ruch, Gerald T.

    2016-01-01

    Every semester the Physics Department at the University of St. Thomas has the Physics 104 class complete a Jupiter lab. This involves taking around twenty images of Jupiter and its moons with the telescope at the University of St. Thomas Observatory over the course of a few nights. The students then take each image and find the distance from each moon to Jupiter and plot the distances versus the elapsed time for the corresponding image. Students use the plot to fit four sinusoidal curves of the moons of Jupiter. I created a script that automates this process for the professor. It takes the list of images and creates a region file used by the students to measure the distance from the moons to Jupiter, a png image that is the graph of all the data points and the fitted curves of the four moons, and a csv file that contains the list of images, the date and time each image was taken, the elapsed time since the first image, and the distances to Jupiter for Io, Europa, Ganymede, and Callisto. This is important because it lets the professor spend more time working with the students and answering questions as opposed to spending time fitting the curves of the moons on the graph, which can be time consuming.
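
If each moon's orbital period is treated as known (they are tabulated), fitting a sinusoid to the measured distances reduces to linear least squares on a sin/cos basis, which is presumably close to what such a script automates. A self-contained sketch; the period value and measurement cadence in the test are illustrative:

```python
import math

def fit_sinusoid(times, dists, period):
    # Linear least squares for d(t) = a*sin(w t) + b*cos(w t), w = 2*pi/period,
    # solved in closed form from the 2x2 normal equations.
    w = 2.0 * math.pi / period
    s = [math.sin(w * t) for t in times]
    c = [math.cos(w * t) for t in times]
    ss = sum(si * si for si in s)
    cc = sum(ci * ci for ci in c)
    sc = sum(si * ci for si, ci in zip(s, c))
    sd = sum(si * di for si, di in zip(s, dists))
    cd = sum(ci * di for ci, di in zip(c, dists))
    det = ss * cc - sc * sc
    a = (sd * cc - cd * sc) / det
    b = (cd * ss - sd * sc) / det
    # Equivalent form d(t) = A*sin(w t + phi): A*cos(phi) = a, A*sin(phi) = b.
    return math.hypot(a, b), math.atan2(b, a)
```

The recovered amplitude is the projected orbital radius in image units, and the phase fixes where the moon sits in its orbit at the first exposure.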

  8. Architecture of Automated Database Tuning Using SGA Parameters

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2012-05-01

    Full Text Available Business data always grows, from kilobytes to megabytes, gigabytes, terabytes, petabytes, and beyond. There is no way to avoid this growth while the business keeps running. Because of this, database tuning is a critical part of an information system, and tuning a database in a cost-effective manner is a growing challenge. The total cost of ownership (TCO) of information technology needs to be significantly reduced by minimizing people costs. In fact, mistakes in the operation and administration of information systems are the single largest cause of system outages and unacceptable performance [3]. One way of addressing the challenge of total cost of ownership is to make information systems more self-managing. A particularly difficult piece of the ambitious vision of making database systems self-managing is the automation of database performance tuning. In this paper, we explain the progress made thus far on this important problem. Specifically, we propose an architecture and an algorithm for this problem.

  9. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area Az under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal number of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. 
The same SA scheme, however, was trapped in a local minimum on the first cost surface

  10. Simultaneous optimal experimental design for in vitro binding parameter estimation.

    Science.gov (United States)

    Ernest, C Steven; Karlsson, Mats O; Hooker, Andrew C

    2013-10-01

    Simultaneous optimization of in vitro ligand binding studies using an optimal design software package that can incorporate multiple design variables through non-linear mixed effect models and provide a general optimized design regardless of the binding site capacity and relative binding rates for a two binding system. Experimental design optimization was employed with D- and ED-optimality using PopED 2.8 including commonly encountered factors during experimentation (residual error, between experiment variability and non-specific binding) for in vitro ligand binding experiments: association, dissociation, equilibrium and non-specific binding experiments. Moreover, a method for optimizing several design parameters (ligand concentrations, measurement times and total number of samples) was examined. With changes in relative binding site density and relative binding rates, different measurement times and ligand concentrations were needed to provide precise estimation of binding parameters. However, using optimized design variables, significant reductions in number of samples provided as good or better precision of the parameter estimates compared to the original extensive sampling design. Employing ED-optimality led to a general experimental design regardless of the relative binding site density and relative binding rates. Precision of the parameter estimates were as good as the extensive sampling design for most parameters and better for the poorly estimated parameters. Optimized designs for in vitro ligand binding studies provided robust parameter estimation while allowing more efficient and cost effective experimentation by reducing the measurement times and separate ligand concentrations required and in some cases, the total number of samples. PMID:23943088
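
The idea behind D-optimality can be shown with a one-parameter dissociation model y(t) = exp(-k t): the Fisher information carried by a sample at time t is the squared sensitivity (dy/dk)^2, and the D-optimal measurement time maximizes it, landing analytically at t = 1/k. A sketch with illustrative numbers (not the paper's PopED setup, which handles multiple parameters, design variables, and error models):

```python
import math

def sensitivity(t, k):
    # dy/dk for y(t) = exp(-k t): how informative a sample at time t is about k.
    return -t * math.exp(-k * t)

def d_optimal_time(k, t_max=50.0, steps=100000):
    # One-parameter D-optimality: maximize the scalar Fisher information s(t)^2.
    return max((t_max * i / steps for i in range(1, steps + 1)),
               key=lambda t: sensitivity(t, k) ** 2)

k_off = 0.25                     # dissociation rate constant (illustrative)
t_star = d_optimal_time(k_off)   # numerically close to 1/k = 4.0
```

With several parameters the scalar information becomes a Fisher information matrix and the criterion becomes its determinant, which is what dedicated tools optimize over times, concentrations, and sample counts jointly.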

  11. Optimization of the main parameters of the subsoil irrigation systems

    OpenAIRE

    Elena Akytneva; Askar Akhmedov

    2014-01-01

    This article discusses the optimization of the basic parameters of subsoil irrigation systems using a second-order Regardera design. The optimal parameters obtained can be used for the design and construction of this irrigation method.

  12. An Automated Tool for Optimization of FMS Scheduling With Meta Heuristic Approach

    Directory of Open Access Journals (Sweden)

    A. V. S. Sreedhar Kumar

    2014-03-01

    Full Text Available The evolution of manufacturing systems has reflected the needs and requirements of the market, which vary from time to time. Flexible manufacturing systems have contributed much to the development of efficient manufacturing processes and to the production of a variety of customized, limited-volume products according to market demand and customer needs. Scheduling of an FMS is a crucial operation in maximizing throughput, reducing waste and increasing the overall efficiency of the manufacturing process. The dynamic nature of flexible manufacturing systems makes them unique, and hence a generalized scheduling solution is difficult to abstract. Any solution for optimizing scheduling should take into account a multitude of parameters. The primary objective of the proposed research is to design a tool that automates the optimization of the scheduling process by searching for solutions in the search space using meta-heuristic approaches. The research also validates the use of reward as a means of optimizing the scheduling by including it as one of the parameters in the combined objective function.

  13. Adaptive Parameters for a Modified Comprehensive Learning Particle Swarm Optimizer

    OpenAIRE

    Yu-Jun Zheng; Hai-Feng Ling; Qiu Guan

    2012-01-01

    Particle swarm optimization (PSO) is a stochastic optimization method sensitive to parameter settings. The paper presents a modification on the comprehensive learning particle swarm optimizer (CLPSO), which is one of the best performing PSO algorithms. The proposed method introduces a self-adaptive mechanism that dynamically changes the values of key parameters including inertia weight and acceleration coefficient based on evolutionary information of individual particles and the swarm during ...

  14. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  15. Identifying Model Parameters of Semiconductor Devices Using Optimization Techniques

    OpenAIRE

    Hruškovič, Lubomir; Grabner, Martin; Dobeš, Josef

    2007-01-01

    The optimization is an indispensable tool for extracting the parameters of any complicated models. Hence, advanced optimization techniques are also necessary for identifying the model parameters of semiconductor devices because their current models are very sophisticated (especially the BJT and MOSFET ones). The equations of such models contain typically one hundred parameters. Therefore, the measurement and particularly identification of the full set of the model para...

  16. The reliability parameters definition in radioelectronic devices automated designing systems

    Directory of Open Access Journals (Sweden)

    Yu. F. Zinkovskiy

    2012-11-01

    Full Text Available Problems of calculating the reliability parameters of radioelectronic devices, as determined by their thermal modes, are considered. It is shown that such calculations should be based on methods for determining the temperatures of individual components of the electronic structure of a radio electronic device (RED). Methods for calculating the thermal modes of electronic blocks, cells and microassemblies are considered. Analytical models may be used for the average temperatures of cells in a block; a system of heat-exchange equations is proposed for estimating radio-component temperatures on the cell plate; and an analytical solution is offered for estimating microassembly temperature. Analytical mathematical models for calculating the reliability indexes of radio components and of the whole RED are determined.

  17. ADVANTG An Automated Variance Reduction Parameter Generator, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, Scott W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevill, Aaron M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ibrahim, Ahmad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daily, Charles R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wagner, John C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Grove, Robert E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-01

    The primary objective of ADVANTG is to reduce both the user effort and the computational time required to obtain accurate and precise tally estimates across a broad range of challenging transport applications. ADVANTG has been applied to simulations of real-world radiation shielding, detection, and neutron activation problems. Examples of shielding applications include material damage and dose rate analyses of the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source and High Flux Isotope Reactor (Risner and Blakeman 2013) and the ITER Tokamak (Ibrahim et al. 2011). ADVANTG has been applied to a suite of radiation detection, safeguards, and special nuclear material movement detection test problems (Shaver et al. 2011). ADVANTG has also been used in the prediction of activation rates within light water reactor facilities (Pantelias and Mosher 2013). In these projects, ADVANTG was demonstrated to significantly increase the tally figure of merit (FOM) relative to an analog MCNP simulation. The ADVANTG-generated parameters were also shown to be more effective than manually generated geometry splitting parameters.

  18. Automated design and optimization of flexible booster autopilots via linear programming, volume 1

    Science.gov (United States)

    Hauser, F. D.

    1972-01-01

    A nonlinear programming technique was developed for the automated design and optimization of autopilots for large flexible launch vehicles. This technique, which resulted in the COEBRA program, uses the iterative application of linear programming. The method deals directly with the three main requirements of booster autopilot design: to provide (1) good response to guidance commands; (2) response to external disturbances (e.g. wind) to minimize structural bending moment loads and trajectory dispersions; and (3) stability with specified tolerances on the vehicle and flight control system parameters. The method is applicable to very high order systems (30th and greater per flight condition). Examples are provided that demonstrate the successful application of the employed algorithm to the design of autopilots for both single and multiple flight conditions.

  19. Temporal Parameter Optimization in Four-Dimensional Flash Trajectory Imaging

    International Nuclear Information System (INIS)

    In four-dimensional flash trajectory imaging, the temporal parameters include the time delay, laser pulse width, gate time, pulse-pair repetition frequency and CCD frame rate, which directly impact the acquisition of target trajectories over time. We propose a method of optimizing the temporal parameters of flash trajectory imaging. All the temporal parameters can be estimated from the spatial parameters of the volume of interest, the target scale and velocity, and the target sample number. The formulae for optimizing the temporal parameters are derived, and the method is demonstrated in an experiment with a ball oscillating as a pendulum. (general)
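
The dependencies the abstract describes follow from standard range gating: the delay is the round-trip time to the near edge of the volume of interest, the gate time covers the volume's depth plus the laser pulse width, and the frame rate is set by how far the target may move between samples. A sketch with illustrative values (the paper's exact formulae may differ in detail):

```python
C = 3.0e8  # speed of light in m/s (approximate)

def gate_timing(r_min, r_max, pulse_width):
    # Open the gate when light returns from the near edge of the volume of
    # interest; keep it open over the volume depth plus the pulse width.
    delay = 2.0 * r_min / C
    gate = 2.0 * (r_max - r_min) / C + pulse_width
    return delay, gate

def frame_rate(target_speed, target_scale, samples_per_scale=10):
    # Choose the frame rate so the target moves only a small fraction of its
    # own size between consecutive frames.
    return target_speed * samples_per_scale / target_scale

# Volume of interest from 50 m to 65 m, 10 ns laser pulse (illustrative).
delay, gate = gate_timing(50.0, 65.0, 10e-9)
```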

  20. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    Science.gov (United States)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  1. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the most important criteria for an automated line, as it directly determines output and profit. Productivity must be forecast accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this model cannot match actual productivity closely enough, because several loss parameters are not considered, it must be enhanced to include the losses missing from the current model. This paper presents the productivity-loss parameters investigated using the DMAIC (Define, Measure, Analyze, Improve, Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, Eliminate). The investigated parameters are important for further improvement of the mathematical model of productivity with availability, toward a robust productivity model for automated lines.
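
A common textbook form of productivity with availability for a serial automated line of q stations is Q = A / (t_m/q + t_a), with A = MTBF / (MTBF + MTTR). This form and the numbers below are assumed here for illustration, not taken verbatim from the paper; the loss parameters the DMAIC study identifies would derate Q further:

```python
def availability(mtbf, mttr):
    # Fraction of time the line is up: MTBF / (MTBF + MTTR).
    return mtbf / (mtbf + mttr)

def productivity(machining_time, aux_time, stations, mtbf, mttr):
    # Q = A / (t_m / q + t_a): cycle productivity of a serial automated line,
    # derated by availability; additional loss parameters are not modeled here.
    cycle = machining_time / stations + aux_time
    return availability(mtbf, mttr) / cycle

# Illustrative numbers: machining time 60 split over 10 stations,
# auxiliary time 3, MTBF 95, MTTR 5 (all in the same time unit).
q = productivity(machining_time=60.0, aux_time=3.0, stations=10, mtbf=95.0, mttr=5.0)
```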

  2. Modeling and performance optimization of automated antenna alignment for telecommunication transceivers

    Directory of Open Access Journals (Sweden)

    Md. Ahsanul Hoque

    2015-09-01

    Full Text Available Antenna alignment is very cumbersome in the telecommunication industry, and it especially affects microwave (MW) links through environmental anomalies or physical degradation over time. While in recent years the more conventional approach of redundancy has been employed, novel automation techniques are needed to ensure line-of-sight (LOS) link stability. The basic principle is to capture the desired Received Signal Level (RSL) by means of an outdoor unit installed at the tower top and to analyze the RSL in an indoor unit through a GUI interface. We propose a new smart antenna system in which automation is initiated when the transceivers receive low signal strength and report the finding to a processing comparator unit. A series architecture is used that includes a loop antenna, RCX Robonics, and a LabVIEW interface coupled with a tunable external controller. Denavit–Hartenberg parameters are used in the analytical modeling, and numerous control techniques have been investigated to overcome overshoot problems in the transport link. With this novel approach, a solution is put forward for the communication industry whereby any antenna can achieve optimal directivity for the desired RSL with low overshoot and a fast steady-state response.

  3. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    In recent years MEMS have seen very rapid development. Although many advances have been achieved, the multiphysics nature of MEMS means that their design is still a difficult task, carried out mainly by hand calculation. To help overcome these difficulties, attempts to automate MEMS design have been made. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and Process Level. The state of the art of automated MEMS design at each of these levels is investigated.

  4. Structural Parameter Optimization of Multilayer Conductors in HTS Cable

    Institute of Scientific and Technical Information of China (English)

    Yan Mao; Jie Qiu; Xin-Ying Liu; Zhi-Xuan Wang; Shu-Hong Wang; Jian-Guo Zhu; You-Guang Guo; Zhi-Wei Lin; Jian-Xun Jin

    2008-01-01

    In this paper, the design optimization of the structural parameters of multilayer conductors in high temperature superconducting (HTS) cable is reviewed. Various optimization methods, such as particle swarm optimization (PSO), the genetic algorithm (GA), and a robust optimization method based on design for six sigma (DFSS), have been applied to realize uniform current distribution among the multilayer HTS conductors. Continuous and discrete variables, such as the winding angle, radius, and winding direction of each layer, are chosen as the design parameters. Under the constraints of the mechanical properties and critical current, PSO proves to be a more powerful tool than GA for structural parameter optimization, and DFSS can not only achieve a uniform current distribution, but also significantly improve the reliability and robustness of the HTS cable quality.

  5. Collective Tuning Initiative: automating and accelerating development and optimization of computing systems

    OpenAIRE

    Fursin, Grigori

    2009-01-01

    Computing systems rarely deliver best possible performance due to ever increasing hardware and software complexity and limitations of the current optimization technology. Additional code and architecture optimizations are often required to improve execution time, size, power consumption, reliability and other important characteristics of computing systems. However, it is often a tedious, repetitive, isolated and time consuming process. In order to automate, simplify ...

  6. OPTIMIZATION OF PARAMETERS OF ELEMENTS COMPUTER SYSTEM

    Directory of Open Access Journals (Sweden)

    Nesterov G. D.

    2016-03-01

    Full Text Available The work is devoted to the topical issue of increasing computer performance and is experimental in character, so a number of tests and an analysis of their results are presented. The article first gives the basic characteristics of the computer's modules in the regular operating mode, and then describes the technique for adjusting their parameters during the experiment. Special attention is paid to maintaining the necessary thermal regime in order to avoid overheating the central processor, and the operability of the system under increased energy consumption is checked. The most delicate step is tuning the central processor; as a result of the tests, its optimal voltage, frequency and memory-read delays are found. The stability of the RAM characteristics, in particular the state of its buses during the experiment, is analyzed. Since these tests remained within the standard range of the modules' characteristics, and therefore did not exercise the safety margin and capacity built into the system, further experiments were performed at extreme overclocking under air cooling. The results obtained are presented in this article.

  7. Setting of the Optimal Parameters of Melted Glass

    Czech Academy of Sciences Publication Activity Database

    Luptáková, Natália; Matejíčka, L.; Krečmer, N.

    2015-01-01

    Roč. 10, č. 1 (2015), s. 73-79. ISSN 1802-2308 Institutional support: RVO:68081723 Keywords : Striae * Glass * Glass melting * Regression * Optimal parameters Subject RIV: JH - Ceramics, Fire-Resistant Materials and Glass

  8. Optimal z-axis scanning parameters for gynecologic cytology specimens

    OpenAIRE

    Amber D Donnelly; Mukherjee, Maheswari S.; Lyden, Elizabeth R.; Bridge, Julia A.; Subodh M Lele; Najia Wright; Mary F McGaughey; Culberson, Alicia M.; Adam J. Horn; Whitney R Wedel; Stanley J Radio

    2013-01-01

    Background: The use of virtual microscopy (VM) in clinical cytology has been limited due to the inability to focus through three dimensional (3D) cell clusters with a single focal plane (2D images). Limited information exists regarding the optimal scanning parameters for 3D scanning. Aims: The purpose of this study was to determine the optimal number of the focal plane levels and the optimal scanning interval to digitize gynecological (GYN) specimens prepared on SurePath™ glass slides while m...

  9. Integral Optimization of Systematic Parameters of Flip-Flow Screens

    Institute of Scientific and Technical Information of China (English)

    翟宏新

    2004-01-01

    A synthetic index Ks for evaluating flip-flow screens is proposed and systematically optimized with respect to the whole system. A series of optimized values of the relevant parameters is found and then compared with current industrial specifications. The results show that the optimized index Ks approaches that of the leading flip-flow screens worldwide. Some new findings on geometric and kinematic parameters are useful for improving flip-flow screens with a low Ks value, which in turn supports the development of clean coal technology.

  10. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    OpenAIRE

    Tianhong Song; Sven Köhler; Bertram Ludäscher; James Hanken; Maureen Kelly; David Lowery; Macklin, James A.; Morris, Paul J.; Morris, Robert A.

    2014-01-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfal...

  11. Multi-objective Genetic Algorithm for System Identification and Controller Optimization of Automated Guided Vehicle

    OpenAIRE

    Xing Wu; Peihuang Lou; Dunbing Tang

    2011-01-01

    This paper presents a multi-objective genetic algorithm (MOGA) with Pareto optimality and elitist tactics for the control system design of automated guided vehicle (AGV). The MOGA is used to identify AGV driving system model and optimize its servo control system sequentially. In system identification, the model identified by least square method is adopted as an evolution tutor who selects the individuals having balanced performances in all objectives as elitists. In controller optimization, t...

  12. Genetic Algorithm Optimizes Q-LAW Control Parameters

    Science.gov (United States)

    Lee, Seungwon; von Allmen, Paul; Petropoulos, Anastassios; Terrile, Richard

    2008-01-01

    A document discusses a multi-objective genetic algorithm designed to optimize Lyapunov feedback control law (Q-law) parameters in order to efficiently find Pareto-optimal solutions for low-thrust trajectories for electric propulsion systems. These would be propellant-optimal solutions for a given flight time, or flight-time-optimal solutions for a given propellant requirement. The approximate solutions are used as good initial solutions for high-fidelity optimization tools. When the good initial solutions are used, the high-fidelity optimization tools quickly converge to a locally optimal solution near the initial solution. Q-law control parameters are represented as real-valued genes in the genetic algorithm. The performances of the Q-law control parameters are evaluated in the multi-objective space (flight time vs. propellant mass) and sorted by the non-dominated sorting method, which assigns a better fitness value to the solutions that are dominated by a fewer number of other solutions. With the ranking result, the genetic algorithm encourages the solutions with higher fitness values to participate in the reproduction process, improving the solutions in the evolution process. The population of solutions converges to the Pareto front that is permitted within the Q-law control parameter space.
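    The domination-count ranking described in this record can be sketched in a few lines (an illustrative Python sketch, not the authors' code; the sample objective pairs are invented):

```python
# Sketch of domination-count ranking: each solution is scored by how many
# others dominate it; fewer dominators means a fitter solution.

def dominates(a, b):
    """True if a dominates b when minimizing both objectives
    (e.g. flight time and propellant mass)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def domination_counts(population):
    """For each solution, count how many other solutions dominate it.
    A count of 0 means the solution lies on the current Pareto front."""
    return [sum(dominates(q, p) for q in population if q is not p)
            for p in population]

solutions = [(10.0, 5.0), (8.0, 7.0), (12.0, 4.0), (11.0, 6.0)]
print(domination_counts(solutions))  # → [0, 0, 0, 1]
```

    Only (11.0, 6.0) is dominated (by (10.0, 5.0)); the other three are mutually non-dominated and form the front.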

  13. Screening of optimization parameters for mixing process via CFD

    International Nuclear Information System (INIS)

    In this study, numerical simulation of a mixing vessel agitated by a six-bladed Rushton turbine has been carried out to investigate the effects of relevant parameters on the mixing process. The study is intended to screen the potential parameters which affect the optimization process and to provide detailed insight into the process. Three-dimensional, steady-state flow has been simulated using the fully predictive Multiple Reference Frame (MRF) technique for the impeller and tank geometry. Process optimization is always used to ensure that optimum conditions are fulfilled to meet industrial needs (for example, higher profit, lower cost and better yields). In this study, the recommended speeds to accelerate optimization are 100, 150 and 200 rpm, and the recommended clearances are 50, 75 and 100 mm for the dual Rushton impeller. Computational fluid dynamics (CFD) was thus introduced in order to screen the suitable parameters efficiently and to accelerate optimization. (Author)

  14. Optimal filtering, parameter tracking, and control of nonlinear nuclear reactors

    International Nuclear Information System (INIS)

    This paper presents a new formulation of a class of nonlinear optimal control problems in which the system's signals are noisy and some system parameters are changing arbitrarily with time. The methodology is validated with an application to a nonlinear nuclear reactor model. A variational technique based on Pontryagin's maximum principle is used to filter the noisy signals, estimate the time-varying parameters, and calculate the optimal controls. The reformulation of the variational technique as an initial value problem allows this microprocessor-based algorithm to perform on-line filtering, parameter tracking, and control

  15. Optimizing hadoop parameter settings with gene expression programming guided PSO

    OpenAIRE

    Huang, Z; Li, M; Taylor, GA; Khan, M

    2016-01-01

    Hadoop MapReduce has become a major computing technology in support of big data analytics. The Hadoop framework has over 190 configuration parameters, and some of them can have a significant effect on the performance of a Hadoop job. Manually tuning these parameters to their optimum or near-optimum values is a challenging and time-consuming task. This paper optimizes the performance of Hadoop by automatically tuning its configuration parameter settings. The proposed work first em...

  16. Optimizing chirped laser pulse parameters for electron acceleration in vacuum

    Energy Technology Data Exchange (ETDEWEB)

    Akhyani, Mina; Jahangiri, Fazel; Niknam, Ali Reza; Massudi, Reza, E-mail: r-massudi@sbu.ac.ir [Laser and Plasma Research Institute, Shahid Beheshti University, Tehran 1983969411 (Iran, Islamic Republic of)

    2015-11-14

    Electron dynamics in the field of a chirped linearly polarized laser pulse is investigated. Variations of electron energy gain versus chirp parameter, time duration, and initial phase of laser pulse are studied. Based on maximizing laser pulse asymmetry, a numerical optimization procedure is presented, which leads to the elimination of rapid fluctuations of gain versus the chirp parameter. Instead, a smooth variation is observed that considerably reduces the accuracy required for experimentally adjusting the chirp parameter.

  17. Influence of constructive characteristics of a room on the parameters of regulators of automated climatic systems

    Directory of Open Access Journals (Sweden)

    Samarin Oleg Dmitrievich

    Full Text Available Currently, the successful development of the construction industry depends on improved energy performance of buildings, structures and facilities, as well as on the quality assurance of the indoor climate. Accordingly, buildings should be designed and operated to maintain the set-point values of the indoor climate, serviced by automated climate control systems, at minimal energy consumption. This paper studies the relationship between the individual parameters of indoor thermal stability and the regulatory action of automatic control systems (ACS). We analyzed the effect of structural room characteristics on the total energy consumption of the air handling unit in order to ensure energy saving. The final result is illustrated by numerical simulation with a purpose-built computer program and by graphic examples. The proposed method is based on the assumption that the total thermal stability of the «room-ACVS-ACS» system is defined by the heat absorption index of the room and the ACS control action; this follows directly from the series connection of the units corresponding to the room and the ACVS in the scheme of automatic indoor climate control. Further study allowed the authors to trace the influence of structural room characteristics on the total energy consumption needed for air intake treatment, using the areas of the main walling. Based on the developed algorithm, the authors made calculations using a computer program written in Fortran. Fragments of the program are presented, together with the calculated values of the parameters entering the expressions, the total specific energy consumption for heating the intake air during the heating season under varying room geometry, and graphic illustrations of the obtained relationships.

  18. Genetic algorithm parameter optimization: applied to sensor coverage

    Science.gov (United States)

    Sahin, Ferat; Abbate, Giuseppe

    2004-08-01

    Genetic algorithms (GAs) are powerful tools which, when set upon a solution space, will search for the optimal answer. These algorithms, however, have some problems inherent to the method, such as premature convergence and lack of population diversity. These problems can be controlled with changes to certain parameters such as crossover, selection, and mutation. This paper attempts to tackle these problems in GAs by having another GA control these parameters. The values for the crossover parameter are: one point, two point, and uniform. The values for the selection parameter are: best, worst, roulette wheel, inside 50%, outside 50%. The values for the mutation parameter are: random and swap. The system includes a control GA whose population consists of different parameter settings. While this GA attempts to find the best parameters, it also advances into the search space of the problem and refines the population. As the population changes due to the search, so do the optimal parameters. For every control-GA generation, each individual in the population is tested for fitness by being run through the problem GA with the assigned parameters. During these runs the population used in the next control generation is compiled. Thus, finding the best parameters and solving the problem are attacked at the same time. The goal is to optimize sensor coverage in a square field. The test case was a 30 by 30 unit field with 100 sensor nodes, each with a coverage area of 3 by 3 units. The algorithm attempts to optimize the sensor coverage in the field by moving the nodes. The results show that the control GA provides better results when compared to a system with no parameter changes.
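    The control-GA scheme described above can be sketched as follows (a hedged illustration with hypothetical names; the scoring function is a stub standing in for a full run of the problem GA, and its preferred setting is invented):

```python
# Illustrative sketch of a control GA whose individuals are parameter settings
# (crossover, selection, mutation) for an inner problem GA.
import random

CROSSOVERS = ["one_point", "two_point", "uniform"]
SELECTIONS = ["best", "worst", "roulette", "inside_50", "outside_50"]
MUTATIONS = ["random", "swap"]

def score_setting(setting):
    """Stand-in for running the problem GA (e.g. sensor placement) with this
    setting and returning its best achieved fitness (toy scoring)."""
    cx, sel, mut = setting
    return 2.0 * (cx == "uniform") + 1.0 * (sel == "roulette") + 0.5 * (mut == "swap")

def control_ga(pop_size=10, generations=25, seed=0):
    rng = random.Random(seed)
    pop = [(rng.choice(CROSSOVERS), rng.choice(SELECTIONS), rng.choice(MUTATIONS))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score_setting, reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            p1, p2 = rng.choice(parents), rng.choice(parents)
            child = [p1[0], p2[1], rng.choice((p1[2], p2[2]))]  # recombine genes
            for g, pool in enumerate((CROSSOVERS, SELECTIONS, MUTATIONS)):
                if rng.random() < 0.25:          # per-gene mutation
                    child[g] = rng.choice(pool)
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=score_setting)

best_setting = control_ga()
```

    Because the top half of the population is carried over unchanged, the best setting found so far is never lost while the search continues.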

  19. SPOT: An R Package For Automatic and Interactive Tuning of Optimization Algorithms by Sequential Parameter Optimization

    CERN Document Server

    Bartz-Beielstein, Thomas

    2010-01-01

    The sequential parameter optimization (SPOT) package for R is a toolbox for tuning and understanding simulation and optimization algorithms. Model-based investigations are common approaches in simulation and optimization. Sequential parameter optimization has been developed, because there is a strong need for sound statistical analysis of simulation and optimization algorithms. SPOT includes methods for tuning based on classical regression and analysis of variance techniques; tree-based models such as CART and random forest; Gaussian process models (Kriging), and combinations of different meta-modeling approaches. This article exemplifies how SPOT can be used for automatic and interactive tuning.

  20. Automatic parameter optimizer (APO) for multiple-point statistics

    Science.gov (United States)

    Bani Najar, Ehsanollah; Sharghi, Yousef; Mariethoz, Gregoire

    2016-04-01

    Multiple-point statistics (MPS) have gained popularity in recent years for generating stochastic realizations of complex natural processes. The main principle is that a training image (TI) is used to represent the spatial patterns to be modeled. One important feature of MPS is that the spatial model of the generated fields consists of 1) the chosen TI and 2) a set of algorithmic parameters that are specific to each MPS algorithm. While the choice of a training image can be guided by expert knowledge (e.g. for geological modeling) or by data acquisition methods (e.g. remote sensing), determining the algorithmic parameters can be more challenging. To date, only specific guidelines have been proposed for some simulation methods, and a general parameter inference methodology is still lacking, in particular for complex modeling settings such as when using multivariate training images. The common practice consists in carrying out an extensive parameter sensitivity analysis, which can be cumbersome. An additional complexity is that the algorithmic parameters do influence CPU cost, so finding optimal parameters is not only a modeling question but also a computational challenge. To overcome these issues, we propose the automatic parameter optimizer (MPS-APO), a generic method based on stochastic optimization to rapidly determine acceptable parameters, in different settings and for any MPS method. The MPS automatic parameter optimizer proceeds in two steps. In the first step, it considers the set of input parameters of a given MPS algorithm and formulates an objective function that quantifies the reproduction of spatial patterns. The Simultaneous Perturbation Stochastic Approximation (SPSA) optimization method is used to minimize the objective function. SPSA is chosen because it is able to deal with the stochastic nature of the objective function and for its computational efficiency. At each iteration, small gaps are randomly placed in the input image
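    The SPSA method that MPS-APO builds on is compact enough to sketch generically (illustrative Python; the gain constants and the quadratic stand-in objective are assumptions, not values from this record):

```python
# Generic SPSA sketch: the gradient is approximated from only two (possibly
# noisy) objective evaluations per iteration, whatever the dimension.
import random

def spsa_minimize(f, theta, iterations=200, a=0.1, c=0.1):
    """Simultaneous Perturbation Stochastic Approximation."""
    theta = list(theta)
    for k in range(1, iterations + 1):
        ak = a / k ** 0.602                      # commonly used decay exponents
        ck = c / k ** 0.101
        delta = [random.choice((-1.0, 1.0)) for _ in theta]  # Rademacher perturbation
        plus = [t + ck * d for t, d in zip(theta, delta)]
        minus = [t - ck * d for t, d in zip(theta, delta)]
        g = (f(plus) - f(minus)) / (2.0 * ck)    # directional difference quotient
        theta = [t - ak * g / d for t, d in zip(theta, delta)]
    return theta

random.seed(0)
# Quadratic bowl standing in for the pattern-reproduction error minimized by MPS-APO.
theta_best = spsa_minimize(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2,
                           [0.0, 0.0])
```

    The appeal in this setting is the evaluation budget: two objective calls per iteration regardless of the number of algorithmic parameters being tuned.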

  1. APPLICATION OF GENETIC ALGORITHMS FOR ROBUST PARAMETER OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    N. Belavendram

    2010-12-01

    Full Text Available Parameter optimization can be achieved by many methods, such as Monte Carlo, full, and fractional factorial designs. Genetic algorithms (GA) are fairly recent in this respect but afford a novel method of parameter optimization. In a GA, there is an initial pool of individuals, each with its own specific phenotypic trait expressed as a ‘genetic chromosome’. Different genes enable individuals with different fitness levels to reproduce according to natural reproductive gene theory. This reproduction is established in terms of selection, crossover and mutation of the reproducing genes. The resulting child generation of individuals has a better fitness level, akin to natural selection, namely evolution. Populations evolve towards the fittest individuals. Such a mechanism has a parallel application in parameter optimization. Factors in a parameter design can be expressed as a genetic analogue in a pool of sub-optimal random solutions. Allowing this pool of sub-optimal solutions to evolve over several generations produces fitter generations converging to a pre-defined engineering optimum. In this paper, a genetic algorithm is used to study a seven-factor non-linear equation for a Wheatstone bridge as the equation to be optimized. A comparison of the full factorial design against the GA method shows that the GA method is about 1200 times faster in finding a comparable solution.

  2. An Automated Tool for Optimizing Waste Transportation Routing and Scheduling

    International Nuclear Information System (INIS)

    An automated software tool has been developed and implemented to increase the efficiency and overall life-cycle productivity of site cleanup by scheduling vehicle and container movement between waste generators and disposal sites on the Department of Energy's Oak Ridge Reservation. The software tool identifies the best routes or accepts specifically requested routes and transit times, looks at fleet availability, selects the most cost effective route for each waste stream, and creates a transportation schedule in advance of waste movement. This tool was accepted by the customer and has been implemented. (authors)

  3. Estimating cellular parameters through optimization procedures: elementary principles and applications

    Directory of Open Access Journals (Sweden)

    Akatsuki Kimura

    2015-03-01

    Full Text Available Construction of quantitative models is a primary goal of quantitative biology, which aims to understand cellular and organismal phenomena in a quantitative manner. In this article, we introduce optimization procedures to search for parameters in a quantitative model that can reproduce experimental data. The aim of optimization is to minimize the sum of squared errors (SSE) in a prediction or to maximize likelihood. A (local) maximum of likelihood or (local) minimum of the SSE can efficiently be identified using gradient approaches. Addition of a stochastic process enables us to identify the global maximum/minimum without becoming trapped in local maxima/minima. Sampling approaches take advantage of increasing computational power to test numerous sets of parameters in order to determine the optimum set. By combining Bayesian inference with gradient or sampling approaches, we can estimate both the optimum parameters and the form of the likelihood function related to the parameters. Finally, we introduce four examples of research that utilize parameter optimization to obtain biological insights from quantified data: transcriptional regulation, bacterial chemotaxis, morphogenesis, and cell cycle regulation. With practical knowledge of parameter optimization, cell and developmental biologists can develop realistic models that reproduce their observations and thus obtain mechanistic insights into phenomena of interest.
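    The SSE-minimizing gradient approach mentioned above can be illustrated with a toy linear model (an assumed example, not from the article; the data points are invented):

```python
# Fitting model parameters by minimizing the sum of squared errors (SSE)
# with plain gradient descent, for a linear model y = a*x + b.

def sse(params, data):
    """SSE between the model y = a*x + b and observed (x, y) pairs."""
    a, b = params
    return sum((y - (a * x + b)) ** 2 for x, y in data)

def fit(data, lr=0.01, steps=2000):
    a, b = 0.0, 0.0
    for _ in range(steps):
        # Analytic gradient of the SSE with respect to a and b.
        ga = sum(-2 * x * (y - (a * x + b)) for x, y in data)
        gb = sum(-2 * (y - (a * x + b)) for x, y in data)
        a, b = a - lr * ga, b - lr * gb
    return a, b

data = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.2)]  # roughly y = 2x + 1
a, b = fit(data)
```

    On this convex problem gradient descent finds the (global) least-squares solution; for the multi-modal likelihoods discussed in the article, the stochastic or sampling additions become necessary.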

  4. Parameter estimation for chaotic systems by particle swarm optimization

    International Nuclear Information System (INIS)

    Parameter estimation for chaotic systems is an important issue in nonlinear science and has attracted increasing interest from various research fields; it can essentially be formulated as a multi-dimensional optimization problem. As a novel evolutionary computation technique, particle swarm optimization (PSO) has attracted much attention and wide application, owing to its simple concept, easy implementation and quick convergence. However, to the best of our knowledge, there is no published work on PSO for estimating parameters of chaotic systems. In this paper, a PSO approach is applied to estimate the parameters of the Lorenz system. Numerical simulation and comparisons demonstrate the effectiveness and robustness of PSO. Moreover, the effect of population size on optimization performance is investigated as well.
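    A bare-bones PSO of the kind applied in this record might look as follows (an illustrative sketch; a real study would score candidate parameters by simulating the Lorenz system and measuring trajectory mismatch, which is replaced here by a simple quadratic stand-in):

```python
# Minimal particle swarm optimization: each particle tracks its personal best,
# and all particles are attracted to the swarm's global best.
import random

def pso(objective, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest

random.seed(1)
# Stand-in objective whose minimum (10, 28, 8/3) mimics recovering the
# classic Lorenz parameters sigma, rho, beta.
target = (10.0, 28.0, 8.0 / 3.0)
est = pso(lambda p: sum((a - b) ** 2 for a, b in zip(p, target)), 3, (0.0, 50.0))
```

    The inertia weight w and acceleration coefficients c1, c2 are common textbook values, not ones taken from the paper.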

  5. Parameter optimization of pharmacokinetics based on artificial immune network

    Institute of Scientific and Technical Information of China (English)

    LIU Li; ZHOU Shao-dan; LU Hong-wen; XIE Fen; XU Wen-bo

    2008-01-01

    A new method for parameter optimization of pharmacokinetics based on an artificial immune network, named PKAIN, is proposed. To improve the local searching ability of the artificial immune network, a partition-based concurrent simplex mutation is developed. By means of the evolution of network cells in the PKAIN artificial immune network, an optimal set of parameters of a given pharmacokinetic model is obtained. The Laplace transform is applied to the pharmacokinetic differential equations of remifentanil and its major metabolite, remifentanil acid. The PKAIN method is used to optimize parameters of the derived compartment models. Experimental results show that the two-compartment model is sufficient for the pharmacokinetic study of remifentanil acid for patients with a mild degree of renal impairment.

  6. Complicated problem solution techniques in optimal parameter searching

    International Nuclear Information System (INIS)

    An algorithm is presented for a global search in the numerical solution of multidimensional, multiextremal, multicriteria optimization problems with complicated constraints. Boundedness of changes in the object's characteristics is assumed under restricted changes of its parameters (Lipschitz condition). The algorithm was realized as a computer code, and the program was used to solve various applied optimization problems in practice. 10 refs.; 3 figs

  7. Damage localization using experimental modal parameters and topology optimization

    OpenAIRE

    Niemann, Hanno; Morlier, Joseph; Shahdin, Amir; Gourinat, Yves

    2010-01-01

    This work focuses on the development of a damage detection and localization tool using the Topology Optimization feature of MSC.Nastran. The approach is based on the correlation of a local stiffness loss and the change in modal parameters due to damage in structures. The loss in stiffness is accounted for by the Topology Optimization approach for updating undamaged numerical models towards similar models with embedded damages. Hereby, only a mass penalization and the changes in experimentally obta...

  8. Aerodynamic optimization by simultaneously updating flow variables and design parameters

    Science.gov (United States)

    Rizk, M. H.

    1990-01-01

    The application of conventional optimization schemes to aerodynamic design problems leads to inner-outer iterative procedures that are very costly. An alternative approach is presented based on the idea of updating the flow variable iterative solutions and the design parameter iterative solutions simultaneously. Two schemes based on this idea are applied to problems of correcting wind tunnel wall interference and optimizing advanced propeller designs. The first of these schemes is applicable to a limited class of two-design-parameter problems with an equality constraint. It requires the computation of a single flow solution. The second scheme is suitable for application to general aerodynamic problems. It requires the computation of several flow solutions in parallel. In both schemes, the design parameters are updated as the iterative flow solutions evolve. Computations are performed to test the schemes' efficiency, accuracy, and sensitivity to variations in the computational parameters.

  9. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    Science.gov (United States)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing-type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient-based nonlinear programming techniques to search for improved designs. For these techniques to be practical, a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization application, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  10. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons. PMID:27106692
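    The Latin hypercube sampling technique mentioned above can be sketched in a few lines (a generic illustration of the sampling idea, not the authors' weighting protocol):

```python
# Latin hypercube sampling: n points in the unit hypercube such that each
# dimension is stratified into n equal-width bins with exactly one sample per bin.
import random

def latin_hypercube(n_samples, n_dims, rng=random):
    samples = []
    # For each dimension, shuffle the bin order independently.
    perms = [rng.sample(range(n_samples), n_samples) for _ in range(n_dims)]
    for i in range(n_samples):
        # Jitter each sample uniformly within its assigned bin.
        point = [(perms[d][i] + rng.random()) / n_samples for d in range(n_dims)]
        samples.append(point)
    return samples

random.seed(0)
pts = latin_hypercube(5, 2)
```

    The stratification guarantees every marginal range is covered even with few samples, which is why LHS is useful for probing how each error function responds across parameter space before fixing weights.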

  11. Concurrently adjusting interrelated control parameters to achieve optimal engine performance

    Science.gov (United States)

    Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna

    2015-12-01

    Methods and systems for real-time engine control optimization are provided. A value of an engine performance variable is determined, a value of a first operating condition and a value of a second operating condition of a vehicle engine are detected, and initial values for a first engine control parameter and a second engine control parameter are determined based on the detected first operating condition and the detected second operating condition. The initial values for the first engine control parameter and the second engine control parameter are adjusted based on the determined value of the engine performance variable to cause the engine performance variable to approach a target engine performance variable. In order to cause the engine performance variable to approach the target engine performance variable, adjusting the initial value for the first engine control parameter necessitates a corresponding adjustment of the initial value for the second engine control parameter.

  12. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    OpenAIRE

    Suleimanov, Yury V.; Green, William H.

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps, using single- and double-ended transition-state optimization algorithms in cooperation - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of importance in combustion and atmospheric chemistry is investigated. The proposed algorithm allowed us to detect without any human intervention not on...

  13. OPTIMIZATION OF ELECTROCHEMICAL MACHINING PROCESS PARAMETERS USING TAGUCHI APPROACH

    Directory of Open Access Journals (Sweden)

    R.Goswami

    2013-05-01

    Full Text Available In this research paper, the Taguchi method is applied to find optimum process parameters for electrochemical machining (ECM). The objective of the experimental investigation is to study the impact of machining parameters on the material removal rate (MRR) and surface roughness (SR) of workpieces of aluminum and mild steel. The approach was based on Taguchi's method, analysis of variance and the signal-to-noise (S/N) ratio to optimize the ECM process parameters for effective machining and to predict the optimal choice for each ECM parameter, such as voltage, tool feed and current. Three levels of each parameter are considered in the experiment. An L9 orthogonal array is used, varying A, B and C respectively; for each combination three experiments were conducted, and with the help of the S/N ratio the optimum results for ECM are found. It was confirmed that the determined optimal combination of ECM process parameters satisfies the real need for machining of aluminum and mild steel in actual practice.
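    The S/N ratios referred to above are the standard Taguchi larger-the-better and smaller-the-better definitions, simple to compute (the response values below are hypothetical, for illustration only):

```python
# Standard Taguchi signal-to-noise ratios over repeated trials of one setting.
import math

def sn_larger_is_better(values):
    """S/N = -10 log10( mean(1/y^2) ), used for responses like MRR."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in values) / len(values))

def sn_smaller_is_better(values):
    """S/N = -10 log10( mean(y^2) ), used for responses like surface roughness."""
    return -10.0 * math.log10(sum(y ** 2 for y in values) / len(values))

# Three repeats of one hypothetical L9 trial:
mrr = [2.1, 2.3, 2.2]    # material removal rate
sr = [0.81, 0.78, 0.84]  # surface roughness
```

    In a Taguchi analysis, the level of each factor that maximizes the relevant S/N ratio (averaged over the orthogonal-array trials containing that level) is chosen as optimal.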

  14. Optimization of Cutting Parameters Based on Surface Roughness and Assistance of Workpiece Surface Temperature in Turning Process

    Directory of Open Access Journals (Sweden)

    Adeel H. Suhail

    2010-01-01

    Full Text Available Problem statement: In machining operations, the quality of surface finish is an important requirement for many turned workpieces. Thus, the choice of optimized cutting parameters is very important for controlling the required surface quality. Approach: The focus of the present experimental study is to optimize the cutting parameters using two performance measures, workpiece surface temperature and surface roughness. Optimal cutting parameters for each performance measure were obtained employing Taguchi techniques. The orthogonal array, signal-to-noise ratio and analysis of variance were employed to study the performance characteristics in the turning operation. Results: The experimental results showed that the workpiece surface temperature can be sensed and used effectively as an indicator to control the cutting performance and improve the optimization process. Conclusion: Thus, it is possible to increase machine utilization and decrease production cost in an automated manufacturing environment.

  15. Classical algorithms for automated parameter-search methods in compartmental neural models - A critical survey based on simulations using neuron

    International Nuclear Information System (INIS)

Parameter-search methods are problem-sensitive. All methods depend on some meta-parameters of their own, which must be determined experimentally in advance. A better choice of these intrinsic parameters for a certain parameter-search method may improve its performance. Moreover, there are various implementations of the same method, which may also affect its performance. The choice of the matching (error) function has a great impact on the search process, both in terms of finding the optimal parameter set and of minimizing the computational cost. An initial assessment of the matching function's ability to distinguish between good and bad models is recommended before launching exhaustive computations. Note that different runs of a parameter-search method may result in the same optimal parameter set or in different parameter sets (in the latter case, the model is insufficiently constrained to accurately characterize the real system). Robustness of the parameter set is expressed by the extent to which small perturbations of the parameter values do not affect the best solution. A parameter set that is not robust is unlikely to be physiologically relevant. Robustness can also be defined as the stability of the optimal parameter set to small variations of the inputs. When trying to estimate quantities such as the minimum, or the least-squares optimal parameters of a nonlinear system, the existence of multiple local minima can cause problems with the determination of the global optimum. Techniques such as Newton's method, the Simplex method and the Least-squares Linear Taylor Differential correction technique can be useful provided that one is lucky enough to start sufficiently close to the global minimum. All these methods suffer from the inability to distinguish a local minimum from a global one because they follow the local gradients towards the minimum, even if some methods reset the search direction when the search appears to be stuck in a local minimum. Deterministic methods based on
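
The local-versus-global distinction made here is easy to demonstrate: a purely local descent started in the wrong basin never finds the global optimum, while restarting it from many points usually does. The double-well objective and the step-halving descent below are illustrative stand-ins, not any of the survey's actual methods.

```python
import random

def f(x):
    # Double-well objective: local minimum near x = +1.56, global near x = -1.59
    return (x**2 - 2.5)**2 + 0.5 * x

def local_descent(x, step=0.5, tol=1e-6):
    """Greedy 1-D descent with step halving -- it only follows local improvement."""
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step /= 2.0
    return x

random.seed(1)
starts = [random.uniform(-3.0, 3.0) for _ in range(20)]   # multi-start strategy
minima = [local_descent(s) for s in starts]
best = min(minima, key=f)   # best of all basins found
```

A single run from x = 2.5 converges to the positive (local) well; only the multi-start wrapper recovers the global minimum in the negative well.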

  16. Wrapped Progressive Sampling Search for Optimizing Learning Algorithm Parameters

    NARCIS (Netherlands)

    Bosch, Antal van den

    2005-01-01

We present a heuristic meta-learning search method for finding a set of optimized algorithmic parameters for a range of machine learning algorithms. The method, wrapped progressive sampling, is a combination of classifier wrapping and progressive sampling of training data. A series of experiments

  17. Introduction of IMRT in Macedonia: optimizing the MLC parameters

    International Nuclear Information System (INIS)

Intensity modulated radiotherapy (IMRT) with the Varian Eclipse Treatment Planning System (TPS) requires optimization of the values of two parameters of the Multi Leaf Collimator (MLC) – the transmission of the MLC and the so-called Dosimetric Leaf Gap (DLG). This paper describes the optimization of those parameters for one of the linear accelerators at the University Clinic for Radiotherapy and Oncology in Skopje. The starting values for the MLC parameters were determined by dose measurements with ionization chambers. Those measured values were introduced into the TPS and an IMRT test plan was created. The test plan was used for irradiation of the two-dimensional chamber array 'MatriXX', and the measured results were compared with the corresponding results calculated by the TPS. By iteratively changing the two MLC parameters we optimized their values, so that the calculation corresponds to the measurement as closely as possible. The final results of the optimization were introduced into the TPS, enabling calculation of IMRT plans and allowing the clinic to proceed towards clinical introduction of this radiotherapy technique. (Author)

  18. FindFoci: a focus detection algorithm with automated parameter training that closely matches human assignments, reduces human inconsistencies and increases speed of analysis.

    Directory of Open Access Journals (Sweden)

    Alex D Herbert

Full Text Available Accurate and reproducible quantification of the accumulation of proteins into foci in cells is essential for data interpretation and for biological inferences. To improve reproducibility, much emphasis has been placed on the preparation of samples, but less attention has been given to reporting and standardizing the quantification of foci. The current standard to quantitate foci in open-source software is to manually determine a range of parameters based on the outcome of one or a few representative images and then apply the parameter combination to the analysis of a larger dataset. Here, we demonstrate the power and utility of using machine learning to train a new algorithm (FindFoci) to determine optimal parameters. FindFoci closely matches human assignments and allows rapid automated exploration of parameter space. Thus, individuals can train the algorithm to mirror their own assignments and then automate focus counting using the same parameters across a large number of images. Using the training algorithm to match human assignments of foci, we demonstrate that applying an optimal parameter combination from a single image is not broadly applicable to analysis of other images scored by the same experimenter or by other experimenters. Our analysis thus reveals wide variation in human assignment of foci and their quantification. To overcome this, we developed training on multiple images, which reduces the inconsistency of using a single or a few images to set parameters for focus detection. FindFoci is provided as an open-source plugin for ImageJ.

  19. Optimization of polyetherimide processing parameters for optical interconnect applications

    Science.gov (United States)

    Zhao, Wei; Johnson, Peter; Wall, Christopher

    2015-10-01

ULTEM® polyetherimide (PEI) resins have been used in opto-electronic markets since the optical properties of these materials enable the design of critical components under tight tolerances. PEI resins are the material of choice for injection molded integrated lens applications due to good dimensional stability, near infrared (IR) optical transparency, low moisture uptake and high heat performance. In most applications, parts must be produced consistently with minimal deviations to ensure compatibility throughout the lifetime of the part. With the large number of lenses needed for this market, injection molding has been optimized to maximize the production rate. These optimized parameters for high throughput may or may not translate to an optimized optical performance. In this paper, we evaluate and optimize PEI injection molding processes with a focus on optical property performance. A commonly used commercial grade was studied to determine factors and conditions which contribute to optical transparency, color, and birefringence. Melt temperature, mold temperature, injection speed and cycle time were varied to develop optimization trials and evaluate optical properties. These parameters could be optimized to reduce in-plane birefringence from 0.0148 to 0.0006 in this study. In addition, we have studied an optically smooth, sub-10nm roughness mold to re-evaluate material properties with minimal influence from mold quality and further refine resin and process effects for the best optical performance.

  20. AMMOS: Automated Molecular Mechanics Optimization tool for in silico Screening

    Directory of Open Access Journals (Sweden)

    Pajeva Ilza

    2008-10-01

Full Text Available Abstract Background Virtual or in silico ligand screening combined with other computational methods is one of the most promising approaches to search for new lead compounds, thereby greatly assisting the drug discovery process. Despite considerable progress made in virtual screening methodologies, available computer programs do not easily address problems such as: structural optimization of compounds in a screening library, receptor flexibility/induced-fit, and accurate prediction of protein-ligand interactions. It has been shown that structural optimization of chemical compounds and post-docking optimization in multi-step structure-based virtual screening approaches help to further improve the overall efficiency of the methods. To address some of these points, we developed the program AMMOS for refining both the 3D structures of the small molecules present in chemical libraries and the predicted receptor-ligand complexes, allowing partial to full atom flexibility through molecular mechanics optimization. Results The program AMMOS carries out an automatic procedure that allows for the structural refinement of compound collections and energy minimization of protein-ligand complexes using the open source program AMMP. The performance of our package was evaluated by comparing the structures of small chemical entities minimized by AMMOS with those minimized with the Tripos and MMFF94s force fields. Next, AMMOS was used for fully flexible minimization of protein-ligand complexes obtained from a multi-step virtual screening. Enrichment studies of the selected pre-docked complexes containing 60% of the initially added inhibitors were carried out with or without final AMMOS minimization on two protein targets having different binding pocket properties. AMMOS was able to improve the enrichment after the pre-docking stage, with 40 to 60% of the initially added active compounds found in the top 3% to 5% of the entire compound collection

  1. Parameters Optimization of Low Carbon Low Alloy Steel Annealing Process

    Institute of Scientific and Technical Information of China (English)

    Maoyu ZHAO; Qianwang CHEN

    2013-01-01

A suitable match of annealing process parameters is critical for obtaining a fine microstructure in the material. Low carbon low alloy steel (20CrMnTi) was heated for various durations near the Ac temperature to obtain fine pearlite and ferrite grains. Annealing temperature and time were used as independent variables, and material property data were acquired by orthogonal experiment design under an intercritical process followed by a subcritical annealing process (IPSAP). The weights of the plasticity measures (hardness, yield strength, section shrinkage and elongation) of the annealed material were calculated by the analytic hierarchy process, and the process parameters were then optimized using grey system theory. SEM images show that the microstructure of the optimally annealed material consists of smaller lamellar pearlites (ferrite-cementite) and refined ferrites which are distributed uniformly. Morphologies of the tension fracture surface of the optimized annealing material show finer and more numerous dimples, indicating better toughness compared with the other annealed materials. Moreover, the yield strength of the optimized annealing material decreases appreciably in the tensile test. Thus, the new optimization strategy is accurate and feasible.

  2. Novel Approach to Nonlinear PID Parameter Optimization Using Ant Colony Optimization Algorithm

    Institute of Scientific and Technical Information of China (English)

    Duan Hai-bin; Wang Dao-bo; Yu Xiu-fen

    2006-01-01

This paper presents an application of an Ant Colony Optimization (ACO) algorithm to optimize the parameters in the design of a type of nonlinear PID controller. The ACO algorithm is a novel heuristic bionic algorithm, based on the behaviour of real ants in nature searching for food. In order to optimize the parameters of the nonlinear PID controller using the ACO algorithm, an objective function based on position tracking error was constructed, and an elitist strategy was adopted in the improved ACO algorithm. Detailed simulation steps are presented. The nonlinear PID controller tuned by the ACO algorithm has high control precision and quick response.
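
The pheromone-and-elitism mechanics carry over to continuous tuning problems roughly as follows. This is a simplified ACO_R-style sketch, not the authors' algorithm: an elitist archive of good solutions plays the role of the pheromone trail, and the quadratic `tracking_error` is an invented stand-in for their position-tracking-error cost (the nonlinear PID/plant simulation is omitted for brevity).

```python
import random

def aco_continuous(objective, bounds, n_ants=20, archive=10, iters=100, q=0.3, xi=0.85, seed=0):
    """Simplified continuous ACO: ants sample Gaussians centred on members of
    an elitist solution archive, which acts as the 'pheromone' memory."""
    rng = random.Random(seed)
    dim = len(bounds)
    sols = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(archive)]
    sols.sort(key=objective)
    for _ in range(iters):
        ants = []
        for _ in range(n_ants):
            # Better-ranked archive members are chosen as guides more often (elitism)
            guide = sols[min(int(abs(rng.gauss(0.0, q * archive))), archive - 1)]
            ant = []
            for d in range(dim):
                spread = xi * sum(abs(s[d] - guide[d]) for s in sols) / (archive - 1)
                lo, hi = bounds[d]
                ant.append(min(max(rng.gauss(guide[d], spread + 1e-12), lo), hi))
            ants.append(ant)
        sols = sorted(sols + ants, key=objective)[:archive]  # keep only the elite
    return sols[0]

# Hypothetical surrogate cost with "true" controller gains (2.0, 0.5)
tracking_error = lambda g: (g[0] - 2.0)**2 + (g[1] - 0.5)**2
best_gains = aco_continuous(tracking_error, [(0.0, 4.0), (0.0, 1.0)])
```

The archive contracts around good regions over the iterations, which is the continuous analogue of pheromone reinforcement on good trails.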

  3. Automation for pattern library creation and in-design optimization

    Science.gov (United States)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.

  4. An automatic and effective parameter optimization method for model tuning

    Science.gov (United States)

    Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.

    2015-11-01

Physical parameterizations in general circulation models (GCMs), having various uncertain parameters, greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive objective evaluation metric. Different from traditional optimization methods, two extra steps, one determining the model's sensitivity to the parameters and the other choosing the optimum initial values for those sensitive parameters, are introduced before the downhill simplex method. This new method reduces the number of parameters to be tuned and accelerates the convergence of the downhill simplex method. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9 %. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding the unavoidable comprehensive parameter tuning during the model development stage.
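
The first of the three steps — finding which parameters the metric is actually sensitive to — can be sketched with one-at-a-time perturbations. The parameter names and the quadratic "skill metric" below are invented for illustration; a real GCM evaluation would replace `metric` with a model run and scoring step.

```python
def screen_parameters(objective, defaults, rel_step=0.1, threshold=0.01):
    """Rank parameters by one-at-a-time sensitivity and keep only those whose
    perturbation moves the evaluation metric noticeably."""
    base = objective(defaults)
    sensitivity = {}
    for name, value in defaults.items():
        trial = dict(defaults)
        trial[name] = value * (1.0 + rel_step)   # perturb one parameter at a time
        sensitivity[name] = abs(objective(trial) - base)
    return {n: s for n, s in sensitivity.items() if s > threshold}

# Toy stand-in for a GCM skill metric: only two of the four parameters matter
metric = lambda p: (p["entrainment"] - 1.0)**2 + 0.5 * (p["autoconv"] - 2.0)**2
defaults = {"entrainment": 1.5, "autoconv": 1.0, "ice_fall": 3.0, "cape_tau": 2.0}
sensitive = screen_parameters(metric, defaults)
```

Only the surviving parameters would then be passed to the initial-value search and the downhill simplex stage, which is what shrinks the search space and speeds convergence.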

  5. Identification of optimal parameter combinations for the emergence of bistability

    Science.gov (United States)

    Májer, Imre; Hajihosseini, Amirhossein; Becskei, Attila

    2015-12-01

    Bistability underlies cellular memory and maintains alternative differentiation states. Bistability can emerge only if its parameter range is either physically realizable or can be enlarged to become realizable. We derived a general rule and showed that the bistable range of a reaction parameter is maximized by a pair of other parameters in any gene regulatory network provided they satisfy a general condition. The resulting analytical expressions revealed whether or not such reaction pairs are present in prototypical positive feedback loops. They are absent from the feedback loop enclosed by protein dimers but present in both the toggle-switch and the feedback circuit inhibited by sequestration. Sequestration can generate bistability even at narrow feedback expression range at which cooperative binding fails to do so, provided inhibition is set to an optimal value. These results help to design bistable circuits and cellular reprogramming and reveal whether bistability is possible in gene networks in the range of realistic parameter values.

  6. Cosmological parameter estimation using Particle Swarm Optimization (PSO)

    CERN Document Server

    Prasad, Jayanti

    2011-01-01

Obtaining the set of cosmological parameters consistent with observational data is an important exercise in current cosmological research. It involves finding the global maximum of the likelihood function in the multi-dimensional parameter space. Currently, sampling-based methods, which are in general stochastic in nature, like Markov Chain Monte Carlo (MCMC), are commonly used for parameter estimation. The beauty of stochastic methods is that the computational cost grows, at most, linearly in place of exponentially (as in grid-based approaches) with the dimensionality of the search space. MCMC methods sample the full joint probability distribution (posterior) from which one- and two-dimensional probability distributions, best fit (average) values of the parameters and then error bars can be computed. In the present work we demonstrate the application of another stochastic method, named Particle Swarm Optimization (PSO), that is widely used in the field of engineering and artificial intelligence, for cosmo...
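
A minimal global-best PSO is only a few dozen lines. This is a generic sketch, not the paper's code; the two-parameter quadratic below is an invented stand-in for a negative log-likelihood surface, with its maximum-likelihood point placed at (0.3, 0.7) by construction.

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best Particle Swarm Optimization (minimization)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for -log(likelihood) over two parameters (maximum at (0.3, 0.7))
neg_log_like = lambda p: (p[0] - 0.3)**2 / 0.01 + (p[1] - 0.7)**2 / 0.04
best_params, best_val = pso(neg_log_like, [(0.0, 1.0), (0.0, 1.0)])
```

Unlike MCMC, the swarm only seeks the maximum rather than sampling the full posterior, which is exactly the trade-off discussed above.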

  7. Identification of metabolic system parameters using global optimization methods

    Directory of Open Access Journals (Sweden)

    Gatzke Edward P

    2006-01-01

Full Text Available Abstract Background The problem of estimating the parameters of dynamic models of complex biological systems from time series data is becoming increasingly important. Methods and results Particular consideration is given to metabolic systems that are formulated as Generalized Mass Action (GMA) models. The estimation problem is posed as a global optimization task, for which novel techniques can be applied to determine the best set of parameter values given the measured responses of the biological system. The challenge is that this task is nonconvex. Nonetheless, deterministic optimization techniques can be used to find a global solution that best reconciles the model parameters and measurements. Specifically, the paper employs branch-and-bound principles to identify the best set of model parameters from observed time course data and illustrates this method with an existing model of the fermentation pathway in Saccharomyces cerevisiae. This is a relatively simple yet representative system with five dependent states and a total of 19 unknown parameters whose values are to be determined. Conclusion The efficacy of the branch-and-reduce algorithm is illustrated by the S. cerevisiae example. The method described in this paper is likely to be widely applicable in the dynamic modeling of metabolic networks.
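
Branch-and-bound needs a certified lower bound on each region of parameter space. For a Lipschitz-continuous objective a simple one is f(mid) − L·(width)/2, which is enough to show the prune-and-split mechanics. The 1-D test function below is a standard multimodal benchmark, not the GMA model; the paper's branch-and-reduce method applies the same principle to multivariate nonconvex problems.

```python
import heapq, math

def bnb_minimize(f, a, b, L, tol=1e-4):
    """Deterministic global minimization by branch-and-bound on [a, b].
    L is a Lipschitz constant for f, so f(mid) - L*(hi-lo)/2 is a valid lower
    bound on [lo, hi]; intervals that cannot beat the incumbent are pruned."""
    mid = (a + b) / 2.0
    best_x, best_f = mid, f(mid)
    heap = [(best_f - L * (b - a) / 2.0, a, b, mid)]
    while heap:
        bound, lo, hi, mid = heapq.heappop(heap)
        if bound >= best_f - tol:
            continue  # prune: no point in this interval can beat the incumbent
        for l, h in ((lo, mid), (mid, hi)):   # branch: split at the midpoint
            m = (l + h) / 2.0
            fm = f(m)
            if fm < best_f:
                best_x, best_f = m, fm
            heapq.heappush(heap, (fm - L * (h - l) / 2.0, l, h, m))
    return best_x, best_f

# Standard multimodal test problem: several local minima, one global minimum
f = lambda x: math.sin(x) + math.sin(10.0 * x / 3.0)   # |f'| <= 1 + 10/3
x_star, f_star = bnb_minimize(f, 2.7, 7.5, L=4.4)
```

The pruning test is what makes the search deterministic: unlike the gradient or stochastic methods above, the returned value carries a certificate that no unexplored interval can be more than `tol` better.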

  8. Using Neural Networks to Tune Heuristic Parameters in Evolutionary Optimization

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin

Athens : WSEAS Press, 2006 - (Espi, P.; Giron-Sierra, J.; Drigas, A.), s. 1-6 ISBN 960-8457-41-6. [AIKED'06. WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases. Madrid (ES), 15.02.2006-17.02.2006] R&D Projects: GA ČR(CZ) GA201/05/0325 Institutional research plan: CEZ:AV0Z10300504 Keywords: evolutionary optimization * genetic algorithms * heuristic parameters * parameter tuning * artificial neural networks * convergence speed * population diversity Subject RIV: IN - Informatics, Computer Science

  9. Study on optimization of parameters in a biological model

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

According to the data observed in a China-Japan Joint Investigation, the parameters of an ecosystem dynamics model (Qiao et al., 2000) were optimized. The values of eighteen parameters for the model were obtained, with nutrient half-saturation constants Kn = 1.4 μmol/dm3, Kp = 0.129 μmol/dm3 and Ks = 1.16 μmol/dm3 for the diatom and Kn = 0.345 μmol/dm3, Kp = 0.113 μmol/dm3 for the flagellate. Three proposals for setting up an objective function for this multiple-objective problem were discussed in detail.

  10. A Parameter Optimization for a National SASE FEL Facility

    International Nuclear Information System (INIS)

The parameter optimization for a national SASE FEL facility was studied. The Turkish State Planning Organization (DPT) gave financial support, as an inter-university project, to begin technical design studies and a test facility for the National Accelerator Complex starting from 2006. In addition to a particle factory, the complex will contain a linac-based free electron laser, positron-ring-based synchrotron radiation facilities and a proton accelerator. In this paper, we give some results for the main parameters of the SASE FEL facility based on a 130 MeV linac and its application potential in basic and applied research

  11. EVALUATION OF ANAEMIA USING RED CELL AND RETICULOCYTE PARAMETERS USING AUTOMATED HAEMATOLOGY ANALYSER

    Directory of Open Access Journals (Sweden)

    Vidyadhar Rao

    2016-06-01

Full Text Available Current models of Automated Haematology Analysers help in calculating the haemoglobin content of mature red cells and reticulocytes and the percentages of microcytic and hypochromic red cells. This has helped the clinician in reaching an early diagnosis and management of different haemopoietic disorders like iron deficiency anaemia, thalassaemia and anaemia of chronic disease. AIM This study was conducted using an Automated Haematology Analyser to evaluate anaemia using red cell and reticulocyte parameters. Three types of anaemia were evaluated: iron deficiency anaemia, anaemia of long duration and anaemia associated with chronic disease and iron deficiency. MATERIALS AND METHODS Blood samples were collected from 287 adult patients with anaemia, differentiated depending upon their iron status, haemoglobinopathies and inflammatory activity: iron deficiency anaemia (n=132), anaemia of long duration (ACD, n=97) and anaemia associated with chronic disease with iron deficiency (ACD Combi, n=58). The microcytic red cell percentage, hypochromic red cell percentage and the haemoglobin levels in reticulocytes and mature RBCs were calculated. The accuracy of the parameters in differentiating between the types of anaemia was analysed using receiver operating characteristic analysis. OBSERVATIONS AND RESULTS There was no difference in parameters between the iron deficiency group and the group with anaemia associated with chronic disease and iron deficiency. The hypochromic red cell percentage was the best parameter for differentiating anaemia of chronic disease with or without absolute iron deficiency, with a sensitivity of 72.7% and a specificity of 70.4%. CONCLUSIONS The red cell and reticulocyte parameters were reasonably good indicators for differentiating absolute iron deficiency anaemia from anaemia of chronic disease.

  12. Using string invariants for prediction searching for optimal parameters

    Science.gov (United States)

    Bundzel, Marek; Kasanický, Tomáš; Pinčák, Richard

    2016-02-01

    We have developed a novel prediction method based on string invariants. The method does not require learning but a small set of parameters must be set to achieve optimal performance. We have implemented an evolutionary algorithm for the parametric optimization. We have tested the performance of the method on artificial and real world data and compared the performance to statistical methods and to a number of artificial intelligence methods. We have used data and the results of a prediction competition as a benchmark. The results show that the method performs well in single step prediction but the method's performance for multiple step prediction needs to be improved. The method works well for a wide range of parameters.

  13. Optimization of E. coli Cultivation Model Parameters Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Olympia Roeva

    2012-04-01

Full Text Available In this paper, a novel meta-heuristic algorithm, namely the Firefly Algorithm (FA), is adapted and applied to model parameter identification of an E. coli fed-batch cultivation process. A system of nonlinear ordinary differential equations is used to model the biomass growth and substrate utilization. Parameter optimization is performed using a real experimental data set from an E. coli MC4110 fed-batch cultivation process. The FA adjustments are made based on several pre-tests according to the optimization problem considered here. The simulation results indicate that the applied algorithm is effective and efficient: a model with a high degree of accuracy is obtained by applying the FA.
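
The FA mechanics — dimmer fireflies move towards brighter ones, with attractiveness decaying over distance — can be sketched compactly. This is a generic textbook variant, not the authors' adapted version; the two-parameter least-squares surrogate below is invented (the ODE simulation of the cultivation is omitted), with "true" parameters (1.2, 0.4).

```python
import math, random

def firefly(objective, bounds, n=30, iters=150, alpha=0.2, beta0=1.0, gamma=0.1, seed=0):
    """Minimal Firefly Algorithm minimizer."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vals = [objective(x) for x in xs]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if vals[j] < vals[i]:  # j is brighter (lower cost): move i towards j
                    r2 = sum((xs[i][d] - xs[j][d])**2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    for d in range(dim):
                        lo, hi = bounds[d]
                        step = (beta * (xs[j][d] - xs[i][d])
                                + alpha * (rng.random() - 0.5) * (hi - lo))
                        xs[i][d] = min(max(xs[i][d] + step, lo), hi)
                    vals[i] = objective(xs[i])
        alpha *= 0.97  # gradually damp the random walk, as in common FA practice
    b = min(range(n), key=lambda k: vals[k])
    return xs[b], vals[b]

# Hypothetical least-squares surrogate for the model fit; true optimum (1.2, 0.4)
fit_error = lambda p: (p[0] - 1.2)**2 + (p[1] - 0.4)**2
best_p, best_err = firefly(fit_error, [(0.0, 3.0), (0.0, 2.0)])
```

The "pre-test adjustments" mentioned in the abstract correspond to choosing `alpha`, `beta0` and `gamma` for the problem at hand.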

  14. Multidimensional optimization of signal space distance parameters in WLAN positioning.

    Science.gov (United States)

    Brković, Milenko; Simić, Mirjana

    2014-01-01

    Accurate indoor localization of mobile users is one of the challenging problems of the last decade. Besides delivering high speed Internet, Wireless Local Area Network (WLAN) can be used as an effective indoor positioning system, being competitive both in terms of accuracy and cost. Among the localization algorithms, nearest neighbor fingerprinting algorithms based on Received Signal Strength (RSS) parameter have been extensively studied as an inexpensive solution for delivering indoor Location Based Services (LBS). In this paper, we propose the optimization of the signal space distance parameters in order to improve precision of WLAN indoor positioning, based on nearest neighbor fingerprinting algorithms. Experiments in a real WLAN environment indicate that proposed optimization leads to substantial improvements of the localization accuracy. Our approach is conceptually simple, is easy to implement, and does not require any additional hardware. PMID:24757443
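The fingerprinting step whose distance parameters are being optimized can be sketched as follows. The radio map, access-point count and RSS values are invented for illustration; the Minkowski order `p` and the per-AP `weights` stand in for the tunable signal-space distance parameters the paper optimizes.

```python
def signal_distance(rss_a, rss_b, p=2.0, weights=None):
    """Parameterized (weighted Minkowski) signal-space distance between
    two RSS vectors in dBm; p and weights are the tunable parameters."""
    weights = weights or [1.0] * len(rss_a)
    return sum(w * abs(a - b)**p for w, a, b in zip(weights, rss_a, rss_b)) ** (1.0 / p)

def knn_locate(fingerprints, observed, k=2, p=2.0):
    """Average the coordinates of the k nearest reference points in signal space."""
    ranked = sorted(fingerprints, key=lambda fp: signal_distance(fp[1], observed, p))
    nearest = ranked[:k]
    return tuple(sum(coord[d] for coord, _ in nearest) / len(nearest) for d in range(2))

# Illustrative radio map: ((x, y) in metres, RSS from 3 access points in dBm)
radio_map = [
    ((0.0, 0.0), [-40, -70, -80]),
    ((5.0, 0.0), [-70, -40, -80]),
    ((0.0, 5.0), [-70, -80, -40]),
    ((5.0, 5.0), [-80, -70, -45]),
]
est = knn_locate(radio_map, [-45, -72, -75], k=1)   # closest to the (0, 0) fingerprint
```

Optimizing `p` and `weights` against a labelled calibration set — rather than fixing the Euclidean default — is the kind of tuning the abstract reports as improving localization accuracy.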

  15. Optimization of the drying parameters of a veneer roller dryer

    OpenAIRE

    Marttila, Heikki

    2014-01-01

The objective of the master's thesis was to experimentally find the optimal drying parameters for spruce heartwood veneers in terms of veneer quality and drying capacity. Quality was assessed in terms of moisture content, moisture deviation, tensile strength (across the grain direction), surface roughness, wettability, waviness and other visual defects. The strength properties of plywood were excluded from the study. The mill experiments were conducted at the UPM Pellos 3 jet roller dryer in June ...

  16. Kinetic parameter estimation from TGA: Optimal design of TGA experiments

    OpenAIRE

    Dirion, Jean-Louis; Reverte, Cédric; Cabassud, Michel

    2008-01-01

    This work presents a general methodology to determine kinetic models of solid thermal decomposition with thermogravimetric analysis (TGA) instruments. The goal is to determine a simple and robust kinetic model for a given solid with the minimum of TGA experiments. From this last point of view, this work can be seen as an attempt to find the optimal design of TGA experiments for kinetic modelling. Two computation tools were developed. The first is a nonlinear parameter estimation procedure for...

  17. Optimal upper bounds for non-negative parameters

    OpenAIRE

    Tkachov, Fyodor V.

    2009-01-01

    Using the techniques of [arXiv:0911.4271], upper bounds for a given confidence level are modified in an optimal fashion to incorporate the a priori information that the parameter being estimated is non-negative. A paradox with different confidence intervals for the same confidence level is clarified. The "lossy compression" nature of the device of confidence intervals is discussed and a "lossless" option to present results is pointed out.

  18. OPTIMIZATION OF PARAMETER FOR METAL MATRIX COMPOSITE IN WIRE EDM

    OpenAIRE

    Nagaraja, R.; K.Chandrasekaran; S.Shenbhgaraj

    2015-01-01

The bronze alumina (Al2O3) alloy is a Metal Matrix Composite (MMC) of interest in several applications like bearing sleeves, pistons and cylinder liners. The reinforcement used in this MMC makes it difficult to machine using traditional techniques. Wire Electric Discharge Machining (WEDM) seems to be a viable option. This paper presents an investigation on the optimization of machining parameters in WEDM of bronze-alumina MMC. The main objective is to find the optimum

  19. Limiting Behaviour in Parameter Optimal Iterative Learning Control

    Institute of Scientific and Technical Information of China (English)

David H. Owens; Maria Tomas-Rodriguez; Jari J. Hätönen

    2006-01-01

This paper analyses the concept of a Limit Set in Parameter Optimal Iterative Learning Control (ILC). We investigate the existence of stable and unstable parts of the Limit Set and demonstrate that they will often exist in practice. This is illustrated via a 2-dimensional example where the convergence of the learning algorithm is analyzed from the error's dynamic behaviour. These ideas are extended to the N-dimensional case by analogy and example.

  20. Automated gamma knife radiosurgery treatment planning with image registration, data-mining, and Nelder-Mead simplex optimization

    International Nuclear Information System (INIS)

Gamma knife treatments are usually planned manually, requiring much expertise and time. We describe a new, fully automatic method of treatment planning. The treatment volume to be planned is first compared with a database of past treatments to find volumes closely matching in size and shape. The treatment parameters of the closest matches are used as starting points for the new treatment plan. Further optimization is performed with the Nelder-Mead simplex method: the coordinates and weights of the isocenters are allowed to vary until a maximally conformal plan specific to the new treatment volume is found. The method was tested on a randomly selected set of 10 acoustic neuromas and 10 meningiomas. Typically, matching a new volume took under 30 seconds. The time for simplex optimization, on a 3 GHz Xeon processor, ranged from under a minute for small volumes to considerably longer for large volumes (>30 000 cubic mm, >20 isocenters). In 8/10 acoustic neuromas and 8/10 meningiomas, the automatic method found plans with a conformation number equal to or better than that of the manual plan. In 4/10 acoustic neuromas and 5/10 meningiomas, both overtreatment and undertreatment ratios were equal or better in the automated plans. In conclusion, data-mining of past treatments can be used to derive starting parameters for treatment planning. These parameters can then be computer-optimized to give good plans automatically
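
The simplex refinement stage can be sketched in a few dozen lines. This is a generic Nelder-Mead implementation, not the planning system's code, and the toy quadratic objective (an invented "distance from target weights" cost) merely stands in for a plan-conformity score.

```python
def nelder_mead(f, x0, step=0.5, iters=200, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead simplex minimizer (reflect/expand/contract/shrink)."""
    dim = len(x0)
    # Initial simplex: x0 plus one vertex perturbed along each coordinate
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(dim)] for i in range(dim)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(v[d] for v in simplex[:-1]) / dim for d in range(dim)]
        refl = [centroid[d] + alpha * (centroid[d] - worst[d]) for d in range(dim)]
        if f(refl) < f(best):
            # Reflection is the new best: try expanding further in that direction
            exp = [centroid[d] + gamma * (refl[d] - centroid[d]) for d in range(dim)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [centroid[d] + rho * (worst[d] - centroid[d]) for d in range(dim)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink the whole simplex towards the best vertex
                simplex = [best] + [
                    [best[d] + sigma * (v[d] - best[d]) for d in range(dim)]
                    for v in simplex[1:]
                ]
    return min(simplex, key=f)

# Hypothetical surrogate: penalize distance of two "isocenter weights" from a target
target = (1.0, 2.0)
cost = lambda x: (x[0] - target[0])**2 + (x[1] - target[1])**2
opt = nelder_mead(cost, [0.0, 0.0])
```

Seeding `x0` from the database of matched past treatments, as the abstract describes, is what lets a local method like this reach a good plan without a global search.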

  1. PARAMETER ESTIMATION OF VALVE STICTION USING ANT COLONY OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    S. Kalaivani

    2012-07-01

Full Text Available In this paper, a procedure for quantifying valve stiction in control loops based on ant colony optimization has been proposed. Pneumatic control valves are widely used in the process industry. The control valve contains non-linearities such as stiction, backlash, and deadband that in turn cause oscillations in the process output. Stiction is one of the long-standing problems and it is the most severe problem in control valves. Thus the measurement data from an oscillating control loop can be used as a possible diagnostic signal to provide an estimate of the stiction magnitude. Quantification of control valve stiction is still a challenging issue. Prior to stiction detection and quantification, it is necessary to choose a suitable model structure to describe control-valve stiction. To understand the stiction phenomenon, the Stenman model is used. Ant Colony Optimization (ACO), an intelligent swarm algorithm, proves effective in various fields. The ACO algorithm is inspired by the natural trail-following behaviour of ants. The parameters of the Stenman model are estimated using ant colony optimization, from the input-output data, by minimizing the error between the actual stiction model output and the simulated stiction model output. Using ant colony optimization, a Stenman model with known nonlinear structure and unknown parameters can be estimated.
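
The estimation setup can be sketched end-to-end with a one-parameter stiction band. The sinusoidal controller output and the "true" band value are synthetic, and a brute-force grid stands in for the paper's ACO search; only the stick/slip rule itself follows the Stenman-style model named above.

```python
import math

def stenman(u, d):
    """One-parameter stiction model: the valve stem x only moves when the
    controller output u departs from it by more than the band d."""
    x, out = 0.0, []
    for ut in u:
        if abs(ut - x) > d:
            x = ut          # valve slips to the demanded position
        out.append(x)       # otherwise it sticks at the old position
    return out

# Synthetic oscillating controller output and an assumed "true" stiction band
u = [math.sin(0.3 * t) for t in range(200)]
true_d = 0.25
measured = stenman(u, true_d)   # plays the role of the plant measurements

# Estimate d by minimizing the simulation error over a candidate grid
# (the paper uses ACO for this search; a grid suffices for one parameter)
sse = lambda d: sum((m - s)**2 for m, s in zip(measured, stenman(u, d)))
d_hat = min([i * 0.01 for i in range(1, 100)], key=sse)
```

The same error-minimization objective is what the ACO ants would evaluate; the swarm search only becomes necessary when the stiction model has several coupled parameters.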

  2. Analysis of optimization parameters in chest radiographs procedures

    International Nuclear Information System (INIS)

    The risks associated with ionizing radiation became evident soon after the discovery of X radiation. Therefore, any medical practice that makes use of any type of ionizing radiation should be subject to the basic principles of radiological protection: justification, optimization of protection and application of dose limits. In diagnostic radiology, this means seeking the lowest dose reasonably practicable without compromising image quality. The purpose of this project was to evaluate optimization parameters, specifically image quality, exposure levels and radiograph rejection rates, in radiological chest examinations. The image quality evaluation was performed using two forms, one for adults and the other for children, based on European standards. From the results, we conclude that the evaluated department does not comply with the principle of optimization, a reality shared by most health institutions. The entrance surface air kerma (Ka,e) results were below the national diagnostic reference levels. However, several image quality parameters showed insufficient ratings and the film rejection rates were high. The lack of optimization generates poor-quality images, causing inaccurate diagnostic reports and increasing operating costs. Therefore, the research warns of the urgency of implementing quality control and assurance programs in all radiology services in the country. (author)

  3. OPTIMIZATION OF PARAMETER FOR METAL MATRIX COMPOSITE IN WIRE EDM

    Directory of Open Access Journals (Sweden)

    R.Nagaraja

    2015-02-01

    Full Text Available The bronze-alumina (Al2O3) alloy is a Metal Matrix Composite (MMC) of interest in several applications such as bearing sleeves, pistons and cylinder liners. The reinforcement used in this MMC makes it difficult to machine using traditional techniques. Wire Electric Discharge Machining (WEDM) seems to be a viable option. This paper presents an investigation on the optimization of machining parameters in WEDM of bronze-alumina MMC. The main objective is to find the optimum cutting parameters to achieve a low value of surface roughness and a high value of material removal rate (MRR). The cutting parameters considered in this experimental study are pulse on time (Ton), pulse off time (Toff) and wire feed rate. The settings of cutting parameters were determined by using the Taguchi experimental design method. An L9 orthogonal array was chosen. Signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were used to analyze the effect of the parameters on surface roughness and to identify the optimum cutting parameters. The contribution of each cutting parameter towards surface roughness and MRR is also identified. The study shows that the Taguchi method is suitable to solve the stated problem with a minimum number of trials as compared with a full factorial design.
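
    The two Taguchi signal-to-noise criteria used in such studies follow standard formulas: smaller-the-better for surface roughness and larger-the-better for MRR. The replicate measurements below are hypothetical, not the paper's data:

    ```python
    import numpy as np

    # Hypothetical replicate measurements for one L9 trial.
    ra = np.array([2.1, 2.3, 2.0])    # surface roughness: smaller-the-better
    mrr = np.array([8.5, 9.1, 8.8])   # material removal rate: larger-the-better

    def sn_smaller_the_better(y):
        # S/N = -10 log10( mean(y^2) )
        return -10 * np.log10(np.mean(np.asarray(y) ** 2))

    def sn_larger_the_better(y):
        # S/N = -10 log10( mean(1 / y^2) )
        return -10 * np.log10(np.mean(1.0 / np.asarray(y) ** 2))

    print(sn_smaller_the_better(ra), sn_larger_the_better(mrr))  # ≈ -6.60, 18.88
    ```

    In a full Taguchi study these S/N values are averaged per factor level to build the response table; the level with the highest mean S/N is taken as optimal.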

  4. Optimizing casting parameters of steel ingot based on orthogonal method

    Institute of Scientific and Technical Information of China (English)

    张沛; 李学通; 臧新良; 杜凤山

    2008-01-01

    The influence and significance of casting parameters on the solidification process of steel ingots were discussed based on finite element method (FEM) results by the orthogonal experiment method. Range analysis, analysis of variance (ANOVA) and an optimization project were used to investigate the FEM results. In order to reduce the ingot riser head and improve the utilization ratio of the ingot, the casting parameters, involving casting temperature, pouring velocity and interface heat transfer, were optimized to decrease shrinkage porosity and microporosity. The results show that the heat transfer coefficient between melt and heated board is the more sensitive factor. It is favorable to decrease shrinkage porosity and microporosity under the conditions of low temperature, high pouring velocity and high heat transfer between melt and mold. If heat transfer in the ingot body is quicker than that in the riser, the position of shrinkage porosity and microporosity will be closer to the riser top. The results of the optimization project show that little shrinkage porosity and microporosity reach the ingot body with rational parameters, so the riser size can be reduced.

  5. Damage localization using experimental modal parameters and topology optimization

    Science.gov (United States)

    Niemann, Hanno; Morlier, Joseph; Shahdin, Amir; Gourinat, Yves

    2010-04-01

    This work focuses on the development of a damage detection and localization tool using the topology optimization feature of MSC.Nastran. The approach is based on the correlation of a local stiffness loss and the change in modal parameters due to damages in structures. The loss in stiffness is accounted for by the topology optimization approach for updating undamaged numerical models towards similar models with embedded damages. Hereby, only a mass penalization and the changes in experimentally obtained modal parameters are used as objectives. The theoretical background for the implementation of this method is derived and programmed in a Nastran input file, and the general feasibility of the approach is validated numerically, as well as experimentally, by updating a model of an experimentally tested composite laminate specimen. The damages were introduced to the specimen by controlled low-energy impacts, and high-quality vibration tests were conducted on the specimen for different levels of damage. These supervised experiments allow testing of the numerical diagnosis tool by comparing the results with both NDT techniques and results of previous works (concerning shifts in modal parameters due to damage). Good results have finally been achieved for the localization of the damages by the topology optimization.

  6. Optimization of laser butt welding parameters with multiple performance characteristics

    Science.gov (United States)

    Sathiya, P.; Abdul Jaleel, M. Y.; Katherasan, D.; Shanmugarajan, B.

    2011-04-01

    This paper presents a study carried out on 3.5 kW cooled slab laser welding of 904 L super austenitic stainless steel. The joints were butt welded with different shielding gases, namely argon, helium and nitrogen, at a constant flow rate. Super austenitic stainless steel (SASS) normally contains high amounts of Mo, Cr, Ni, N and Mn. The mechanical properties are controlled to obtain good welded joints. The quality of the joint is evaluated by studying the features of the weld bead geometry, such as bead width (BW) and depth of penetration (DOP). In this paper, the tensile strength and bead profiles (BW and DOP) of laser-welded butt joints made of AISI 904 L SASS are investigated. The Taguchi approach is used as a statistical design of experiment (DOE) technique for optimizing the selected welding parameters. Grey relational analysis and the desirability approach are applied to optimize the input parameters by considering multiple output variables simultaneously. Confirmation experiments have also been conducted for both analyses to validate the optimized parameters.
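
    The grey relational analysis step can be sketched as follows: normalize each response toward its goal, compute the deviation from the ideal sequence, convert to grey relational coefficients with a distinguishing coefficient of 0.5, and average into a grade. The four parameter settings and three responses below are invented, not the paper's data:

    ```python
    import numpy as np

    # Hypothetical responses for four parameter settings:
    # columns = [bead width (smaller better), penetration (larger better),
    #            tensile strength (larger better)]
    y = np.array([[1.9, 3.2, 510.0],
                  [1.6, 2.8, 495.0],
                  [2.2, 3.6, 530.0],
                  [1.8, 3.0, 520.0]])
    larger_better = np.array([False, True, True])

    # 1) Normalize each response to [0, 1] according to its goal.
    lo, hi = y.min(axis=0), y.max(axis=0)
    norm = np.where(larger_better, (y - lo) / (hi - lo), (hi - y) / (hi - lo))

    # 2) Grey relational coefficient with distinguishing coefficient zeta = 0.5.
    delta = 1.0 - norm                       # deviation from the ideal sequence
    zeta = 0.5
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    # 3) Grey relational grade = mean coefficient; rank the settings.
    grade = coef.mean(axis=1)
    best = int(np.argmax(grade))
    print(grade, best)                       # setting 2 has the highest grade
    ```

    Converting several responses into one grade is what lets a single Taguchi-style analysis handle bead width, penetration and strength simultaneously.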

  7. Optimizing spectral CT parameters for material classification tasks

    Science.gov (United States)

    Rigie, D. S.; La Rivière, P. J.

    2016-06-01

    In this work, we propose a framework for optimizing spectral CT imaging parameters and hardware design with regard to material classification tasks. Compared with conventional CT, many more parameters must be considered when designing spectral CT systems and protocols. These choices will impact material classification performance in a non-obvious, task-dependent way with direct implications for radiation dose reduction. In light of this, we adapt Hotelling Observer formalisms typically applied to signal detection tasks to the spectral CT, material-classification problem. The result is a rapidly computable metric that makes it possible to sweep out many system configurations, generating parameter optimization curves (POCs) that can be used to select optimal settings. The proposed model avoids restrictive assumptions about the basis-material decomposition (e.g. linearity) and incorporates signal uncertainty with a stochastic object model. This technique is demonstrated on dual-kVp and photon-counting systems for two different, clinically motivated material classification tasks (kidney stone classification and plaque removal). We show that the POCs predicted with the proposed analytic model agree well with those derived from computationally intensive numerical simulation studies.
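
    The Hotelling-observer figure of merit underlying such task-based metrics is the detectability d'^2 = Δs^T K^{-1} Δs, where Δs is the mean signal difference between the two classes and K the channel noise covariance. A minimal numerical sketch with invented two-channel (e.g. dual-kVp) statistics:

    ```python
    import numpy as np

    # Toy two-class, two-channel statistics (all numbers are assumptions).
    mean_a = np.array([110.0, 74.0])     # mean measurements, material class A
    mean_b = np.array([104.0, 70.0])     # mean measurements, material class B
    K = np.array([[9.0, 3.0],            # assumed channel noise covariance
                  [3.0, 4.0]])

    ds = mean_b - mean_a                 # class mean difference
    d2 = float(ds @ np.linalg.solve(K, ds))   # Hotelling detectability squared
    print(np.sqrt(d2))                   # ≈ 2.31
    ```

    Sweeping a system parameter (kVp pair, bin thresholds) and plotting d' against it is what produces a parameter optimization curve in the sense described above.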

  8. Optimal construction parameters of electrosprayed trilayer organic photovoltaic devices

    International Nuclear Information System (INIS)

    A detailed investigation of the optimal set of parameters employed in multilayer device fabrication through successively electrospray-deposited layers is reported. In this scheme, the donor/acceptor (D/A) bulk heterojunction layer is sandwiched between two thin stacked layers of the individual donor and acceptor materials. The stacked-layer geometry with optimal thicknesses plays a decisive role in improving operating characteristics. Among the parameters of the multilayer organic photovoltaic device, the D/A concentration ratio, blend thickness and stacking layer thicknesses are optimized. Other parameters, such as thermal annealing and the role of top metal contacts, are also discussed. The internal photon-to-current efficiency is found to attain a strong response in the 500 nm optical region for the most efficient device architectures. This observation indicates a clear interplay between photon harvesting by the active layers and transport by the ancillary stacking layers, opening up the possibility to engineer both the material fine structure and the device architecture to obtain the best photovoltaic response from a complex organic heterostructure. (paper)

  9. Mathematical Modelling and Parameter Optimization of Pulsating Heat Pipes

    CERN Document Server

    Yang, Xin-She; Luan, Tao; Koziel, Slawomir

    2014-01-01

    Proper heat transfer management is important to key electronic components in microelectronic applications. Pulsating heat pipes (PHP) can be an efficient solution to such heat transfer problems. However, mathematical modelling of a PHP system is still very challenging, due to the complexity and multiphysics nature of the system. In this work, we present a simplified, two-phase heat transfer model, and our analysis shows that it can make good predictions about startup characteristics. Furthermore, by considering parameter estimation as a nonlinear constrained optimization problem, we have used the firefly algorithm to find parameter estimates efficiently. We have also demonstrated that it is possible to obtain good estimates of key parameters using very limited experimental data.
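
    Casting parameter estimation as a constrained optimization problem, as the authors do, can be sketched with a minimal firefly algorithm. The model, data and all settings below are illustrative assumptions, not the paper's PHP model:

    ```python
    import numpy as np

    def firefly_minimize(f, lo, hi, n=25, iters=100, alpha=0.25,
                         beta0=1.0, gamma=1.0, seed=1):
        """Minimal firefly-algorithm sketch: fireflies move toward brighter
        (lower-objective) neighbours with attractiveness decaying in distance;
        alpha adds a shrinking random walk."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        x = rng.uniform(lo, hi, size=(n, lo.size))
        fit = np.array([f(xi) for xi in x])
        for t in range(iters):
            step = alpha * (hi - lo) / np.sqrt(t + 1.0)
            for i in range(n):
                for j in range(n):
                    if fit[j] < fit[i]:            # j is brighter: attract i
                        beta = beta0 * np.exp(-gamma * np.sum((x[i] - x[j]) ** 2))
                        move = beta * (x[j] - x[i]) + step * rng.uniform(-0.5, 0.5, lo.size)
                        x[i] = np.clip(x[i] + move, lo, hi)
                        fit[i] = f(x[i])
        b = int(np.argmin(fit))
        return x[b], float(fit[b])

    # Toy two-parameter model y = a * exp(-b t) fitted to noise-free data.
    t = np.linspace(0.0, 4.0, 20)
    y = 2.0 * np.exp(-0.5 * t)
    sse = lambda p: float(np.sum((p[0] * np.exp(-p[1] * t) - y) ** 2))
    p_hat, err = firefly_minimize(sse, lo=[0.0, 0.0], hi=[5.0, 2.0])
    print(p_hat, err)
    ```

    The bound constraints play the role of the nonlinear constraints in the paper; any physically motivated parameter ranges can be imposed the same way via clipping or penalties.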

  10. Optimization of Neutrino Oscillation Parameters Using Differential Evolution

    Institute of Scientific and Technical Information of China (English)

    Ghulam Mustafa; Faisal Akram; Bilal Masud

    2013-01-01

    We show how the traditional grid-based method for finding the neutrino oscillation parameters Δm² and tan²θ can be combined with an optimization technique, Differential Evolution (DE), to get a significant decrease in the computer processing time required to obtain the minimal chi-square (χ²) in four different regions of the parameter space. We demonstrate the efficiency for the two-neutrino case. For this, the χ² function for neutrino oscillations is evaluated for grids with different densities of points in the standard allowed regions of the parameter space of Δm² and tan²θ, using experimental and theoretical total event rates of the chlorine (Homestake), Gallex+GNO, SAGE, Super-Kamiokande, and SNO detectors. We find that using DE in combination with the grid-based method with a small density of points can produce results comparable with those obtained using a high-density grid, in much less computation time.
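
    The refinement step, starting DE inside a coarsely gridded region, can be sketched with SciPy's `differential_evolution`. The χ² surface below is a toy quadratic surrogate over (log10 Δm², log10 tan²θ) with an invented minimum, not the experiments' event-rate fit:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    truth = np.array([-4.8, -0.6])        # hypothetical best-fit point

    def chi2(p):
        # Convex quadratic stand-in for the real chi-square surface.
        d = np.asarray(p) - truth
        return float(d @ np.array([[40.0, 8.0], [8.0, 25.0]]) @ d)

    # Bounds would come from the coarse-grid allowed region.
    res = differential_evolution(chi2, bounds=[(-12.0, -3.0), (-4.0, 1.0)],
                                 seed=0, tol=1e-8, polish=True)
    print(res.x, res.fun)                 # recovers the minimum at `truth`
    ```

    Because DE only needs function evaluations, the same call works when `chi2` is the full event-rate comparison; the coarse grid merely supplies tight bounds so the population converges in few generations.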

  11. Optimal design of nanoplasmonic materials using genetic algorithms as a multi-parameter optimization tool

    OpenAIRE

    Yelk, Joseph; Sukharev, Maxim; Seideman, Tamar

    2008-01-01

    An optimal control approach based on multiple parameter genetic algorithms is applied to the design of plasmonic nanoconstructs with pre-determined optical properties and functionalities. We first develop nanoscale metallic lenses that focus an incident plane wave onto a pre-specified, spatially confined spot. Our results illustrate the role of symmetry breaking and unravel the principles that favor dimeric constructs for optimal light localization. Next we design a periodic array of silver p...

  12. Dose-painting IMRT optimization using biological parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yusung (Dept. of Radiation Oncology, Univ. of Iowa, Iowa City (United States)); Tome, Wolfgang A. (Dept. of Human Oncology Univ. of Wisconsin, Madison (United States)), E-mail: tome@humonc.wisc.edu

    2010-11-15

    Purpose. Our work on dose-painting based on the possible risk characteristics for local recurrence in tumor subvolumes and the optimization of treatment plans using biological objective functions that are region-specific are reviewed. Materials and methods. A series of intensity modulated dose-painting techniques are compared to their corresponding intensity modulated plans in which the entire PTV is treated to a single dose level, delivering the same equivalent uniform dose (EUD) to the entire PTV. Iso-TCP and iso-NTCP maps are introduced as a tool to aid the planner in the evaluation of the resulting non-uniform dose distributions. Iso-TCP and iso-NTCP maps are akin to iso-dose maps in 3D conformal radiotherapy. The impact of the currently limited diagnostic accuracy of functional imaging on a series of dose-painting techniques is also discussed. Results. Utilizing biological parameters (risk-adaptive optimization) in the generation of dose-painting plans results in an increase in the therapeutic ratio as compared to conventional dose-painting plans in which optimization techniques based on physical dose are employed. Conclusion. Dose-painting employing biological parameters appears to be a promising approach for individualized patient- and disease-specific radiotherapy.

  13. Dose-painting IMRT optimization using biological parameters

    International Nuclear Information System (INIS)

    Purpose. Our work on dose-painting based on the possible risk characteristics for local recurrence in tumor subvolumes and the optimization of treatment plans using biological objective functions that are region-specific are reviewed. Materials and methods. A series of intensity modulated dose-painting techniques are compared to their corresponding intensity modulated plans in which the entire PTV is treated to a single dose level, delivering the same equivalent uniform dose (EUD) to the entire PTV. Iso-TCP and iso-NTCP maps are introduced as a tool to aid the planner in the evaluation of the resulting non-uniform dose distributions. Iso-TCP and iso-NTCP maps are akin to iso-dose maps in 3D conformal radiotherapy. The impact of the currently limited diagnostic accuracy of functional imaging on a series of dose-painting techniques is also discussed. Results. Utilizing biological parameters (risk-adaptive optimization) in the generation of dose-painting plans results in an increase in the therapeutic ratio as compared to conventional dose-painting plans in which optimization techniques based on physical dose are employed. Conclusion. Dose-painting employing biological parameters appears to be a promising approach for individualized patient- and disease-specific radiotherapy.

  14. Optimal Selection of Parameters for Nonuniform Embedding of Chaotic Time Series Using Ant Colony Optimization.

    Science.gov (United States)

    Shen, Meie; Chen, Wei-Neng; Zhang, Jun; Chung, Henry Shu-Hung; Kaynak, Okyay

    2013-04-01

    The optimal selection of parameters for time-delay embedding is crucial to the analysis and forecasting of chaotic time series. Although various parameter selection techniques have been developed for conventional uniform embedding methods, the study of parameter selection for nonuniform embedding has progressed at a slow pace. In nonuniform embedding, which enables different dimensions to have different time delays, the selection of time delays for different dimensions presents a difficult optimization problem with combinatorial explosion. To solve this problem efficiently, this paper proposes an ant colony optimization (ACO) approach. Taking advantage of the incremental solution construction of the ACO, the proposed ACO for nonuniform embedding (ACO-NE) divides the solution construction procedure into two phases, i.e., the selection of the embedding dimension and the selection of the time delays. In this way, both the embedding dimension and the time delays can be optimized along with the search process of the algorithm. To accelerate the search, useful information is extracted from the original time series to define heuristics that guide the search direction of the ants. Three geometry- or model-based criteria are used to test the performance of the algorithm. The optimal embeddings found by the algorithm are also applied in time-series forecasting. Experimental results show that the ACO-NE is able to yield good embedding solutions from the viewpoints of both optimization performance and prediction accuracy. PMID:23144038

  15. A Novel Optimization Tool for Automated Design of Integrated Circuits based on MOGSA

    Directory of Open Access Journals (Sweden)

    Maryam Dehbashian

    2011-11-01

    Full Text Available In this paper a novel optimization method based on the Multi-Objective Gravitational Search Algorithm (MOGSA) is presented for the automated design of analog integrated circuits. The recommended method first simulates a selected circuit using a simulator; the simulated results are then optimized by the MOGSA algorithm, and this process continues until the optimum result is met. The main programs of the proposed method have been implemented in MATLAB, while the analog circuits are simulated by HSPICE software. To show the capability of this method, its proficiency is examined in the optimization of analog integrated circuit design. In this paper, an analog circuit sizing scheme, the optimum automated design of a temperature-independent differential op-amp using a Widlar current source, is illustrated as a case study. The computer results obtained from implementing this method indicate that the design specifications are closely met. Moreover, according to various design criteria, this tool, by proposing a varied set of answers, gives designers more options to choose a desirable scheme among the suggested results. MOGSA, the proposed algorithm, introduces a novel method of multi-objective optimization on the basis of the Gravitational Search Algorithm, in which the concept of Pareto optimality is used to determine non-dominated positions, along with an external repository to keep these positions. To ensure the accuracy of MOGSA's performance, the algorithm is validated using several standard test functions from the specialized literature. Final results indicate that the method is highly competitive with current multi-objective optimization algorithms.

  16. Process Parameters Optimization in Single Point Incremental Forming

    Science.gov (United States)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of DOF, and were carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels of sheet thickness, step size, tool rotational speed, feed rate and lubrication were considered as the input process parameters. Wall angle and surface roughness were considered as the process responses. The influential process parameters for formability and surface roughness were identified with the help of statistical tools (response table, main effects plot and ANOVA). The parameter with the greatest influence on both formability and surface roughness is lubrication. For formability, lubrication, followed by tool rotational speed, feed rate, sheet thickness, step size and tool radius, has influence in descending order; for surface roughness, lubrication, followed by feed rate, step size, tool radius, sheet thickness and tool rotational speed, has influence in descending order. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted thrice, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm, respectively.

  17. Optimization of surface roughness parameters in dry turning

    Directory of Open Access Journals (Sweden)

    R.A. Mahdavinejad

    2009-12-01

    Full Text Available Purpose: The precision of machine tools on one hand, and the input setup parameters on the other, strongly influence the main machining outputs, such as stock removal, tool wear ratio and surface roughness. Design/methodology/approach: There are many input parameters that affect the variations of these output parameters. In CNC machines, optimization of the machining process in order to predict surface roughness is very important. Findings: From this point of view, a combined adaptive neuro-fuzzy inference system is used to predict the roughness of dry-machined surfaces in the turning process. Research limitations/implications: There are some limitations in the properties of the various kinds of lubricants; the influence of some undesirable factors in the experiments is another limitation of this research. Practical implications: Samples are machined with various input parameters, and the experimental data are then used to create fuzzy rules and process them via neural networks, so that a prediction model is first built from experimental data; the results of this model are then compared with the real surface roughness. Originality/value: When the cutting speed is increased, the machined surface quality is improved. The quality of the machined surface decreases with feed rate and depth of cut. The error of the model is much less than that of ordinary equations. The comparison results show that this model is more effective than theoretical calculation methods.

  18. Optimizing experimental parameters for tracking of diffusing particles

    Science.gov (United States)

    Vestergaard, Christian L.

    2016-08-01

    We describe how a single-particle tracking experiment should be designed in order for its recorded trajectories to contain the most information about a tracked particle's diffusion coefficient. The precision of estimators for the diffusion coefficient is affected by motion blur, limited photon statistics, and the length of recorded time series. We demonstrate for a particle undergoing free diffusion that precision is negligibly affected by motion blur in typical experiments, while optimizing photon counts and the number of recorded frames is the key to precision. Building on these results, we describe for a wide range of experimental scenarios how to choose experimental parameters in order to optimize the precision. Generally, one should choose quantity over quality: experiments should be designed to maximize the number of frames recorded in a time series, even if this means lower information content in individual frames.
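
    The central claim, that precision is dominated by the number of recorded frames, can be checked with a small simulation. The sketch below simulates one-dimensional free diffusion (ignoring motion blur and localization noise, which the abstract argues are secondary) and estimates the diffusion coefficient from mean-squared displacements; all numerical values are arbitrary illustrations:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    D_true, dt = 0.25, 0.05              # um^2/s and s, chosen arbitrarily

    def estimate_D(n_frames):
        # Free 1-D diffusion: step variance is 2 * D * dt.
        steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), n_frames - 1)
        return np.mean(steps ** 2) / (2 * dt)   # MSD estimator for one track

    # Compare estimator spread for short vs long recorded time series.
    short = [estimate_D(20) for _ in range(500)]
    long_ = [estimate_D(500) for _ in range(500)]
    print(np.std(short), np.std(long_))  # long tracks give a much tighter spread
    ```

    The relative standard deviation of this estimator scales as sqrt(2/(N-1)) with the number of steps N, which is exactly the "quantity over quality" argument: more frames shrink the error faster than better individual frames.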

  19. Input Parameters Optimization in Swarm DS-CDMA Multiuser Detectors

    CERN Document Server

    Abrão, Taufik; Angelico, Bruno A; Jeszensky, Paul Jean E

    2010-01-01

    In this paper, the uplink direct-sequence code division multiple access (DS-CDMA) multiuser detection (MuD) problem is studied from a heuristic perspective, using particle swarm optimization (PSO). Regarding different system improvements for future technologies, such as high-order modulation and diversity exploitation, a complete parameter optimization procedure for the PSO applied to the MuD problem is provided, which represents the major contribution of this paper. Furthermore, the performance of the PSO-MuD is briefly analyzed via Monte-Carlo simulations. Simulation results show that, after convergence, the performance reached by the PSO-MuD is much better than that of the conventional detector, and somewhat close to the single-user bound (SuB). A Rayleigh flat channel is initially considered, but the results are further extended to diversity (time and spatial) channels.
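
    The PSO core whose parameters (inertia weight, acceleration coefficients, swarm size) such a procedure tunes can be sketched in a few lines. The global-best formulation and the canonical settings below are illustrative, not the paper's detector formulation:

    ```python
    import numpy as np

    def pso_minimize(f, lo, hi, n=30, iters=200, w=0.72, c1=1.49, c2=1.49, seed=3):
        """Minimal global-best PSO: velocities blend inertia, a pull toward each
        particle's personal best, and a pull toward the swarm's global best."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        x = rng.uniform(lo, hi, (n, lo.size))
        v = np.zeros_like(x)
        pbest, pval = x.copy(), np.array([f(xi) for xi in x])
        g = pbest[np.argmin(pval)].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            val = np.array([f(xi) for xi in x])
            better = val < pval
            pbest[better], pval[better] = x[better], val[better]
            g = pbest[np.argmin(pval)].copy()
        return g, float(np.min(pval))

    # Toy continuous objective standing in for the MuD log-likelihood surface.
    best, err = pso_minimize(lambda p: float(np.sum((p - 0.3) ** 2)),
                             lo=[-1.0, -1.0], hi=[1.0, 1.0])
    print(best, err)                     # best ≈ [0.3, 0.3]
    ```

    In a real MuD, `f` would score candidate bit vectors against the received signal, and `w`, `c1`, `c2` and `n` are exactly the quantities the paper's optimization procedure selects.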

  20. Total energy control system autopilot design with constrained parameter optimization

    Science.gov (United States)

    Ly, Uy-Loi; Voth, Christopher

    1990-01-01

    A description is given of the application of a multivariable control design method (SANDY) based on constrained parameter optimization to the design of a multiloop aircraft flight control system. Specifically, the design method is applied to the direct synthesis of a multiloop AFCS inner-loop feedback control system based on total energy control system (TECS) principles. The design procedure offers a structured approach for the determination of a set of stabilizing controller design gains that meet design specifications in closed-loop stability, command tracking performance, disturbance rejection, and limits on control activities. The approach can be extended to a broader class of multiloop flight control systems. Direct tradeoffs between many real design goals are rendered systematic by proper formulation of the design objectives and constraints. Satisfactory designs are usually obtained in a few iterations. Performance characteristics of the optimized TECS design have been improved, particularly in the areas of closed-loop damping and control activity in the presence of turbulence.

  1. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should therefore be considered together when determining the appropriate level of automation to introduce. The conventional concept of the automation rate is limited in that it does not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested to overcome this problem.

  2. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. The inclusion proportion of automation is straightforward to express, as is the degree of enhancement of human performance it is expected to indicate. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  3. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation of the concentration uncertainties.
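
    The core idea, fitting an analytical spectrum model to measured counts by minimizing the quadratic residual and reading uncertainties off the fit, can be sketched with SciPy. A single Gaussian peak on a linear background stands in for the full EPMA spectrum model; all data below are synthetic, not POEMA's:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Analytical stand-in for a spectrum: one Gaussian peak + linear background.
    def model(e, a, mu, sig, b0, b1):
        return a * np.exp(-0.5 * ((e - mu) / sig) ** 2) + b0 + b1 * e

    # Synthetic "measured" spectrum with invented true parameters and noise.
    e = np.linspace(0.0, 10.0, 200)
    rng = np.random.default_rng(7)
    y = model(e, 50.0, 6.4, 0.35, 5.0, 0.2) + rng.normal(0.0, 0.5, e.size)

    # Least-squares fit of the model parameters to the spectrum.
    popt, pcov = curve_fit(model, e, y, p0=[40.0, 6.0, 0.5, 0.0, 0.0])
    perr = np.sqrt(np.diag(pcov))   # parameter uncertainties from the fit
    print(popt, perr)               # popt ≈ [50, 6.4, 0.35, 5, 0.2]
    ```

    In POEMA the fitted parameters include the elemental concentrations themselves, so the covariance matrix directly yields the concentration uncertainties the abstract mentions.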

  4. Optimization of Parameters for Melt Crystallization of p-Cresol

    Institute of Scientific and Technical Information of China (English)

    丛山; 李鑫钢; 邬俊; 许长春

    2012-01-01

    Laboratory-scale experiments were carried out to evaluate the influences of operational parameters on the melt crystallization efficiency for p-cresol purification. The optimal crystallization conditions were determined: dynamic pulsed aeration at 90 L·h⁻¹ and a cooling rate of 0.6-0.8 °C·min⁻¹, followed by sweating at 0.2-0.3 °C·min⁻¹ for 40 min. Results also demonstrate that the melt crystallization efficiency is sensitive to feed concentration, which highlights this technology for the separation and purification of high-purity products.

  5. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Science.gov (United States)

    Rapoport, Daniel H; Becker, Tim; Madany Mamlouk, Amir; Schicktanz, Simone; Kruse, Charli

    2011-01-01

    Automated microscopy is currently the only method to non-invasively and label-free observe complex multi-cellular processes, such as cell migration, cell cycle, and cell differentiation. Extracting biological information from a time-series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automatize this process resulted in ever improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism; they missed validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea to automatically inspect the tracking results and accept only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters with high

  6. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Directory of Open Access Journals (Sweden)

    Daniel H Rapoport

    Full Text Available Automated microscopy is currently the only method to non-invasively and label-free observe complex multi-cellular processes, such as cell migration, cell cycle, and cell differentiation. Extracting biological information from a time-series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automatize this process resulted in ever improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism; they missed validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea to automatically inspect the tracking results and accept only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters

  7. Parameter optimization in AQM controller design to support TCP traffic

    Science.gov (United States)

    Yang, Wei; Yang, Oliver W.

    2004-09-01

    The TCP congestion control mechanism has been widely investigated and deployed on the Internet to prevent congestion collapse. We would like to employ modern control theory to quantitatively specify the control performance of the TCP communication system. In this paper, we make use of a commonly used performance index called the Integral of the Square of the Error (ISE), which is a quantitative measure to gauge the performance of a control system. By applying the ISE performance index to the Proportional-plus-Integral controller based on Pole Placement (PI_PP controller) for active queue management (AQM) in IP routers, we can further tune the parameters of the controller to achieve optimal control, minimizing control errors. We have analyzed the dynamic model of TCP congestion control under this ISE, and used the OPNET simulation tool to verify the derived optimized parameters of the controllers.
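    The ISE index referenced above is simply the integral of the squared error, ISE = ∫ e²(t) dt, over the closed-loop response. As a minimal numerical sketch (the decaying error signal below is hypothetical, not taken from the paper's TCP/AQM model):

```python
import numpy as np

def ise(error, dt):
    """Integral of the Square of the Error, approximated by the
    trapezoidal rule over a uniformly sampled error signal."""
    sq = error ** 2
    return float(dt * (sq[0] / 2.0 + sq[1:-1].sum() + sq[-1] / 2.0))

# Hypothetical control error decaying as e^{-t}; the exact value of
# the integral of e^{-2t} over [0, 10] is (1 - e^{-20}) / 2, about 0.5.
t = np.linspace(0.0, 10.0, 10001)
J = ise(np.exp(-t), t[1] - t[0])
```

    A controller-tuning loop of the kind described would evaluate such an index for each candidate gain set and keep the minimizer.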

  8. Optimal VLF Parameters for Pitch Angle Scattering of Trapped Electrons

    Science.gov (United States)

    Albert, J. M.; Inan, U. S.

    2001-12-01

    VLF waves are known to determine the lifetimes of energetic radiation belt electrons in the inner radiation belt and slot regions. Artificial injection of such waves from ground- or space-based transmitters may thus be used to affect the trapped electron population. In this paper, we seek to determine the optimal parameters (frequency and wave normal angle) of a quasi-monochromatic VLF wave using bounce-averaged quasi-linear theory. We consider the cumulative effects of all harmonic resonances and determine the diffusion rates of particles with selected energies on particular L-shells. We also compare the effects of the VLF wave to diffusion driven by other whistler-mode waves (plasmaspheric hiss, lightning, and VLF transmitters). With appropriate choice of the wave parameters, it may be possible to substantially reduce the lifetime of selected classes of particles.

  9. Parameter optimization in molecular dynamics simulations using a genetic algorithm

    International Nuclear Information System (INIS)

    In this work, we introduce a genetic algorithm for the parameterization of the reactive force field developed by Kieffer. This potential includes directional covalent bonds and dispersion terms. Important features of this force field for simulating systems that undergo significant structural reorganization are (i) the ability to account for the redistribution of electron density upon ionization, formation, or breaking of bonds, through a charge transfer term, and (ii) the fact that the angular constraints dynamically adjust when a change in the coordination number of an atom occurs. In this paper, we present the implementation of the genetic algorithm into the existing code, as well as the algorithm's efficiency and preliminary results on Si-Si force field optimization. The parameters obtained by this method will be compared to existing parameter sets obtained by a trial-and-error process.
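    As a hedged sketch of the general recipe (not the authors' implementation; a toy two-parameter quadratic stands in for the expensive force-field fitness, and all rates and sizes are illustrative):

```python
import random

def evolve(fitness, bounds, pop_size=40, generations=60,
           mutation_rate=0.1, seed=0):
    """Minimal real-coded genetic algorithm: tournament selection,
    uniform crossover, Gaussian mutation. `fitness` is minimized."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    while generations:
        generations -= 1
        scored = sorted(pop, key=fitness)
        nxt = scored[:2]                      # elitism: keep the two best
        while len(nxt) < pop_size:
            # two tournament-selected parents, uniform crossover
            a, b = (min(rng.sample(scored, 3), key=fitness) for _ in range(2))
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:  # clipped Gaussian mutation
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

# Toy "force-field" objective: recover the parameters (2.0, -1.0).
best = evolve(lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2,
              bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

    In the actual application, the fitness would compare simulated properties against reference data for each trial parameter set.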

  10. Comparison and Application of Metaheuristic Population-Based Optimization Algorithms in Manufacturing Automation

    Directory of Open Access Journals (Sweden)

    Rhythm Suren Wadhwa

    2011-11-01

    Full Text Available The paper presents a comparison and application of metaheuristic population-based optimization algorithms to a flexible manufacturing automation scenario in a metal-casting foundry. It presents a novel application and comparison of the Bee Colony Algorithm (BCA) with variations of Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) for an object recognition problem in a robot material handling system. To enable robust pick-and-place activity of metal-cast parts by a six-axis industrial robot manipulator, it is important that the correct orientation of the parts is input to the manipulator via the digital image captured by the vision system. This information is then used for orienting the robot gripper to grip the part from a moving conveyor belt. The objective is to find the reference templates on the manufactured parts in the target landscape picture, which may contain noise. The normalized cross-correlation (NCC) function is used as an objective function in the optimization procedure. The ultimate goal is to test improved algorithms that could prove useful in practical manufacturing automation scenarios.
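    The NCC objective itself is standard; a minimal zero-mean implementation might look like the following (the arrays are illustrative stand-ins for a template and an image patch of equal size):

```python
import numpy as np

def ncc(template, patch):
    """Zero-mean normalized cross-correlation between two equally
    sized arrays; returns a score in [-1, 1]."""
    t = template - template.mean()
    p = patch - patch.mean()
    denom = np.sqrt((t ** 2).sum() * (p ** 2).sum())
    return float((t * p).sum() / denom) if denom else 0.0

a = np.array([[1.0, 2.0], [3.0, 4.0]])
```

    A metaheuristic such as BCA, PSO, or ACO would then search over candidate template positions (and orientations) to maximize this score.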

  11. An Etching Yield Parameters Optimization Method Based on Ordinal Optimization and Tabu Search Hybrid Algorithm

    Science.gov (United States)

    Ruan, Cong; Sun, Xiao-Min; Song, Yi-Xu

    In this paper, we propose a method to optimize etching yield parameters. By defining a fitness function between the actual etching profile and the simulated profile, the problem of solving for the etching yield parameters is transformed into an optimization problem. The problem is nonlinear and high-dimensional, and each simulation is computationally expensive. To solve this problem, we need to search for a good solution in a multidimensional space. An ordinal optimization and tabu search hybrid algorithm is introduced to solve this complex problem. This method ensures obtaining a good enough solution in an acceptable time. The experimental results illustrate that the simulated profile obtained by this method is very similar to the actual etching profile in surface topography. They also show that the proposed method is feasible and valid.
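    The tabu-search half of such a hybrid can be sketched very compactly (this is a generic tabu search, not the paper's hybrid; the integer-lattice objective is a cheap stand-in for the expensive etching-profile fitness):

```python
import collections

def tabu_search(cost, start, neighbors, iters=200, tabu_len=10):
    """Minimal tabu search: move to the best non-tabu neighbor each
    step, keeping recently visited solutions in a short-term tabu list
    so the search can escape local minima."""
    best = cur = start
    tabu = collections.deque([start], maxlen=tabu_len)
    for _ in range(iters):
        cand = [n for n in neighbors(cur) if n not in tabu]
        if not cand:
            break
        cur = min(cand, key=cost)   # best admissible move, even if worse
        tabu.append(cur)
        if cost(cur) < cost(best):
            best = cur
    return best

# Toy yield-parameter grid: minimize (x-3)^2 + (y-2)^2 on an integer lattice.
def nbrs(p):
    x, y = p
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

best = tabu_search(lambda p: (p[0] - 3) ** 2 + (p[1] - 2) ** 2, (0, 0), nbrs)
```

    In the hybrid described, ordinal optimization would first narrow the search to a "good enough" region before the tabu refinement.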

  12. Optimizing the Pulsed Current Gas Tungsten Arc Welding Parameters

    Institute of Scientific and Technical Information of China (English)

    M. Balasubramanian; V. Jayabalan; V. Balasubramanian

    2006-01-01

    The selection of process parameters in the gas tungsten arc (GTA) welding of a titanium alloy was investigated to obtain optimum grain size and hardness. Titanium alloy (Ti-6Al-4V) is one of the most important non-ferrous metals, offering great potential for application in the aerospace, biomedical and chemical industries because of its low density (4.5 g/cm3), excellent corrosion resistance, high strength, attractive fracture behaviour and high melting point (1678℃). The preferred welding process for titanium alloy is frequently GTA welding, due to its comparatively easy applicability and better economy. In the case of single-pass GTA welding of thinner sections of this alloy, pulsed current has been found beneficial due to its advantages over the conventional continuous current process. Many considerations come into the picture, and one needs to carefully balance the various pulse current parameters to reach an optimum combination. A four-factor, five-level central composite rotatable design matrix was used to minimize the required number of experimental conditions. Mathematical models were developed to predict the fusion zone grain size using analysis of variance (ANOVA) and regression analysis. The developed models were optimized using the traditional Hooke and Jeeves algorithm. Experimental results are provided to illustrate the proposed approach.

  13. Biohydrogen Production from Simple Carbohydrates with Optimization of Operating Parameters.

    Science.gov (United States)

    Muri, Petra; Osojnik-Črnivec, Ilja Gasan; Djinovič, Petar; Pintar, Albin

    2016-01-01

    Hydrogen could be an alternative energy carrier in the future, as well as a source for chemical and fuel synthesis, due to its high energy content, environmentally friendly conversion and zero carbon emissions. In particular, the conversion of organic substrates to hydrogen via the dark fermentation process is of great interest. The aim of this study was fermentative hydrogen production by an anaerobic mixed culture from different carbon sources (mono- and disaccharides) and further optimization by varying a number of operating parameters (pH value, temperature, organic loading, mixing intensity). Among all tested mono- and disaccharides, glucose was shown to be the preferred carbon source, exhibiting a hydrogen yield of 1.44 mol H(2)/mol glucose. Further evaluation of selected operating parameters showed that the highest hydrogen yield (1.55 mol H(2)/mol glucose) was obtained at an initial pH value of 6.4, T=37 °C and an organic loading of 5 g/L. The obtained results demonstrate that the lower hydrogen yield under all other conditions was associated with redirection of metabolic pathways from butyric and acetic acid production (accompanied by H(2) production) to lactic acid production (where simultaneous H(2) production is not mandatory). These results therefore represent an important foundation for the optimization and industrial-scale production of hydrogen from organic substrates. PMID:26970800

  14. Process parameter optimization for fly ash brick by Taguchi method

    Directory of Open Access Journals (Sweden)

    Prabir Kumar Chaulia

    2008-06-01

    Full Text Available This paper presents the results of an experimental investigation carried out to optimize the mix proportions of fly ash brick by the Taguchi method of parameter design. The experiments were designed using an L9 orthogonal array with four factors and three levels each. A small quantity of cement was mixed in as binding material. Both the cement and the fly ash used are treated as binding material, and the water/binder ratio has been considered as one of the control factors. The effects of water/binder ratio, fly ash, coarse sand, and stone dust on the performance characteristic are analyzed using signal-to-noise ratios and mean response data. According to the results, water/binder ratio and stone dust play a significant role in the compressive strength of the brick. Furthermore, the estimated optimum values of the process parameters correspond to a water/binder ratio of 0.4, fly ash of 39%, coarse sand of 24%, and stone dust of 30%. The mean value of optimal strength is predicted as 166.22 kg·cm⁻² with a tolerance of ± 10.97 kg·cm⁻². The confirmatory experimental result obtained under the optimum conditions is 160.17 kg·cm⁻².
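    For a larger-the-better characteristic such as compressive strength, the Taguchi signal-to-noise ratio used in analyses like this is S/N = -10·log10 of the mean of 1/y². A small sketch (the replicate strength values are hypothetical, not taken from the paper's L9 runs):

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio in dB:
    -10 * log10( mean of 1/y^2 ) over the replicates of one run."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

# Hypothetical compressive-strength replicates (kg/cm^2) for one L9 run.
sn = sn_larger_is_better([158.0, 162.0, 166.0])
```

    The factor levels whose mean S/N is highest across the orthogonal-array runs are then taken as the optimum settings.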

  15. Automated scheme to determine design parameters for a recoverable reentry vehicle

    International Nuclear Information System (INIS)

    The NRV (Nosetip Recovery Vehicle) program at Sandia Laboratories is designed to recover the nose section from a sphere cone reentry vehicle after it has flown a near ICBM reentry trajectory. Both mass jettison and parachutes are used to reduce the velocity of the RV near the end of the trajectory to a sufficiently low level that the vehicle may land intact. The design problem of determining mass jettison time and parachute deployment time in order to ensure that the vehicle does land intact is considered. The problem is formulated as a min-max optimization problem where the design parameters are to be selected to minimize the maximum possible deviation in the design criteria due to uncertainties in the system. The results of the study indicate that the optimal choice of the design parameters ensures that the maximum deviation in the design criteria is within acceptable bounds. This analytically ensures the feasibility of recovery for NRV

  16. Automated Software Testing Using Metahurestic Technique Based on An Ant Colony Optimization

    CERN Document Server

    Srivastava, Praveen Ranjan

    2011-01-01

    Software testing is an important and valuable part of the software development life cycle. Due to time, cost and other circumstances, exhaustive testing is not feasible, which is why the software testing process needs to be automated. Testing effectiveness can be achieved with State Transition Testing (STT), which is commonly used in real-time, embedded and web-based software systems. The aim of the current paper is to present an algorithm that applies an ant colony optimization technique to generate optimal and minimal test sequences for the behavior specification of software. The presented approach generates test sequences in order to obtain complete software coverage. The paper also discusses a comparison between two metaheuristic techniques (Genetic Algorithm and Ant Colony Optimization) for transition-based testing.
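    A heavily simplified sketch of the idea: ants probabilistically walk a state-transition graph, pheromone reinforces short transition sequences, and the best sequence found is retained. The graph, parameters, and deposit rule below are illustrative, not the paper's algorithm:

```python
import random

def aco_path(graph, start, goal, ants=20, iters=30, rho=0.5, seed=1):
    """Minimal ant colony optimization for a short transition sequence
    between two states of a small state-transition graph."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone
    best = None
    for _ in range(iters):
        for _ in range(ants):
            node, path, seen = start, [start], {start}
            while node != goal:
                choices = [v for v in graph[node] if v not in seen]
                if not choices:            # dead end: abandon this ant
                    path = None
                    break
                weights = [tau[(node, v)] for v in choices]
                node = rng.choices(choices, weights)[0]
                path.append(node)
                seen.add(node)
            if path:
                if best is None or len(path) < len(best):
                    best = path
                for u, v in zip(path, path[1:]):
                    tau[(u, v)] += 1.0 / len(path)   # deposit: favor short paths
        for e in tau:
            tau[e] *= (1.0 - rho)                    # evaporation
    return best

# Hypothetical state-transition model of a small system under test.
g = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": ["E"], "E": []}
best = aco_path(g, "A", "E")
```

    Real STT-based generation would additionally require the walks to cover every transition, not just reach a goal state.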

  17. Acoustical characterization and parameter optimization of polymeric noise control materials

    Science.gov (United States)

    Homsi, Emile N.

    2003-10-01

    The sound transmission loss (STL) characteristics of polymer-based materials are considered. Analytical models that predict, characterize and optimize the STL of polymeric materials, with respect to the physical parameters that affect performance, are developed for a single-layer panel configuration and adapted for layered panel construction with a homogeneous core. An optimum set of material parameters is selected and translated into practical applications for validation. Sound-attenuating thermoplastic materials designed to be used as barrier systems in the automotive and consumer industries have acoustical characteristics that vary as a function of the stiffness and density of the selected material. The validity and applicability of existing theory are explored, and since STL is influenced by factors such as the surface mass density of the panel's material, a method is modified to improve STL performance and optimize load-bearing attributes. An experimentally derived function is applied to the model for better correlation. In-phase and out-of-phase motion of the top and bottom layers are considered. It was found that a layered construction of the co-injection type exhibits fused planes at the interface and moves in-phase. The model for the single-layer case is adapted to the layered case, where it behaves as a single panel. Primary physical parameters that affect STL are identified and manipulated. The theoretical analysis is linked to the resin's matrix attributes. High-STL material with representative characteristics is evaluated against standard resins. It was found that high STL could be achieved by altering the material's matrix and by integrating design solutions in the low-frequency range. A numerical approach is suggested for STL evaluation of simple and complex geometries. In practice, validation on actual vehicle systems proved the adequacy of the acoustical characterization process.
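    The dependence of STL on surface mass density mentioned above is often summarized by the textbook field-incidence mass law, STL ≈ 20·log10(f·m) − 47 dB (frequency f in Hz, surface density m in kg/m²). This is a standard approximation, not the model developed in this work:

```python
import math

def mass_law_stl(freq_hz, surface_density):
    """Field-incidence mass-law estimate of sound transmission loss (dB)
    for a single limp panel: STL ~ 20*log10(f * m) - 47, with f in Hz
    and m in kg/m^2. Valid well below the panel's coincidence frequency."""
    return 20.0 * math.log10(freq_hz * surface_density) - 47.0
```

    The law predicts roughly 6 dB of extra loss per doubling of either frequency or surface mass, which is why stiffness- and matrix-level modifications are needed once added mass is exhausted.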

  18. Automated Tuning for Parameter Identification in Multi-Scale Coronary Simulations

    Science.gov (United States)

    Tran, Justin; Schiavazzi, Daniele; Ramachandra, Abhay; Kahn, Andrew; Marsden, Alison

    2015-11-01

    Computational simulations of coronary flow can provide non-invasive information on hemodynamics that can aid in disease research. In this study, patient-specific geometries are constructed and combined with finite element flow simulations using the open source software SimVascular. Lumped parameter networks (LPN), consisting of circuit representations of hemodynamic behavior, can be used as coupled boundary conditions for the flow solver. The parameters of the LPN are tuned so the outputs match a patient's clinical data. However, the parameters are usually manually tuned, which is time consuming and does not account for uncertainty in the measurements. We thus propose a Bayesian approach to parameter tuning that provides optimal parameter statistics through sampling from their posterior distribution and is particularly well suited for models characterized by a large number of parameters and scarce data. We also show that analysis of the local and global identifiability play an important role for dimensionality reduction in the estimation. We present the results of applying the proposed approach to a cohort of patients, and demonstrate the ability to match high priority targets. After identifying the LPN parameters for each patient, we demonstrate their use in 3D simulations.
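    As an illustrative sketch of sampling a parameter posterior (a single scalar parameter with an assumed Gaussian posterior; the real problem involves many LPN parameters and expensive 3D solves, and the paper's estimator is not necessarily this sampler):

```python
import math
import random

def metropolis(log_post, x0, steps=5000, scale=0.5, seed=0):
    """Random-walk Metropolis sampler for a 1-D log-posterior density."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Hypothetical scalar "resistance" parameter whose posterior, after
# matching a measured flow target, is assumed Gaussian around 2.0.
s = metropolis(lambda r: -0.5 * ((r - 2.0) / 0.3) ** 2, x0=0.0)
mean = sum(s[1000:]) / len(s[1000:])   # posterior mean after burn-in
```

    The appeal of the Bayesian route noted in the abstract is that the spread of such samples quantifies parameter uncertainty, rather than returning a single manually tuned value.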

  19. Parameter Estimation of Induction Motors Using Water Cycle Optimization

    Directory of Open Access Journals (Sweden)

    M. Yazdani-Asrami

    2013-12-01

    Full Text Available This paper presents the application of the recently introduced water cycle algorithm (WCA) to optimize the parameters of exact and approximate induction motor models from nameplate data. Considering that induction motors are widely used in industrial applications, these parameters have a significant effect on the accuracy and efficiency of the motors and, ultimately, on overall system performance. Therefore, it is essential to develop algorithms for the parameter estimation of the induction motor. The fundamental concepts and ideas underlying the proposed method are inspired by nature, based on observation of the water cycle and of how rivers and streams flow to the sea in the real world. The objective function is defined as the minimization of the real values of the relative error between the measured and estimated torques of the machine at different slip points. The proposed WCA approach has been applied to two different sample motors. Results of the proposed method have been compared with other metaheuristic methods previously applied to the problem, which shows the feasibility and fast convergence of the proposed approach.
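    A sketch of the kind of objective involved, using the standard approximate equivalent-circuit torque expression. The supply voltage, frequency, pole count, and parameter values are assumptions for illustration, not the paper's data:

```python
import math

def torque(slip, V, f, poles, R1, X1, R2, X2):
    """Induction motor torque (N*m) from the approximate equivalent
    circuit: T = 3 V^2 (R2/s) / ( ws * [ (R1 + R2/s)^2 + (X1 + X2)^2 ] ),
    with ws = 4*pi*f/poles the synchronous speed in rad/s."""
    ws = 4.0 * math.pi * f / poles
    return (3.0 * V ** 2 * (R2 / slip)) / (
        ws * ((R1 + R2 / slip) ** 2 + (X1 + X2) ** 2))

def objective(params, measured):
    """Sum of relative torque errors over the measured slip points,
    the quantity a metaheuristic such as WCA would minimize."""
    R1, X1, R2, X2 = params
    return sum(abs(torque(s, 400.0, 50.0, 4, R1, X1, R2, X2) - T) / T
               for s, T in measured)

# Assumed circuit parameters (ohms) and synthetic "measurements".
params = (0.5, 1.0, 0.4, 1.2)
measured = [(s, torque(s, 400.0, 50.0, 4, *params)) for s in (0.02, 0.05, 0.1)]
```

    By construction the objective is zero at the true parameters here; with real nameplate-derived measurements, the minimum is nonzero and the search is nontrivial.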

  20. GAUFRE: a tool for an automated determination of atmospheric parameters from spectroscopy

    CERN Document Server

    Valentini, Marica; Miglio, Andrea; Fossati, Luca; Munari, Ulisse

    2013-01-01

    We present an automated tool for measuring atmospheric parameters (T_eff, log(g), [Fe/H]) for F-G-K dwarf and giant stars. The tool, called GAUFRE, is written in C++ and composed of several routines: GAUFRE-RV measures radial velocity from spectra via cross-correlation against a synthetic template, GAUFRE-EW measures atmospheric parameters through the classic line-by-line technique and GAUFRE-CHI2 performs a chi^2 fitting to a library of synthetic spectra. A set of F-G-K stars extensively studied in the literature were used as a benchmark for the program: their high signal-to-noise and high resolution spectra were analysed by using GAUFRE and results were compared with those present in literature. The tool is also implemented in order to perform the spectral analysis after fixing the surface gravity (log(g)) to the accurate value provided by asteroseismology. A set of CoRoT stars, belonging to LRc01 and LRa01 fields was used for first testing the performances and the behaviour of the program when using the se...

  1. Optimization-based particle filter for state and parameter estimation

    Institute of Scientific and Technical Information of China (English)

    Li Fu; Qi Fei; Shi Guangming; Zhang Li

    2009-01-01

    In recent years, the theory of the particle filter has been developed and widely used for state and parameter estimation in nonlinear/non-Gaussian systems. Choosing a good importance density is a critical issue in particle filter design. In order to improve the approximation of the posterior distribution, this paper provides an optimization-based algorithm (the steepest descent method) to generate the proposal distribution and then sample particles from that distribution. The algorithm is applied in a 1-D case, and the simulation results show that the proposed particle filter performs better than the extended Kalman filter (EKF), the standard particle filter (PF), the extended Kalman particle filter (PF-EKF) and the unscented particle filter (UPF), both in efficiency and in estimation precision.
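    For reference, the baseline the paper improves upon can be sketched as a bootstrap particle filter for a toy 1-D random-walk model; it uses the transition prior as the importance density (precisely the naive choice the optimized proposal is meant to beat), and all noise levels are illustrative:

```python
import math
import random

def particle_filter(obs, n=500, q=0.5, r=0.5, seed=0):
    """Bootstrap particle filter for a 1-D random-walk state
    x_k = x_{k-1} + N(0, q^2), observed as y_k = x_k + N(0, r^2)."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in obs:
        parts = [p + rng.gauss(0.0, q) for p in parts]            # propagate
        w = [math.exp(-0.5 * ((y - p) / r) ** 2) for p in parts]  # likelihood
        total = sum(w)
        w = [x / total for x in w]
        estimates.append(sum(p * wi for p, wi in zip(parts, w)))  # posterior mean
        parts = rng.choices(parts, weights=w, k=n)                # resample
    return estimates

# Hypothetical observations of a state drifting toward 3.0.
est = particle_filter([0.5, 1.2, 2.1, 2.8, 3.0, 3.1])
```

    An optimization-based proposal, as in the paper, would instead move particles toward the mode of the posterior (e.g. by steepest descent on its negative log) before weighting.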

  2. Optimal z-axis scanning parameters for gynecologic cytology specimens

    Directory of Open Access Journals (Sweden)

    Amber D Donnelly

    2013-01-01

    Full Text Available Background: The use of virtual microscopy (VM) in clinical cytology has been limited due to the inability to focus through three-dimensional (3D) cell clusters with a single focal plane (2D images). Limited information exists regarding the optimal scanning parameters for 3D scanning. Aims: The purpose of this study was to determine the optimal number of focal plane levels and the optimal scanning interval to digitize gynecological (GYN) specimens prepared on SurePath™ glass slides while maintaining a manageable file size. Subjects and Methods: The iScanCoreo Au scanner (Ventana, AZ, USA) was used to digitize 192 SurePath™ glass slides at three focal plane levels at a 1 μm interval. The digitized virtual images (VI) were annotated using BioImagene's Image Viewer. Five participants interpreted the VI and recorded the focal plane level at which they felt confident, and later interpreted the corresponding glass slide specimens using light microscopy (LM). The participants completed a survey about their experiences. Inter-rater agreement and concordance between the VI and the glass slide specimens were evaluated. Results: This study determined an overall high intra-rater diagnostic concordance between glass slides and VI (89-97%); however, the inter-rater agreement for all cases was higher for LM (94%) compared with VM (82%). Survey results indicate participants found low-grade dysplasia and koilocytes easy to diagnose using three focal plane levels, the image enhancement tool useful, and focusing through the cells helpful for interpretation; however, the participants found VI with hyperchromatic crowded groups challenging to interpret. Participants reported they prefer using LM over VM. This study supports using three focal plane levels and a 1 μm interval to expand the use of VM in GYN cytology. Conclusion: Future improvements in technology and appropriate training should make this format a more preferable and practical option in clinical cytology.

  3. Optimization of system parameters for a complete multispectral polarimeter

    International Nuclear Information System (INIS)

    We optimize a general class of complete multispectral polarimeters with respect to signal-to-noise ratio, stability against alignment errors, and the minimization of errors regarding a given set of polarization states. The class of polarimeters that are dealt with consists of at least four polarization optics, each with a multispectral detector. A polarization optic is made of an azimuthally oriented wave plate and a polarizing filter. A general, but not unique, analytic solution that optimizes the signal-to-noise ratio is introduced for a polarimeter that incorporates four simultaneous measurements with four independent optics. The optics consist of four suitable wave plates, where at least one is a quarter-wave plate. The solution is stable with respect to the retardance of the quarter-wave plate; therefore, it can be applied to real-world cases where the retardance deviates from λ/4. The solution is a set of seven rotational parameters that depends on the given retardances of the wave plates. It can be applied to a broad range of real-world cases. A numerical method for the optimization of arbitrary polarimeters of the type discussed is also presented and applied to two cases. First, the class of polarimeters that was analytically dealt with is further optimized with respect to stability and error performance for linearly polarized states. Then a multispectral case for a polarimeter that consists of four optics with real achromatic wave plates is presented. This case was used as the theoretical background for the development of the Airborne Multi-Spectral Sunphoto- and Polarimeter (AMSSP), which is an instrument for the German research aircraft HALO.

  4. Parameter optimization in differential geometry based solvation models.

    Science.gov (United States)

    Wang, Bao; Wei, G W

    2015-10-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not shown its superiority in solvation analysis, due to its difficulty in parametrization, which must ensure the stability of the solution of the strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of the solvation free energies for a large number of molecules. PMID:26450304

  5. Robust integrated autopilot/autothrottle design using constrained parameter optimization

    Science.gov (United States)

    Ly, Uy-Loi; Voth, Christopher; Sanjay, Swamy

    1990-01-01

    A multivariable control design method based on constrained parameter optimization was applied to the design of a multiloop aircraft flight control system. Specifically, the design method is applied to the following: (1) direct synthesis of a multivariable 'inner-loop' feedback control system based on total energy control principles; (2) synthesis of speed/altitude-hold designs as 'outer-loop' feedback/feedforward control systems around the above inner loop; and (3) direct synthesis of a combined 'inner-loop' and 'outer-loop' multivariable control system. The design procedure offers a direct and structured approach for the determination of a set of controller gains that meet design specifications in closed-loop stability, command tracking performance, disturbance rejection, and limits on control activities. The presented approach may be applied to a broader class of multiloop flight control systems. Direct tradeoffs between many real design goals are rendered systematic by this method following careful problem formulation of the design objectives and constraints. Performance characteristics of the optimization design were improved over the current autopilot design on the B737-100 Transport Research Vehicle (TSRV) at the landing approach and cruise flight conditions; particularly in the areas of closed-loop damping, command responses, and control activity in the presence of turbulence.

  6. High Temperature Epoxy Foam: Optimization of Process Parameters

    Directory of Open Access Journals (Sweden)

    Samira El Gazzani

    2016-06-01

    Full Text Available For many years, reduction of fuel consumption has been a major aim in terms of both costs and environmental concerns. One option is to reduce the weight of fuel consumers. For this purpose, the use of a lightweight material based on rigid foams is a relevant choice. This paper deals with a new high-temperature epoxy expanded material as a substitute for phenolic resin, which is classified as potentially mutagenic by the European REACH regulation. The optimization of a thermoset foam depends on two major parameters, the reticulation process and the expansion of the foaming agent. Controlling these two phenomena can lead to a fully expanded and cured material. The rheological behavior of the epoxy resin is studied and the gel time is determined at various temperatures. The expansion of the foaming agent is investigated by thermomechanical analysis. Results are correlated and compared with samples foamed under the same temperature conditions. The ideal foaming/gelation temperature is then determined. The second part of this research concerns the optimization of the curing cycle of a high-temperature trifunctional epoxy resin. A two-step curing cycle was defined by considering the influence of different curing schedules on the glass transition temperature of the material. The final foamed material has a glass transition temperature of 270 °C.

  7. Optimizing resistance spot welding parameters for vibration damping steel sheets

    Energy Technology Data Exchange (ETDEWEB)

    Oberle, H. [Centre de Recherches et Developpements Metallurgiques, Sollac (France); Commaret, C.; Minier, C. [Automobiles Citroën PSA (France); Magnaud, R. [Direction des Methodes Carrosserie, Renault (France); Pradere, G. [Materials Engineering Dept., Renault (France)

    1998-01-01

    In order to meet the growing demand for functionality and comfort in vehicles, weight and quietness are major concerns for carmakers and materials suppliers. Noise reduction by damping vibrations can meet both aspects. Therefore, steelmakers have developed vibration damping steel sheets (VDSS), which are three-layer composite materials composed of two steel sheets sandwiching a viscoelastic resin core. Industrial use of VDSS in automobiles usually implies the product can be resistance welded. The intent of this investigation is to set up rules to optimize resistance spot welding of VDSS. Two phenomena are the focus of this research: the reduction of blistering and of gas expulsion holes. Different aspects are studied, such as the effect of polymer presence and of electrode shape on the welding domain, and the evaluation of the influence of the welding schedule on blistering and expulsion holes. It appears that polymer presence has no effect on domain width, but does affect its position. The higher frequency of expulsion holes with truncated electrodes can be explained by mechanical considerations. From the influence of short circuit voltage, current delay angle and welding schedule on the frequency of gas expulsion holes, a mechanism responsible for expulsion holes is proposed and optimal welding parameters are given.

  8. Robust fluence map optimization via alternating direction method of multipliers with empirical parameter optimization.

    Science.gov (United States)

    Gao, Hao

    2016-04-01

    For treatment planning in intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT), beam fluence maps can first be optimized via fluence map optimization (FMO) under the given dose prescriptions and constraints, to conformally deliver the radiation dose to the targets while sparing the organs-at-risk, and then segmented into deliverable MLC apertures via leaf or arc sequencing algorithms. This work develops an efficient algorithm for FMO based on the alternating direction method of multipliers (ADMM). Here we consider FMO with a least-squares cost function and non-negative fluence constraints; the solution algorithm, based on ADMM, is efficient and simple to implement. In addition, an empirical method for optimizing the ADMM parameter is developed to improve the robustness of the ADMM algorithm. The ADMM-based FMO solver was benchmarked against quadratic programming based on the interior-point (IP) method using the CORT dataset. The comparison results suggested that the ADMM solver achieved plan quality similar to IP, with a slightly smaller total objective function value. A simple-to-implement ADMM-based FMO solver with empirical parameter optimization is proposed for IMRT and VMAT. PMID:26987680
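    The FMO subproblem described here, a least-squares cost with non-negative fluence constraints, can be sketched with a generic ADMM splitting; the matrix, prescription vector, and penalty parameter below are toy stand-ins, not the CORT data or the paper's empirically tuned parameter:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))     # toy dose-influence matrix (voxels x beamlets)
b = np.abs(rng.standard_normal(40))   # toy prescribed-dose vector

def fmo_admm(A, b, rho=1.0, iters=300):
    """Minimize 0.5 * ||A x - b||^2 subject to x >= 0 via ADMM:
    an unconstrained least-squares x-update, a projection z-update,
    and a scaled dual update."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor once, reuse
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        z = np.maximum(0.0, x + u)                         # project onto x >= 0
        u = u + x - z                                      # dual ascent step
    return z

fluence = fmo_admm(A, b)
```

The single Cholesky factorization reused across iterations is what makes this splitting cheap; the choice of rho is exactly the parameter the paper tunes empirically.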

  10. Optimized drying parameters of water hyacinths (Eichhornia crassipes L.)

    Directory of Open Access Journals (Sweden)

    Edgardo V. Casas

    2012-12-01

    Full Text Available The study investigated the optimum drying conditions of water hyacinth to contribute to the improvement of present drying processes. The effects of the independent parameters (drying temperature, airflow rate, and number of passes) on the responses were determined using response surface methodology. The response parameters were (1) final moisture content, (2) moisture ratio, (3) drying rate, (4) tensile strength, and (5) browning index. A Box-Behnken experimental design defined the experiments, resulting in 15 drying runs. Statistical analysis evaluated the treatment effects. Drying temperature significantly affected the drying rate, moisture ratio, and browning index. Airflow rate had a significant effect only on the drying rate, while the number of passes significantly affected both the drying rate and browning index. The optimized conditions for drying water hyacinth were a drying temperature of 90 °C, an airflow rate of 0.044 m3/s, and five passes. The best model that characterizes the drying of water hyacinth is a rational function expressed as:
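    The 15 runs follow from the Box-Behnken construction for three factors: all four (+/-1, +/-1) settings for each pair of factors with the third factor at its mid-level, plus center runs. A minimal sketch in coded units (the mapping of the three coded factors to temperature, airflow, and passes is implied, not taken from the paper):

```python
from itertools import combinations, product

def box_behnken(n_factors=3, n_center=3):
    """Box-Behnken design in coded units: every pair of factors takes
    all four (+/-1, +/-1) settings while the remaining factors stay at
    their mid-level (0), plus replicated center runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for lo_hi in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = lo_hi
            runs.append(run)
    runs.extend([0] * n_factors for _ in range(n_center))
    return runs

design = box_behnken()   # 3 factors: 12 edge runs + 3 center runs = 15
```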

  11. Automated system for calibration and control of the CHSPP-800 multichannel γ detector parameters

    International Nuclear Information System (INIS)

    An automated system for the adjustment, calibration and control of a total absorption Cherenkov spectrometer is described. The system comprises a mechanical platform capable of moving in two mutually perpendicular directions; movement detectors and limit switches; a power unit; and an automation unit with a remote control board. The automated system can operate either in a manual regime, with coordinates monitored on a digital indicator, or under computer control running dedicated programs. The platform mounting accuracy is ± 0.1 mm. Application of the automated system has speeded up counter adjustment work by a factor of 3 to 5

  12. Optimal Control and Coordination of Connected and Automated Vehicles at Urban Traffic Intersections

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yue J. [Boston University; Malikopoulos, Andreas [ORNL; Cassandras, Christos G. [Boston University

    2016-01-01

    We address the problem of coordinating online a continuous flow of connected and automated vehicles (CAVs) crossing two adjacent intersections in an urban area. We present a decentralized optimal control framework whose solution yields, for each vehicle, the optimal acceleration/deceleration at any time in the sense of minimizing fuel consumption. The solution, when it exists, allows the vehicles to cross the intersections without the use of traffic lights, without creating congestion on the connecting road, and under the hard safety constraint of collision avoidance. The effectiveness of the proposed solution is validated through simulation of two intersections located in downtown Boston, and it is shown that coordination of CAVs can significantly reduce both fuel consumption and travel time.
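    A common closed-form building block in such decentralized, fuel-minimizing frameworks (used here as a hedged sketch, not necessarily this paper's exact formulation) models each CAV as a double integrator and minimizes an energy-like cost 0.5 * integral of u(t)^2 between fixed boundary states, which makes the optimal acceleration linear in time:

```python
import numpy as np

def min_energy_profile(p0, v0, pf, vf, T):
    """Double-integrator vehicle model (p' = v, v' = u). Minimizing
    0.5 * integral of u(t)^2 with both endpoint states fixed gives a
    control linear in time, u(t) = a*t + b; integrating yields
      v(T) = v0 + a*T^2/2 + b*T
      p(T) = p0 + v0*T + a*T^3/6 + b*T^2/2,
    a 2x2 linear system in the unknown coefficients (a, b)."""
    M = np.array([[T**2 / 2, T],
                  [T**3 / 6, T**2 / 2]])
    rhs = np.array([vf - v0, pf - p0 - v0 * T])
    a, b = np.linalg.solve(M, rhs)
    return a, b

# e.g. cover a 200 m approach in 12 s, entering at 10 m/s, exiting at 15 m/s
a, b = min_energy_profile(0.0, 10.0, 200.0, 15.0, 12.0)
```

In a coordination scheme, the terminal time and speed for each vehicle would come from the intersection-crossing schedule; here they are arbitrary example values.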

  13. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    CERN Document Server

    Suleimanov, Yury V

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using single- and double-ended transition-state optimization algorithms in cooperation - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect, without any human intervention, not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown" reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  14. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation double- and single-ended transition-state optimization algorithms--the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not only "known" reaction pathways, manually detected in the previous studies, but also new, previously "unknown", reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes. PMID:26575920

  15. Applications of the theory of optimal control of distributed-parameter systems to structural optimization

    Science.gov (United States)

    Armand, J. P.

    1972-01-01

    An extension of classical methods of optimal control theory for systems described by ordinary differential equations to distributed-parameter systems described by partial differential equations is presented. An application is given involving the minimum-mass design of a simply-supported shear plate with a fixed fundamental frequency of vibration. An optimal plate thickness distribution in analytical form is found. The minimum-mass design of an elastic sandwich plate whose fundamental frequency of free vibration is fixed is also treated. Under the most general conditions, the optimization problem reduces to the solution of two simultaneous partial differential equations involving the optimal thickness distribution and the modal displacement. One equation is the uniform-energy-distribution expression found by Ashley and McIntosh for the optimal design of one-dimensional structures with frequency constraints, and by Prager and Taylor for various design criteria in one and two dimensions. The second equation requires dynamic equilibrium at the preassigned vibration frequency.

  16. The mass movement routing tool r.randomwalk and its functionalities for parameter sensitivity analysis and optimization

    Science.gov (United States)

    Krenn, Julia; Mergili, Martin

    2016-04-01

    r.randomwalk is a GIS-based, multi-functional conceptual tool for mass movement routing. Starting from one to many release points or release areas, mass points are routed down through the digital elevation model until a defined break criterion is reached. Break criteria are defined by the user and may consist in an angle of reach or a related parameter (empirical-statistical relationships), in the drop of the flow velocity to zero (two-parameter friction model), or in the exceedance of a maximum runup height. Multiple break criteria may be combined. A constrained random walk approach is applied for the routing procedure, where the slope and the perpetuation of the flow direction determine the probability of the flow to move in a certain direction. r.randomwalk is implemented as a raster module of the GRASS GIS software and, as such, is open source. It can be obtained from http://www.mergili.at/randomwalk.html. Besides other innovative functionalities, r.randomwalk serves with built-in functionalities for the derivation of an impact indicator index (III) map with values in the range 0-1. III is derived from multiple model runs with different combinations of input parameters varied in a random or controlled way. It represents the fraction of model runs predicting an impact at a given pixel and is evaluated against the observed impact area through an ROC Plot. The related tool r.ranger facilitates the automated generation and evaluation of many III maps from a variety of sets of parameter combinations. We employ r.randomwalk and r.ranger for parameter optimization and sensitivity analysis. Thereby we do not focus on parameter values, but - accounting for the uncertainty inherent in all parameters - on parameter ranges. In this sense, we demonstrate two strategies for parameter sensitivity analysis and optimization. 
We avoid (i) one-at-a-time parameter testing, which would fail to account for interdependencies of the parameters, and (ii) exploring all possible
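    The impact indicator index itself reduces to a per-pixel fraction over the stack of model runs; a minimal sketch with synthetic boolean rasters standing in for r.randomwalk outputs:

```python
import numpy as np

rng = np.random.default_rng(1)
# Boolean impact rasters from many runs with varied parameter sets
# (synthetic stand-ins for r.randomwalk outputs on a 50 x 50 grid).
runs = rng.random((100, 50, 50)) < 0.3

# Impact indicator index (III): fraction of runs predicting an impact
# at each pixel, by construction in the range [0, 1].
iii = runs.mean(axis=0)
```

In the real workflow the III map would then be compared against the observed impact area via an ROC plot, as the abstract describes for r.ranger.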

  17. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    Science.gov (United States)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increased trend in automation of the modern manufacturing industry, human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on his knowledge base or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks/tool catalogues with an intelligent knowledge-based selection system. This system employs artificial-intelligence-based techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using MathWorks MATLAB version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of an appropriate cutting tool and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.

  18. Platelet parameters from an automated hematology analyzer in dogs with inflammatory clinical diseases.

    Science.gov (United States)

    Smith, Jo R; Smith, Katherine F; Brainard, Benjamin M

    2014-09-01

    The mean platelet component (MPC) is a proprietary algorithm of an automated laser-based hematology analyzer system that measures the refractive index of platelets. The MPC is related linearly to platelet density and is an indirect index of platelet activation status. Previous investigations of canine inflammatory conditions and models of endotoxemia demonstrated a significant decrease in the MPC, consistent with platelet activation. The purpose of this study was to evaluate the MPC and other platelet parameters in dogs with different diseases to determine if they could show differential platelet activation with different pathologies. The hypothesis was that the MPC would decrease in clinical conditions associated with systemic inflammation or platelet activation. Complete blood counts run on the analyzer from dogs with different inflammatory conditions (primary immune-mediated hemolytic anemia (IMHA) or thrombocytopenia (ITP), pituitary-dependent hyperadrenocorticism, intra-abdominal sepsis, pancreatitis, intravascular thrombus or thromboembolus, and hemangiosarcoma) were reviewed retrospectively and compared with those of control dogs presenting for orthopedic evaluation. Dogs with ITP had a decreased plateletcrit and MPC, with an increased platelet volume and number of large platelets; dogs with IMHA had an increased plateletcrit and platelet mass, and more numerous large platelets (P < 0.001). With the exception of the ITP group, there was no difference in MPC in the diseased groups when compared with the controls. The results of this study suggest the MPC does not change in certain canine diseases associated with systemic inflammation. PMID:25082397

  19. Optimal Design of Variable Stiffness Composite Structures using Lamination Parameters

    NARCIS (Netherlands)

    IJsselmuiden, S.T.

    2011-01-01

    Fiber reinforced composite materials have gained widespread acceptance for a multitude of applications in the aerospace, automotive, maritime and wind-energy industries. Automated fiber placement technologies have developed rapidly over the past two decades, driven primarily by a need to reduce m

  20. Multi-objective Genetic Algorithm for System Identification and Controller Optimization of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Xing Wu

    2011-07-01

    Full Text Available This paper presents a multi-objective genetic algorithm (MOGA) with Pareto optimality and elitist tactics for the control system design of an automated guided vehicle (AGV). The MOGA is used to identify the AGV driving system model and then to optimize its servo control system. In system identification, the model identified by the least squares method is adopted as an evolution tutor that selects the individuals having balanced performance across all objectives as elitists. In controller optimization, the velocity regulating capability required by AGV path tracking is employed as the decision-making preference which selects Pareto optimal solutions as elitists. According to the different objectives and elitist tactics, several sub-populations are constructed and evolve concurrently using independent reproduction, neighborhood mutation and heuristic crossover. The lossless finite precision method and the multi-objective normalized increment distance are proposed to keep the population diverse at low computational complexity. Experimental results show that the cascaded MOGA is able to make the system model consistent with the AGV driving system in both amplitude and phase, and to make its servo control system satisfy the requirements on dynamic performance and steady-state accuracy in AGV path tracking.
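    The Pareto-optimality test at the heart of such elitist selection can be sketched in a few lines; the objective vectors below (e.g. amplitude error vs. phase error) are illustrative, not from the AGV experiments:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective, strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Elitist step: keep only the nondominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. (amplitude error, phase error) pairs for candidate models
pts = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (3.0, 3.0), (2.0, 6.0)]
front = pareto_front(pts)
```

Here (3, 3) is dominated by (2, 2) and (2, 6) by (1, 5), so only the three trade-off points survive as elitists.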

  1. MATHEMATICAL BASES OF CONSTRUCTION OF THE DISCRETE AUTOMATIC MACHINE FOR OPTIMIZATION THE PARAMETERS OF THE SYSTEM

    Directory of Open Access Journals (Sweden)

    Markov V. N.

    2014-11-01

    Full Text Available The article considers discrete automata with memory, designed to search for the parameter values of a system that optimize a figure of merit described by a criterion function. The multi-parameter optimization problem is cast as a discrete optimization problem by representing the values of the parameters of the optimized system as sets of discrete values with a specified digitization step

  2. Optimal design of nanoplasmonic materials using genetic algorithms as a multi-parameter optimization tool

    CERN Document Server

    Yelk, Joseph; Seideman, Tamar

    2008-01-01

    An optimal control approach based on multiple parameter genetic algorithms is applied to the design of plasmonic nanoconstructs with pre-determined optical properties and functionalities. We first develop nanoscale metallic lenses that focus an incident plane wave onto a pre-specified, spatially confined spot. Our results illustrate the role of symmetry breaking and unravel the principles that favor dimeric constructs for optimal light localization. Next we design a periodic array of silver particles to modify the polarization of an incident, linearly-polarized plane wave in a desired fashion while localizing the light in space. The results provide insight into the structural features that determine the birefringence properties of metal nanoparticles and their arrays. Of the variety of potential applications that may be envisioned, we note the design of nanoscale light sources with controllable coherence and polarization properties that could serve for coherent control of molecular or electronic dynamics in t...
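    A generic multi-parameter genetic algorithm of the kind described, with selection, one-point crossover, and Gaussian mutation, can be sketched as follows; the fitness function is a toy surrogate for the electromagnetic merit function, which in practice requires a field solver:

```python
import random

random.seed(42)
TARGET = [0.2, 0.8, 0.5, 0.9]   # hypothetical optimal parameter vector

def fitness(ind):
    # Toy surrogate merit: negative squared distance to TARGET
    # (a real application would evaluate a physical model here).
    return -sum((g - t) ** 2 for g, t in zip(ind, TARGET))

def evolve(pop_size=40, genes=4, gens=60, p_mut=0.1, p_cross=0.9):
    pop = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                        # elitism: keep the best two
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)   # parents from the top quarter
            if random.random() < p_cross:
                cut = random.randrange(1, genes)  # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [min(1.0, max(0.0, g + random.gauss(0.0, 0.1)))
                     if random.random() < p_mut else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
```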

  3. Parameter estimation of nonlinear econometric models using particle swarm optimization

    OpenAIRE

    Mark P Wachowiak; Smolíková-Wachowiak, Renáta; Smolík, Dušan

    2010-01-01

    Global optimization is an essential component of econometric modeling. Optimization in econometrics is often difficult due to irregular cost functions characterized by multiple local optima. The goal of this paper is to apply a relatively new stochastic global technique, particle swarm optimization, to the well-known but difficult disequilibrium problem. Because of its co-operative nature and balance of local and global search, particle swarm is successful in optimizing the disequ...
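    A minimal particle swarm optimizer over a one-dimensional multimodal surface (a stand-in for an irregular econometric cost function; coefficients and seed are arbitrary) illustrates the balance of local ("cognitive") and global ("social") search described above:

```python
import math
import random

random.seed(7)

def cost(x):
    # 1-D Rastrigin-like surface: many local optima, global minimum at x = 0
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    xs = [random.uniform(lo, hi) for _ in range(n)]   # particle positions
    vs = [0.0] * n                                    # particle velocities
    pbest = xs[:]                                     # per-particle best
    gbest = min(xs, key=cost)                         # swarm best
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            # inertia + cognitive (own best) + social (swarm best) pulls
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i]
                if cost(xs[i]) < cost(gbest):
                    gbest = xs[i]
    return gbest

best = pso()
```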

  4. Optimal training dataset composition for SVM-based, age-independent, automated epileptic seizure detection.

    Science.gov (United States)

    Bogaarts, J G; Gommer, E D; Hilkman, D M W; van Kranen-Mastenbroek, V H J M; Reulen, J P H

    2016-08-01

    Automated seizure detection is a valuable asset for health professionals, making adequate treatment possible in order to minimize brain damage. Most research focuses on two separate aspects of automated seizure detection: EEG feature computation and classification methods. Little research has been published regarding optimal training dataset composition for patient-independent seizure detection. This paper evaluates the performance of classifiers trained on different datasets in order to determine the optimal dataset for use in classifier training for automated, age-independent seizure detection. Three datasets are used to train a support vector machine (SVM) classifier: (1) EEG from neonatal patients, (2) EEG from adult patients and (3) EEG from both neonates and adults. To correct for baseline EEG feature differences among patients, feature normalization is essential. Usually, dedicated detection systems are developed for either neonatal or adult patients. Normalization might allow for the development of a single seizure detection system for patients irrespective of their age. Two classifier versions are trained on all three datasets: one with feature normalization and one without. This gives six different classifiers to evaluate using both the neonatal and adult test sets. As a performance measure, the area under the receiver operating characteristic curve (AUC) is used. Application of FBC resulted in performance values of 0.90 and 0.93 for neonatal and adult seizure detection, respectively. For neonatal seizure detection, the classifier trained on EEG from adult patients performed significantly worse than both the classifier trained on EEG data from neonatal patients and the classifier trained on both neonatal and adult EEG data. For adult seizure detection, optimal performance was achieved by either the classifier trained on adult EEG data or the classifier trained on both neonatal and adult EEG data. Our results show that age
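    Why baseline normalization matters when pooling age groups can be illustrated with a toy one-feature example (all numbers invented): without per-group normalization, adult baseline EEG can score higher than neonatal seizure EEG, degrading the pooled ranking; z-scoring each group first restores it. AUC is computed via the Mann-Whitney rank statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, baseline):
    """Synthetic 1-D EEG feature: seizure epochs (y = 1) raise the
    feature by 3 units over a group-specific baseline, plus unit noise."""
    y = rng.integers(0, 2, n)
    return baseline + 3.0 * y + rng.normal(0.0, 1.0, n), y

def auc(scores, y):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

x_neo, y_neo = make_group(2000, baseline=10.0)   # neonatal group
x_adu, y_adu = make_group(2000, baseline=25.0)   # adult group
y_all = np.concatenate([y_neo, y_adu])

zscore = lambda x: (x - x.mean()) / x.std()
auc_raw = auc(np.concatenate([x_neo, x_adu]), y_all)                   # pooled, raw
auc_norm = auc(np.concatenate([zscore(x_neo), zscore(x_adu)]), y_all)  # per-group z-scored
```

The paper's pipeline uses an SVM on many EEG features, but the effect of pooling unnormalized baselines is the same in this one-feature caricature.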

  5. An automated approach to magnetic divertor configuration design, using an efficient optimization methodology

    Energy Technology Data Exchange (ETDEWEB)

    Blommaert, Maarten; Reiter, Detlev [Institute of Energy and Climate Research (IEK-4), FZ Juelich GmbH, D-52425 Juelich (Germany); Heumann, Holger [Centre de Recherche INRIA Sophia Antipolis, BP 93 06902 Sophia Antipolis (France); Baelmans, Martine [KU Leuven, Department of Mechanical Engineering, 3001 Leuven (Belgium); Gauger, Nicolas Ralph [TU Kaiserslautern, Chair for Scientific Computing, 67663 Kaiserslautern (Germany)

    2015-05-01

    At present, several plasma boundary codes exist that attempt to describe the complex interactions in the divertor SOL (Scrape-Off Layer). The predictive capability of these edge codes is still very limited. Yet, in parallel to major efforts to mature edge codes, we face the design challenges for next-step fusion devices. One of them is the design of the helium and heat exhaust system. In past automated design studies, results indicated large potential reductions in peak heat load by an increased magnetic flux divergence towards the target structures. In the present study, a free-boundary magnetic equilibrium solver is included in the simulation chain to verify these tendencies. Additionally, we expanded the applicability of the automated design method by introducing advanced "adjoint" sensitivity computations. This method, inherited from airfoil shape optimization in aerodynamics, allows for a large number of design variables at no additional computational cost. Results are shown for a design application of the new WEST divertor.

  6. Automated digital microfluidic platform for magnetic-particle-based immunoassays with optimization by design of experiments.

    Science.gov (United States)

    Choi, Kihwan; Ng, Alphonsus H C; Fobel, Ryan; Chang-Yen, David A; Yarnell, Lyle E; Pearson, Elroy L; Oleksak, Carl M; Fischer, Andrew T; Luoma, Robert P; Robinson, John M; Audet, Julie; Wheeler, Aaron R

    2013-10-15

    We introduce an automated digital microfluidic (DMF) platform capable of performing immunoassays from sample to analysis with minimal manual intervention. This platform features (a) a 90 Pogo pin interface for digital microfluidic control, (b) an integrated (and motorized) photomultiplier tube for chemiluminescent detection, and (c) a magnetic lens assembly which focuses magnetic fields into a narrow region on the surface of the DMF device, facilitating up to eight simultaneous digital microfluidic magnetic separations. The new platform was used to implement a three-level full factorial design of experiments (DOE) optimization for thyroid-stimulating hormone immunoassays, varying (1) the analyte concentration, (2) the sample incubation time, and (3) the sample volume, resulting in an optimized protocol that reduced the detection limit and sample incubation time by up to 5-fold and 2-fold, respectively, relative to those from previous work. To our knowledge, this is the first report of a DOE optimization for immunoassays in a microfluidic system of any format. We propose that this new platform paves the way for a benchtop tool that is useful for implementing immunoassays in near-patient settings, including community hospitals, physicians' offices, and small clinical laboratories. PMID:23978190
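    The three-level full factorial over the three varied factors gives 3^3 = 27 conditions; a minimal sketch (the level values are illustrative placeholders, not the paper's):

```python
from itertools import product

# Three-level full factorial over the three varied assay factors
# (level values are invented placeholders, not the study's settings).
factors = {
    "analyte_conc_pM": (1, 10, 100),
    "incubation_time_min": (2, 6, 18),
    "sample_volume_uL": (1.5, 3.0, 6.0),
}
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
```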

  7. Automated procedure for selection of optimal refueling policies for light water reactors

    International Nuclear Information System (INIS)

    An automated procedure for determining a minimum-cost refueling policy has been developed for light water reactors. The procedure is an extension of the equilibrium core approach previously devised for pressurized water reactors (PWRs). Use of 1 1/2-group theory has improved the accuracy of the nuclear model and eliminated tedious fitting of albedos. A simple heuristic algorithm for locating a good starting policy has materially reduced PWR computing time. Inclusion of void effects and use of the Haling principle for axial flux calculations extended the nuclear model to boiling water reactors (BWRs). A good initial estimate of the refueling policy is obtained by recognizing that a nearly uniform distribution of reactivity provides low power peaking. The initial estimate is improved upon by interchanging groups of four assemblies and is subsequently refined by interchanging individual assemblies. The method yields very favorable results, is simpler than previously proposed BWR fuel optimization schemes, and retains power cost as the objective function
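    The assembly-interchange refinement can be sketched as a greedy local search; the "reactivity flatness" score below is a toy surrogate for the nuclear model, echoing the abstract's near-uniform-reactivity heuristic:

```python
import random

random.seed(3)
# Toy 16-assembly ring with per-assembly reactivity values; the surrogate
# goal is a near-uniform reactivity map, scored as the negated sum of
# squared differences between neighboring assemblies.
core = [random.uniform(0.8, 1.2) for _ in range(16)]

def flatness(layout):
    return -sum((layout[i] - layout[(i + 1) % len(layout)]) ** 2
                for i in range(len(layout)))

def improve_by_swaps(layout):
    """Greedy refinement: interchange assembly pairs, keeping a swap
    only if it improves the flatness score, until no swap helps."""
    layout = layout[:]
    improved = True
    while improved:
        improved = False
        for i in range(len(layout)):
            for j in range(i + 1, len(layout)):
                before = flatness(layout)
                layout[i], layout[j] = layout[j], layout[i]
                if flatness(layout) > before:
                    improved = True
                else:
                    layout[i], layout[j] = layout[j], layout[i]  # revert
    return layout

refined = improve_by_swaps(core)
```

Each accepted swap strictly improves the score, so the loop terminates; the real procedure scores candidate layouts with the 1 1/2-group nuclear model and power cost instead of this surrogate.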

  8. Knowledge Network Driven Coordination and Robust Optimization to Support Concurrent and Collaborative Parameter Design

    OpenAIRE

    Hu, Jie; Peng, Yinghong; Xiong, Guangleng

    2007-01-01

    Abstract This study presents a parameter coordination and robust optimization approach based on knowledge network modeling. The method allows multidisciplinary designers to coordinate and optimize parameters while taking multidisciplinary knowledge into account. First, a knowledge network model is established, including design knowledge from assembly, manufacture, performance, and simulation. Second, the parameter coordination method is presented to solve the knowledge network model,

  9. Comparison of haematological parameters determined by the Sysmex KX-21N automated haematology analyzer and the manual counts

    OpenAIRE

    Shu Elvis N; Nubila Imelda N; Ukaejiofo Ernest O; Nubila Thomas; Ike Samuel O; Ezema Ifeyinwa

    2010-01-01

    Abstract Background This study was designed to determine the correlation of haematological parameters between the Sysmex KX-21N automated haematology analyzer and manual methods. Method Sixty (60) subjects were randomly selected from both apparently healthy subjects and those with different blood disorders at the University of Nigeria Teaching Hospital (UNTH), Ituku-Ozalla, Enugu, Enugu State, Nigeria. Three (3) ml of venous blood was collected aseptically from each subject into tri-pota

  10. Evaluation of a Multi-Parameter Sensor for Automated, Continuous Cell Culture Monitoring in Bioreactors

    Science.gov (United States)

    Pappas, D.; Jeevarajan, A.; Anderson, M. M.

    2004-01-01

    Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments in microgravity. Measurement of cell culture medium allows for the optirn.jzation of culture conditions on orbit to maximize cell growth and minimize unnecessary exchange of medium. While several discrete sensors exist to measure culture health, a multi-parameter sensor would simplify the experimental apparatus. One such sensor, the Paratrend 7, consists of three optical fibers for measuring pH, dissolved oxygen (p02), dissolved carbon dioxide (pC02) , and a thermocouple to measure temperature. The sensor bundle was designed for intra-arterial placement in clinical patients, and potentially can be used in NASA's Space Shuttle and International Space Station biotechnology program bioreactors. Methods: A Paratrend 7 sensor was placed at the outlet of a rotating-wall perfused vessel bioreactor system inoculated with BHK-21 (baby hamster kidney) cells. Cell culture medium (GTSF-2, composed of 40% minimum essential medium, 60% L-15 Leibovitz medium) was manually measured using a bench top blood gas analyzer (BGA, Ciba-Corning). Results: A Paratrend 7 sensor was used over a long-term (>120 day) cell culture experiment. The sensor was able to track changes in cell medium pH, p02, and pC02 due to the consumption of nutrients by the BHK-21. When compared to manually obtained BGA measurements, the sensor had good agreement for pH, p02, and pC02 with bias [and precision] of 0.02 [0.15], 1 mm Hg [18 mm Hg], and -4.0 mm Hg [8.0 mm Hg] respectively. The Paratrend oxygen sensor was recalibrated (offset) periodically due to drift. The bias for the raw (no offset or recalibration) oxygen measurements was 42 mm Hg [38 mm Hg]. The measured response (rise) time of the sensor was 20 +/- 4s for pH, 81 +/- 53s for pC02, 51 +/- 20s for p02. For long-term cell culture measurements, these response times are more than adequate. 
Based on these findings, the Paratrend sensor could

  11. Optimization of operational aircraft parameters reducing noise emission

    OpenAIRE

    Abdallah, Lina; Khardi, Salah; Haddou, Mounir

    2010-01-01

    The objective of this paper is to develop a model and a minimization method to provide optimal flight paths reducing aircraft noise in the vicinity of airports. The optimization algorithm solves a complex optimal control problem and generates flight paths minimizing aircraft noise levels. Operational and safety constraints have been considered and their limits satisfied. Results are presented and discussed.

  12. Optimization of operational aircraft parameters reducing noise emission

    CERN Document Server

    Abdallah, Lina; Khardi, Salah

    2008-01-01

    The objective of this paper is to develop a model and a minimization method to provide optimal flight paths reducing aircraft noise in the vicinity of airports. The optimization algorithm solves a complex optimal control problem and generates flight paths minimizing aircraft noise levels. Operational and safety constraints have been considered and their limits satisfied. Results are presented and discussed.

  13. Optimization of the ARIES-CS compact stellarator reactor parameters

    International Nuclear Information System (INIS)

    optimum reactor size are the minimum distance between coils, neutron and radiative power flux to the wall, and the beta limit. A reactor systems/optimization code is used to optimize the reactor parameters for minimum cost of electricity subject to a large number of physics, engineering, materials, and reactor component constraints. Different transport models, reactor component models, and costing algorithms are used to test sensitivities to different models and assumptions. A 1-D power balance code is used to study the path to ignition and the effect of different plasma and confinement assumptions including density and temperature profiles, impurity density levels and peaking near the outside, confinement scaling, beta limits, alpha particle losses, etc. for each plasma and coil configuration. Variations on two different magnetic configurations were analyzed in detail: a three-field-period (M = 3) NCSX-based plasma with coils modified to allow a larger plasma-coil spacing, and an M = 2 plasma with coils that are closer to the plasma on the outboard side with less toroidal excursion. The reactors have major radii R in the 7-9 m range with an improved blanket and shield concept and an advanced superconducting coil approach. The results show that compact stellarator reactors should be cost competitive with tokamak reactors. (author)

  14. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    Science.gov (United States)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and proved that the optimum ducted propeller improves hydrodynamic performance as predicted.

  15. Optimization of non-linear mass damper parameters for transient response

    DEFF Research Database (Denmark)

    Jensen, Jakob Søndergaard; Lazarov, Boyan Stefanov

    2008-01-01

    We optimize the parameters of multiple non-linear mass dampers based on numerical simulation of transient wave propagation through a linear mass-spring carrier structure. Topology optimization is used to obtain optimized distributions of damper mass ratio, natural frequency, damping ratio and nonlinear stiffness coefficient. Large improvements in performance are obtained with optimized parameters, and it is shown that nonlinear mass dampers can be more effective for wave attenuation than linear mass dampers.

  16. Characterization and optimized control by means of multi-parameter controllers

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Carsten; Hoeg, S.; Thoegersen, A. (Dan-Ejendomme, Hellerup (Denmark)) (and others)

    2009-07-01

    Poorly functioning HVAC systems (Heating, Ventilation and Air Conditioning), but also separate heating, ventilation and air conditioning systems, are costing Danish society billions of kroner every year: partly because of increased energy consumption and high operational and maintenance costs, but mainly due to reduced productivity and illness-related absence caused by a poor indoor climate. Typically, the operation of buildings and installations today takes place with traditional building automation, which is characterised by 1) being based on static considerations, 2) the individual sensor being coupled with one actuator/valve, i.e. the sensor's signal is only used in one place in the system, 3) subsystems often being controlled independently of each other, and 4) the dynamics in building constructions and systems, which are very important to the system and comfort regulation, not being considered. This, coupled with the widespread tendency to use large glass areas in facades without sufficient sun shading, means that it is difficult to optimise comfort and energy consumption. Therefore, the last 10-20 years have seen a steady increase in complaints about the indoor climate in Danish buildings and, at the same time, new buildings often turn out to consume considerably more energy than expected. The purpose of the present project is to investigate the types of multi-parameter sensors which may be generated for buildings, and further to carry out a preliminary evaluation of how such multi-parameter controllers may be utilized for optimal control of buildings. The aim of the project isn't to develop multi-parameter controllers - this requires much more effort than is possible in the present project. The aim is to show the potential of using multi-parameter sensors when controlling buildings. For this purpose a larger office building has been chosen - an office building with a high energy demand and complaints regarding the indoor climate. In order to

  17. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
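The core idea of this record, choosing filter parameters by maximizing the entropy of the processed image, can be sketched with a simplified stand-in: a plain high-pass step and a grid search take the place of CLAHE and the interior-point optimizer, and the image, weight range and sigma below are all hypothetical.

```python
import numpy as np

def shannon_entropy(img, bins=64):
    # Histogram entropy of intensities; the optimization target in the abstract.
    hist, _ = np.histogram(img, bins=bins)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def gaussian_blur(img, sigma):
    # Separable Gaussian blur with numpy only.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:128, 0:128]
# Synthetic low-contrast "setup image": smooth structure plus faint noise.
image = (0.4 + 0.05 * np.sin(xx / 9.0) + 0.05 * np.cos(yy / 13.0)
         + 0.02 * rng.random((128, 128)))

# Grid-search the high-pass weighting factor for maximum entropy.
best_w, best_h = 0.0, -np.inf
for w in np.linspace(0.0, 0.9, 10):
    enhanced = image - w * gaussian_blur(image, sigma=4.0)
    enhanced = (enhanced - enhanced.min()) / (np.ptp(enhanced) + 1e-12)
    h = shannon_entropy(enhanced)
    if h > best_h:
        best_w, best_h = w, h
```

A real implementation would optimize all three parameters jointly with a constrained optimizer, as the abstract describes; the grid search here only illustrates the entropy-as-objective idea.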

  18. DEFINITION OF THE OPTIMAL PARAMETERS OF SUPERSONIC EXTRACTION AT STACHYS NODULES EXTRACTION

    OpenAIRE

    Zakharova, L.; Dyatlov, A.

    2013-01-01

    The research allowed determination of the amount of extractives in stachys nodules and substantiation of the optimal parameters of supersonic extraction for obtaining water-based extracts from stachys nodules.

  19. Optimization of structural parameters for spatial flexible redundant manipulators with maximum ratio of load to mass

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xu-ping; YU Yue-qing

    2005-01-01

    Optimization of structural parameters aimed at improving the load carrying capacity of spatial flexible redundant manipulators is presented in this paper. In order to increase the ratio of load to mass of robots, the cross-sectional and configurational parameters are optimized first separately and then simultaneously. A numerical simulation of a 4R spatial manipulator is performed. The results show that the load capacity of the robots is greatly improved through the optimization strategies proposed in this paper.

  20. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    Science.gov (United States)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments, focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability and robustness underlie all of these applications, and research in each of these topics will be presented.

  1. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    Full Text Available Abstract: The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world are implementing manual methods to measure power consumption for further assessment of voltage violation. Such a process has proved to be time-consuming, costly and inaccurate. Also, demand response is a grid management technique where retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violation cases that would happen when adding a new load to the grid. The process of identifying the optimal bus bar location involves a complicated calculation of the power consumption at each load bus. The DPL program considers all the IEEE 30-bus internal network data, then executes a load flow simulation, adding the new load to the first bus in the network. The developed model then simulates the new load at each available bus bar in the network and generates three analytical reports for each case, capturing the over/under-voltage and the loading of elements across the grid.
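The bus-screening loop described in this record can be illustrated with a toy stand-in for the DPL workflow. The per-bus voltage sensitivities, limits and load size below are invented for illustration; a real study would run a full load flow per candidate bus.

```python
import numpy as np

rng = np.random.default_rng(4)
n_bus = 30                                     # IEEE 30-bus sized toy network
base_voltage = rng.uniform(0.97, 1.03, n_bus)  # p.u. voltages before the new load
sensitivity = rng.uniform(0.01, 0.05, n_bus)   # toy p.u. voltage drop per MW
new_load_mw = 1.5

reports = []
for bus in range(n_bus):
    v = base_voltage.copy()
    v[bus] -= sensitivity[bus] * new_load_mw   # crude stand-in for a load flow run
    n_violations = int(np.sum((v < 0.95) | (v > 1.05)))
    reports.append((bus, float(v[bus]), n_violations))

# Optimal location: no violations and the smallest deviation from 1.0 p.u.
feasible = [r for r in reports if r[2] == 0]
best_bus = min(feasible, key=lambda r: abs(r[1] - 1.0))[0]
```

Each tuple in `reports` plays the role of one of the per-case analytical reports; the selection rule at the end is one reasonable criterion, not the paper's.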

  2. Direct Multiple Shooting Optimization with Variable Problem Parameters

    Science.gov (United States)

    Whitley, Ryan J.; Ocampo, Cesar A.

    2009-01-01

    Taking advantage of a novel approach to the design of the orbital transfer optimization problem and advanced non-linear programming algorithms, several optimal transfer trajectories are found for problems with and without known analytic solutions. This method treats the fixed known gravitational constants as optimization variables in order to reduce the need for an advanced initial guess. Complex periodic orbits are targeted with very simple guesses and the ability to find optimal transfers in spite of these bad guesses is successfully demonstrated. Impulsive transfers are considered for orbits in both the 2-body frame as well as the circular restricted three-body problem (CRTBP). The results with this new approach demonstrate the potential for increasing robustness for all types of orbit transfer problems.

  3. Parameter estimation for time-delay chaotic system by particle swarm optimization

    International Nuclear Information System (INIS)

    The knowledge about time delays and parameters is very important for control and synchronization of time-delay chaotic system. In this paper, parameter estimation for time-delay chaotic system is given by treating the time delay as an additional parameter. The parameter estimation is converted to an optimization problem, which finds a best parameter combination such that an objective function is minimized. Particle swarm optimization (PSO) is used to optimize the objective function through particles' cooperation and evolution. Two illustrative examples are given to show the validity of the proposed method.
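A minimal sketch of the PSO parameter-estimation idea: a discrete logistic map stands in for the chaotic system, and its single parameter is recovered from a short "measured" series by minimizing the squared tracking error. All constants (swarm size, coefficients, bounds) are illustrative.

```python
import numpy as np

def simulate(a, n=8, x0=0.3):
    # Discrete logistic map as a stand-in for the chaotic system.
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = a * x[t] * (1.0 - x[t])
    return x

observed = simulate(3.7)          # "measured" series, true parameter a = 3.7

def objective(a):
    # Squared error between candidate and observed trajectories.
    return float(np.sum((simulate(a) - observed) ** 2))

# Minimal PSO over the scalar parameter a in [3.5, 4.0].
rng = np.random.default_rng(0)
pos = rng.uniform(3.5, 4.0, 30)
vel = np.zeros(30)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()]

for _ in range(60):
    r1, r2 = rng.random(30), rng.random(30)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 3.5, 4.0)
    val = np.array([objective(p) for p in pos])
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()]
```

In the paper's setting the time delay would simply be appended as an extra coordinate of each particle, so the same update equations apply unchanged.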

  4. Optimizing Soil Hydraulic Parameters in RZWQM2 Under Fallow Conditions

    Science.gov (United States)

    Effective estimation of soil hydraulic parameters is essential for predicting soil water dynamics and related biochemical processes in agricultural systems. However, high uncertainties in estimated parameter values limit a model’s skill for prediction and application. In this study, a global search ...

  5. Parameter optimization method for the water quality dynamic model based on data-driven theory.

    Science.gov (United States)

    Liang, Shuxiu; Han, Songlin; Sun, Zhaochen

    2015-09-15

    Parameter optimization is important for developing a water quality dynamic model. In this study, we applied a data-driven method to select and optimize parameters for a complex three-dimensional water quality model. First, a data-driven model was developed to train the response relationship between phytoplankton and environmental factors based on the measured data. Second, an eight-variable water quality dynamic model was established and coupled to a physical model. Parameter sensitivity was investigated by changing parameter values individually within an assigned range. The above results served as guidelines for control parameter selection and verification of the simulated results. Finally, using the data-driven model to approximate the computational water quality model, we employed the Particle Swarm Optimization (PSO) algorithm to optimize the control parameters. The optimization routines and results were analyzed and discussed based on the establishment of the water quality model in Xiangshan Bay (XSB). PMID:26277602

  6. Optimal parameters for the FFA-Beddoes dynamic stall model

    Energy Technology Data Exchange (ETDEWEB)

    Bjoerck, A.; Mert, M. [FFA, The Aeronautical Research Institute of Sweden, Bromma (Sweden); Madsen, H.A. [Risoe National Lab., Roskilde (Denmark)

    1999-03-01

    Unsteady aerodynamic effects, like dynamic stall, must be considered in the calculation of dynamic forces for wind turbines. Models incorporated in aero-elastic programs are of semi-empirical nature. Resulting aerodynamic forces therefore depend on the values used for the semi-empirical parameters. In this paper a study of finding appropriate parameters to use with the Beddoes-Leishman model is discussed. Minimisation of the 'tracking error' between results from 2D wind tunnel tests and simulation with the model is used to find optimum values for the parameters. The resulting optimum parameters show a large variation from case to case. Using these different sets of optimum parameters in the calculation of blade vibrations gives rise to quite different predictions of aerodynamic damping, which is discussed. (au)
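The tracking-error minimisation described here can be illustrated in miniature: fit one hypothetical semi-empirical time constant to synthetic "wind tunnel" data by scanning candidates and keeping the minimiser of the squared error. The first-order lag model is a stand-in, not the Beddoes-Leishman model.

```python
import numpy as np

# Synthetic "2D wind tunnel" response: first-order lag toward a step input,
# standing in for dynamic-stall data; the true time constant is tau = 0.8 s.
t = np.linspace(0.0, 5.0, 200)
measured = 1.0 - np.exp(-t / 0.8)

def model(tau):
    # Semi-empirical model response for a candidate time constant.
    return 1.0 - np.exp(-t / tau)

def tracking_error(tau):
    # Squared tracking error between simulation and "tunnel" data.
    return float(np.sum((model(tau) - measured) ** 2))

# Scan candidate time constants and keep the minimiser.
taus = np.linspace(0.1, 2.0, 400)
errors = np.array([tracking_error(tau) for tau in taus])
tau_opt = float(taus[np.argmin(errors)])
```

With several coupled parameters, as in the actual model, a gradient-based or global optimizer would replace the one-dimensional scan, but the objective stays the same.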

  7. A Particle Swarm Optimization Algorithm for Optimal Operating Parameters of VMI Systems in a Two-Echelon Supply Chain

    Science.gov (United States)

    Sue-Ann, Goh; Ponnambalam, S. G.

    This paper focuses on the operational issues of a Two-echelon Single-Vendor-Multiple-Buyers Supply Chain (TSVMBSC) under the vendor managed inventory (VMI) mode of operation. To determine the optimal sales quantity for each buyer in the TSVMBSC, a mathematical model is formulated. From the optimal sales quantity, the optimal sales price can be obtained, which determines the optimal channel profit and the contract price between the vendor and buyer. All these parameters depend upon the understanding of the revenue sharing between the vendor and buyers. A Particle Swarm Optimization (PSO) algorithm is proposed for this problem. Solutions obtained from PSO are compared with the best-known results reported in the literature.

  8. Optimization of Russian roulette parameters for the KENO computer code

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, T.J.

    1982-10-01

    Proper specification of the (statistical) weight standards for Monte Carlo calculations can lead to a substantial reduction in computer time. Frequently these weights are set intuitively. When optimization is performed, it is usually based on a simplified model (to enable mathematical analysis) and involves minimization of the sample variance. In this report, weight standards are optimized through consideration of the actual implementation of Russian roulette in the KENO computer code. The goal is minimization of computer time rather than minimization of sample variance. Verification of the development and assumptions is obtained from Monte Carlo simulations. The results indicate that the current default weight standards are appropriate for most problems in which thermal neutron transport is not a major consumer of computer time. For thermal systems, the optimization technique described in this report should be used.
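The Russian roulette mechanism whose weight standards this record optimizes can be sketched as follows: particles below a weight cutoff survive with probability proportional to their weight and are promoted, which thins the population while preserving the total weight in expectation. The cutoff and survival weights below are illustrative, not KENO's defaults.

```python
import numpy as np

rng = np.random.default_rng(2)

def russian_roulette(weights, w_low=0.3, w_avg=1.0):
    # Particles below the cutoff survive with probability w / w_avg and are
    # promoted to weight w_avg; killed particles are removed. The expected
    # total weight is unchanged, so the estimator stays unbiased.
    out = weights.copy()
    low = out < w_low
    survive = rng.random(low.sum()) < out[low] / w_avg
    out[low] = np.where(survive, w_avg, 0.0)
    return out[out > 0.0]

weights = rng.exponential(0.5, 200_000)   # illustrative particle weights
after = russian_roulette(weights)
```

Lowering `w_low` tracks more low-weight particles (more CPU time, less variance); raising it kills more (less CPU time, more variance) — exactly the trade-off the report's optimization targets.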

  9. Parameter design and optimization of tight-lattice rod bundles

    International Nuclear Information System (INIS)

    Thin rod bundles with a tight lattice are arranged according to an equilateral-triangle grid; the proportion of fuel is large, and the power density of the core is high. Based on an analysis of core performance, the ABV-6M reactor is taken as an example, and two objective functions, power density and coolant flow rate, are proposed for the optimization calculation. The diameter and pitch of the rods are optimized using the GA method. The results, which pass the safety check, show that the tight lattice is effective for improving the power density and other performances of the reactor core. (author)

  10. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern

    Science.gov (United States)

    Borot de Battisti, M.; Maenhout, M.; de Senneville, B. Denis; Hautvast, G.; Binnekamp, D.; Lagendijk, J. J. W.; van Vulpen, M.; Moerland, M. A.

    2015-10-01

    Focal high-dose-rate (HDR) for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single-divergent needle-implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm³ to 23.3 cm³) by using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions.

  11. An automated optimization tool for high-dose-rate (HDR) prostate brachytherapy with divergent needle pattern.

    Science.gov (United States)

    Borot de Battisti, M; Maenhout, M; Denis de Senneville, B; Hautvast, G; Binnekamp, D; Lagendijk, J J W; van Vulpen, M; Moerland, M A

    2015-10-01

    Focal high-dose-rate (HDR) for prostate cancer has gained increasing interest as an alternative to whole gland therapy as it may contribute to the reduction of treatment related toxicity. For focal treatment, optimal needle guidance and placement is warranted. This can be achieved under MR guidance. However, MR-guided needle placement is currently not possible due to space restrictions in the closed MR bore. To overcome this problem, an MR-compatible, single-divergent needle-implant robotic device is under development at the University Medical Centre, Utrecht: placed between the legs of the patient inside the MR bore, this robot will tap the needle in a divergent pattern from a single rotation point into the tissue. This rotation point is just beneath the perineal skin to have access to the focal prostate tumor lesion. Currently, there is no treatment planning system commercially available which allows optimization of the dose distribution with such a needle arrangement. The aim of this work is to develop an automatic inverse dose planning optimization tool for focal HDR prostate brachytherapy with needle insertions in a divergent configuration. A complete optimizer workflow is proposed which includes the determination of (1) the position of the center of rotation, (2) the needle angulations and (3) the dwell times. Unlike most currently used optimizers, no prior selection or adjustment of input parameters such as minimum or maximum dose or weight coefficients for treatment region and organs at risk is required. To test this optimizer, a planning study was performed on ten patients (treatment volumes ranged from 8.5 cm³ to 23.3 cm³) by using 2-14 needle insertions. The total computation time of the optimizer workflow was below 20 min and a clinically acceptable plan was reached on average using only four needle insertions. PMID:26378657

  12. Individual Parameter Selection Strategy for Particle Swarm Optimization

    OpenAIRE

    Cai, Xingjuan; Cui, Zhihua; Zeng, Jianchao; Tan, Ying

    2009-01-01

    This chapter proposes a new model incorporating the characteristic differences of each particle, and individual selection strategies for the inertia weight, cognitive learning factor and social learning factor are discussed, respectively. Simulation results show that the individual selection strategy maintains a fast search speed and is robust. Further research should be made on the individual structure for particle swarm optimization.

  13. Air Compressor Driving with Synchronous Motors at Optimal Parameters

    Directory of Open Access Journals (Sweden)

    Iuliu Petrica

    2010-10-01

    Full Text Available In this paper, a method of optimal compensation of the reactive load by synchronous motors driving air compressors in mining enterprises is presented, taking into account that in this case the great majority of the equipment (compressors, pumps) generally works at a constant load.

  14. Optimization of parameters of single-beam gamma absorption concentration meter within definite measuring range

    International Nuclear Information System (INIS)

    Problems of two-parameter optimization of single-beam gamma absorption concentration meters in the assigned measurement range are considered. It is shown that the maximum absolute and relative statistical measurement errors occur at the boundaries of the measurement range for any values of the variable parameters. Optimization of single-beam gamma absorption concentration meter parameters for a number of binary solutions is performed.

  15. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dengwang; Wang, Jie [College of Physics and Electronics, Shandong Normal University, Jinan, Shandong (China); Kapp, Daniel S.; Xing, Lei [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems of fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for the shape dictionary and parameter estimation. The remaining 50 datasets were used as test images. In our study, liver segmentation was formulated as an optimization process of an implicit function. The liver region was optimized via local and global optimization during iterations. Our method consists of five steps: 1) The livers from the panel data were segmented manually by physicians, and we then estimated the parameters of a GMM (Gaussian mixture model) and an MRF (Markov random field). A shape dictionary was built by utilizing the 3D liver shapes. 2) The outlines of the chest and abdomen were located according to rib structure in the input images, and the liver region was initialized based on the GMM. 3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge for local optimization. 4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary for global optimization. Furthermore, H-PSO (Hybrid Particle Swarm Optimization) was employed to solve the SSR equation. 5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iteration was repeated within the local optimization and global optimization until it satisfied the stopping conditions (maximum iterations and changing rate). Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy with the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  16. SU-E-J-130: Automating Liver Segmentation Via Combined Global and Local Optimization

    International Nuclear Information System (INIS)

    Purpose: The aim of this work is to develop a robust algorithm for accurate segmentation of the liver, with special attention paid to the problems of fuzzy edges and tumors. Methods: 200 CT images were collected from a radiotherapy treatment planning system. 150 datasets were selected as the panel data for the shape dictionary and parameter estimation. The remaining 50 datasets were used as test images. In our study, liver segmentation was formulated as an optimization process of an implicit function. The liver region was optimized via local and global optimization during iterations. Our method consists of five steps: 1) The livers from the panel data were segmented manually by physicians, and we then estimated the parameters of a GMM (Gaussian mixture model) and an MRF (Markov random field). A shape dictionary was built by utilizing the 3D liver shapes. 2) The outlines of the chest and abdomen were located according to rib structure in the input images, and the liver region was initialized based on the GMM. 3) The liver shape for each 2D slice was adjusted using the MRF within the neighborhood of the liver edge for local optimization. 4) The 3D liver shape was corrected by employing SSR (sparse shape representation) based on the liver shape dictionary for global optimization. Furthermore, H-PSO (Hybrid Particle Swarm Optimization) was employed to solve the SSR equation. 5) The corrected 3D liver was divided into 2D slices as input data for the third step. The iteration was repeated within the local optimization and global optimization until it satisfied the stopping conditions (maximum iterations and changing rate). Results: The experiments indicated that our method performed well even for CT images with fuzzy edges and tumors. Compared with physician-delineated results, the segmentation accuracy with the 50 test datasets (VOE, volume overlap percentage) was on average 91%–95%. Conclusion: The proposed automatic segmentation method provides a sensible technique for segmentation of CT images. This work is

  17. AN IMPROVED GENETIC ALGORITHM FOR SEARCHING OPTIMAL PARAMETERS IN n-DIMENSIONAL SPACE

    Institute of Scientific and Technical Information of China (English)

    Tang Bin; Hu Guangrui

    2002-01-01

    An improved genetic algorithm for searching optimal parameters in n-dimensional space is presented, which encodes movement direction and distance and searches from coarse to precise. The algorithm can realize global optimization and improve the search efficiency, and can be applied effectively in industrial optimization, data mining and pattern recognition.
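A toy version of the encode-direction-and-distance, coarse-to-precise idea described in this record: offspring are generated by moving parents in a random direction by a bounded distance, and the distance bound is annealed each generation. The objective, population size and annealing factor are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(p):
    # Toy objective with optimum at (1, -2), standing in for the n-D search.
    return (p[..., 0] - 1.0) ** 2 + (p[..., 1] + 2.0) ** 2

pop = rng.uniform(-5.0, 5.0, (40, 2))
step = 2.0                     # mutation distance, annealed coarse -> precise
for gen in range(80):
    order = np.argsort(f(pop))
    parents = pop[order[:20]]  # elitist selection: keep the best half
    # Offspring: each parent moves in a random direction by up to `step`.
    theta = rng.uniform(0.0, 2.0 * np.pi, 20)
    dist = rng.uniform(0.0, step, 20)
    moves = np.stack([np.cos(theta), np.sin(theta)], axis=1) * dist[:, None]
    pop = np.vstack([parents, parents + moves])
    step *= 0.95               # shrink the search radius each generation

best = pop[np.argmin(f(pop))]
```

The early large steps give the global, coarse exploration; the shrinking radius provides the precise local refinement the abstract alludes to.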

  18. Optimizing human-system interface automation design based on a skill-rule-knowledge framework

    International Nuclear Information System (INIS)

    This study considers the technological change that has occurred in complex systems within the past 30 years, and the role of human operators in controlling and interacting with complex systems following this change. Modernization of instrumentation and control systems and components leads to a new issue of human-automation interaction, in which human operational performance must be considered in automated systems. Human-automation interaction can differ in its types and levels. A system design issue is usually realized: given these technical capabilities, which system functions should be automated and to what extent? A good automation design can be achieved by making an appropriate human-automation function allocation. To our knowledge, only a few studies have been published on how to achieve appropriate automation design with a systematic procedure. Further, there is a surprising lack of information on examining and validating the influences of levels of automation (LOAs) on instrumentation and control systems in the advanced control room (ACR). The study presented in this paper proposes a systematic framework to help in making an appropriate decision on types of automation (TOA) and LOAs based on a 'Skill-Rule-Knowledge' (SRK) model. The evaluation results show that the use of either automatic mode or semiautomatic mode alone is insufficient to prevent human errors. For preventing the occurrence of human errors and ensuring safety in the ACR, the proposed framework can be valuable for making decisions on human-automation allocation.

  19. Problems of optimization of the accelerating system parameters of induction linear accelerators

    International Nuclear Information System (INIS)

    Optimization of the accelerating system of an induction linear accelerator (ILAC) is discussed. Computerized optimization of ILAC parameters requires analytical dependences that relate the accelerator elements to its operating conditions. In deriving the objective function, the relative importance of the optimized parameters is taken into account through weight factors whose values can vary from 0 to 1. Because the objective function has a single minimum, a simple optimization algorithm, the gradient method, could be employed. The optimization assumes that the most common criteria for evaluating an ILAC are its efficiency, relative cost, and specific energy capacity.

  20. Parameter Identification of Anaerobic Wastewater Treatment Bioprocesses Using Particle Swarm Optimization

    OpenAIRE

    Dorin Sendrescu

    2013-01-01

    This paper deals with offline parameter identification for a class of wastewater treatment bioprocesses using particle swarm optimization (PSO) techniques. Particle swarm optimization is a relatively new heuristic method that has produced promising results for solving complex optimization problems. In this paper, several variants of the PSO algorithm are used for parameter estimation of an anaerobic wastewater treatment process, a complex biotechnological system. The identification sc...
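    The flavor of such an identification scheme can be sketched with a minimal global-best PSO fitted to synthetic data from a first-order decay model; the model, parameter bounds, and swarm settings below are illustrative assumptions, not the paper's bioprocess model or PSO variants:

```python
import math
import random

def pso(loss, bounds, n_particles=30, iters=100, seed=1):
    # Minimal global-best PSO; bounds is a list of (lo, hi) per parameter.
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]        # keep particles inside the box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = loss(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# synthetic "measurements" from a first-order decay model y = a*exp(-b*t)
true_a, true_b = 2.0, 0.5
data = [(t, true_a * math.exp(-true_b * t)) for t in range(10)]
loss = lambda p: sum((y - p[0] * math.exp(-p[1] * t)) ** 2 for t, y in data)
est, err = pso(loss, [(0.0, 5.0), (0.0, 2.0)])
```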

  1. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.

  2. Application of optimal input synthesis to aircraft parameter identification

    Science.gov (United States)

    Gupta, N. K.; Hall, W. E., Jr.; Mehra, R. K.

    1976-01-01

    The Frequency Domain Input Synthesis procedure is used in identifying the stability and control derivatives of an aircraft. By using a frequency-domain approach, one can handle criteria that are not easily handled by time-domain approaches. Numerical results are presented for optimal elevator deflections to estimate the longitudinal stability and control derivatives subject to root-mean-square constraints on the input. The applicability of the steady state optimal inputs to finite duration flight testing is investigated. The steady state approximation of frequency-domain synthesis is good for data lengths greater than two time cycles for the short period mode of the aircraft longitudinal motions. Phase relationships between different frequency components become important for shorter data lengths. The frequency domain inputs are shown to be much better than the conventional doublet inputs.

  3. Mechanical surface treatment of steel-Optimization parameters of regime

    Science.gov (United States)

    Laouar, L.; Hamadache, H.; Saad, S.; Bouchelaghem, A.; Mekhilef, S.

    2009-11-01

    Mechanical surface treatment by superficial plastic deformation is employed to finish machined part surfaces. It introduces structural modifications that give the base material new properties, improving the physical and geometrical quality of the superficial layers. This study focuses on the application of a ball-burnishing treatment to XC48 steel and on the optimisation of the treatment regime parameters. Three important parameters were considered: burnishing force 'Py', burnishing feed 'f' and ball radius 'r'. An empirical model was developed to illustrate the relationship between these parameters and the superficial layer characteristics, defined by surface roughness 'Ra' and superficial hardness 'Hv'. A program was developed to determine the optimum treatment regimes for each characteristic.

  4. Parameter estimation and optimal experimental design in flow reactors

    OpenAIRE

    Carraro, Thomas

    2005-01-01

    In this work we present numerical techniques, based on the finite element method, for the simulation of reactive flows in a chemical flow reactor, as well as for the identification of the kinetics of the reactions using measurements of observable quantities. We present the case of a real experiment in which the reaction rate is estimated by means of concentration measurements. We introduce methods for the optimal design of experiments in the context of reactive flows modeled by par...

  5. A new ensemble algorithm of differential evolution and backtracking search optimization algorithm with adaptive control parameter for function optimization

    Directory of Open Access Journals (Sweden)

    Sukanta Nama

    2016-04-01

    Full Text Available Differential evolution (DE) is an effective and powerful approach that has been widely used in different environments. However, the performance of DE is sensitive to the choice of control parameters; thus, to obtain optimal performance, time-consuming parameter tuning is necessary. The Backtracking Search Optimization Algorithm (BSA) is a new evolutionary algorithm (EA) for solving real-valued numerical optimization problems. An ensemble algorithm called E-BSADE is proposed, which incorporates concepts from DE and BSA. The performance of E-BSADE is evaluated on several benchmark functions and compared with basic DE, BSA and a conventional DE mutation strategy.
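    A minimal sketch of the underlying machinery, assuming nothing about E-BSADE's actual operators: DE/rand/1/bin with a scale factor F drawn anew for each mutation, a crude stand-in for the adaptive control parameters the abstract motivates:

```python
import random

def de(loss, bounds, np_=20, iters=150, cr=0.9, seed=2):
    # DE/rand/1/bin with a randomized per-mutation scale factor F
    # (illustrative sketch only, not the E-BSADE algorithm).
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [loss(x) for x in pop]
    for _ in range(iters):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            f = rng.uniform(0.4, 0.9)     # randomized control parameter
            j_rand = rng.randrange(dim)   # guarantee one mutated gene
            trial = [pop[a][d] + f * (pop[b][d] - pop[c][d])
                     if (rng.random() < cr or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            tf = loss(trial)
            if tf <= fit[i]:              # greedy one-to-one selection
                pop[i], fit[i] = trial, tf
    best = min(range(np_), key=lambda i: fit[i])
    return pop[best], fit[best]

best_x, best_f = de(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```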

  6. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, so superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphics Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data, which allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort across the iterations in time-limited reconstructions. (author)

  7. Design of Digital Imaging System for Optimization of Control Parameters

    Institute of Scientific and Technical Information of China (English)

    SONG Yong; HAO Qun; YANG Guang; SUN Hong-wei

    2007-01-01

    The design of an experimental digital imaging system for the optimization of control parameters is discussed in detail. Signal processing in the digital CCD imaging system is first analyzed. Real-time control of the CCD driver and the digital processing circuit, together with man-machine interaction, is then achieved through the design of the digital CCD imaging module and the control module. Experimental results indicate that the image quality of the CCD experimental system responds well to changes in the control parameters. The system provides an important basis for improving image quality and the applicability of micro imaging systems in complex environments.

  8. Optimization of control parameters for petroleum waste composting

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Composting is widely employed in the treatment of petroleum waste. The purpose of this study was to find the optimum control parameters for in-vessel composting of petroleum waste. Various physical and chemical parameters were monitored to evaluate their influence on the microbial communities present in composting. CO2 evolution and the number of microorganisms were measured as indicators of composting activity. The results demonstrated that the optimum temperature, pH and moisture content were 56.5-59.5°C, 7.0-8.5 and 55%-60%, respectively. Under the optimum conditions, the removal efficiency of petroleum hydrocarbons reached 83.29% after 30 days of composting.

  9. Model of Stochastic Automation Asymptotically Optimal Behavior for Inter-budget Regulation

    Directory of Open Access Journals (Sweden)

    Elena D. Streltsova

    2013-01-01

    Full Text Available This paper is focused on the topical issue of inter-budget control in the structure ↔ by applying econometric models. To create the decision-making model, the mathematical tools of the theory of stochastic automata operating in random environments were used. On the basis of these tools, an adaptive, trainable economic-mathematical model was developed that is able to adapt to an environment driven by the income from federal and regional taxes and fees payable to the budget of a constituent entity of the RF and transferred to a lower-level budget in the form of budget regulation. The authors developed the structure of the automaton, described its behavior in a random environment, and derived expressions for the final probabilities of the automaton being in each of its states. The behavior of the automaton was characterized by a mathematically rigorous proof of a theorem on the feasibility of its behavior and the asymptotic optimality of the proposed design.

  10. Optimal feature selection for automated classification of FDG-PET in patients with suspected dementia

    Science.gov (United States)

    Serag, Ahmed; Wenzel, Fabian; Thiele, Frank; Buchert, Ralph; Young, Stewart

    2009-02-01

    FDG-PET is increasingly used for the evaluation of dementia patients, as major neurodegenerative disorders, such as Alzheimer's disease (AD), Lewy body dementia (LBD), and frontotemporal dementia (FTD), have been shown to induce specific patterns of regional hypo-metabolism. However, the interpretation of FDG-PET images of patients with suspected dementia is not straightforward, since patients are imaged at different stages of progression of neurodegenerative disease, and the indications of reduced metabolism due to neurodegenerative disease appear slowly over time. Furthermore, different diseases can cause rather similar patterns of hypo-metabolism. Therefore, classification of FDG-PET images of patients with suspected dementia may lead to misdiagnosis. This work aims to find an optimal subset of features for automated classification, in order to improve the classification accuracy of FDG-PET images in patients with suspected dementia. A novel feature selection method is proposed, and its performance is compared to existing methods. The proposed approach adopts a combination of balanced class distributions and feature selection methods. It is demonstrated to provide high classification accuracy for FDG-PET brain images of normal controls and dementia patients, comparable with alternative approaches, while providing a compact set of selected features.
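    One simple baseline for this kind of feature-subset search is greedy forward selection with a nearest-centroid scorer; the toy dataset, scorer, and stopping rule below are illustrative assumptions, not the paper's proposed method:

```python
import random
import statistics

def centroid_accuracy(X, y, feats):
    # resubstitution accuracy of a nearest-centroid classifier
    # restricted to the chosen feature indices
    classes = sorted(set(y))
    cents = {c: [statistics.mean(X[i][f] for i in range(len(X)) if y[i] == c)
                 for f in feats] for c in classes}
    correct = 0
    for xi, yi in zip(X, y):
        pred = min(classes, key=lambda c: sum((xi[f] - m) ** 2
                                              for f, m in zip(feats, cents[c])))
        correct += pred == yi
    return correct / len(X)

def forward_select(X, y, score, max_feats=3):
    # Greedily add the feature that most improves the score;
    # stop when no candidate improves on the best score so far.
    selected, remaining = [], list(range(len(X[0])))
    best_score = float("-inf")
    while remaining and len(selected) < max_feats:
        cand = max(remaining, key=lambda f: score(X, y, selected + [f]))
        s = score(X, y, selected + [cand])
        if s <= best_score:
            break
        selected.append(cand)
        remaining.remove(cand)
        best_score = s
    return selected

# toy data: feature 0 separates the classes, features 1-4 are noise
rng = random.Random(0)
X = [[(1.0 if i < 10 else -1.0) + rng.gauss(0, 0.1)]
     + [rng.gauss(0, 1) for _ in range(4)] for i in range(20)]
y = [0] * 10 + [1] * 10
picked = forward_select(X, y, centroid_accuracy)
```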

  11. Automated synthesis of both the topology and numerical parameters for seven patented optical lens systems using genetic programming

    Science.gov (United States)

    Jones, Lee W.; Al-Sakran, Sameer H.; Koza, John R.

    2005-08-01

    This paper describes how genetic programming was used as an automated invention machine to synthesize both the topology and numerical parameters for seven previously patented optical lens systems, including one aspherical system and one issued in the 21st century. Two of the evolved optical lens systems infringe the claims of the patents and the others are novel solutions that satisfy the design goals stated in the patent. The automatic synthesis was done "from scratch"--that is, without starting from a pre-existing good design and without pre-specifying the number of lenses, the topological layout of the lenses, or the numerical parameters of the lenses. Genetic programming is a form of evolutionary computation used to automatically solve problems. It starts from a high-level statement of what needs to be done and progressively breeds a population of candidate individuals over many generations using the principle of Darwinian natural selection and genetic recombination. The paper describes how genetic programming created eyepieces that duplicated the functionality of seven previously patented lens systems. The seven designs were created in a substantially similar and routine way, suggesting that the use of genetic programming in the automated design of both the topology and numerical parameters for optical lens systems may have widespread utility.

  12. Design optimization on the front wheel orientation parameters of a vehicle

    Institute of Scientific and Technical Information of China (English)

    CHU Zhigang; DENG Zhaoxiang; HU Yumei; ZHU Ming

    2003-01-01

    A uniform optimization objective function for the front wheel orientation parameters of a vehicle is reported, which includes the tolerances between the practical and set values of the front wheel orientation parameters under full load, and the variation of each parameter with front wheel fluctuation; it is used to build a front suspension model for optimization analysis based on multi-body dynamics (MD) theory. The original suspension is optimized with this model, and the variation law of each parameter with front wheel fluctuation is obtained. The results of a case study demonstrate that the front wheel orientation parameters of the optimized vehicle are reasonable under typical conditions and the variation of each parameter is in an ideal range with the wheel fluctuating within ±40 mm. In addition, driving performance is greatly improved in road tests and practical use.

  13. MUSE: MUlti-atlas region Segmentation utilizing Ensembles of registration algorithms and parameters, and locally optimal atlas selection.

    Science.gov (United States)

    Doshi, Jimit; Erus, Guray; Ou, Yangming; Resnick, Susan M; Gur, Ruben C; Gur, Raquel E; Satterthwaite, Theodore D; Furth, Susan; Davatzikos, Christos

    2016-02-15

    Atlas-based automated anatomical labeling is a fundamental tool in medical image segmentation, as it defines regions of interest for subsequent analysis of structural and functional image data. The extensive investigation of multi-atlas warping and fusion techniques over the past 5 or more years has clearly demonstrated the advantages of consensus-based segmentation. However, the common approach is to use multiple atlases with a single registration method and parameter set, which is not necessarily optimal for every individual scan, anatomical region, and problem/data-type. Different registration criteria and parameter sets yield different solutions, each providing complementary information. Herein, we present a consensus labeling framework that generates a broad ensemble of labeled atlases in target image space via the use of several warping algorithms, regularization parameters, and atlases. The label fusion integrates two complementary sources of information: a local similarity ranking to select locally optimal atlases and a boundary modulation term to refine the segmentation consistently with the target image's intensity profile. The ensemble approach consistently outperforms segmentations using individual warping methods alone, achieving high accuracy on several benchmark datasets. The MUSE methodology has been used for processing thousands of scans from various datasets, producing robust and consistent results. MUSE is publicly available both as a downloadable software package, and as an application that can be run on the CBICA Image Processing Portal (https://ipp.cbica.upenn.edu), a web based platform for remote processing of medical images. PMID:26679328
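    The simplest form of the consensus idea MUSE builds on is a per-voxel weighted majority vote over the ensemble of warped atlas label maps (MUSE itself adds local similarity ranking and boundary modulation); the sketch below operates on flattened label arrays and is illustrative only:

```python
from collections import Counter

def fuse_labels(candidate_maps, weights=None):
    # Weighted per-voxel majority vote over an ensemble of warped
    # atlas label maps (each map is a flat list of integer labels).
    weights = weights or [1.0] * len(candidate_maps)
    fused = []
    for v in range(len(candidate_maps[0])):
        votes = Counter()
        for lab_map, w in zip(candidate_maps, weights):
            votes[lab_map[v]] += w
        fused.append(votes.most_common(1)[0][0])
    return fused

# three "atlases" disagree at voxel 2; the vote resolves it to label 1
maps = [[0, 1, 1, 2], [0, 1, 2, 2], [0, 1, 1, 2]]
fused = fuse_labels(maps)
```

    In MUSE the uniform weights would be replaced by local similarity ranks, so each voxel trusts the atlases that registered well in its neighborhood.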

  14. An Effect and Analysis of Parameter on Ant Colony Optimization for Solving Travelling Salesman Problem

    OpenAIRE

    Km. Shweta; Alka Singh

    2013-01-01

    Ant Colony Optimization (ACO) has proved suitable for solving a wide range of combinatorial (NP-hard) optimization problems such as the Travelling Salesman Problem (TSP). The first step of an ACO algorithm is to set the parameters that drive the algorithm. These parameters have an important impact on the performance of the ant colony algorithm. The basic parameters used in ACO algorithms are: the relative importance (or weight) of pheromone, the relative importance of the heuristic value, initial pheromone v...
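    A tiny Ant System for a symmetric TSP makes the roles of these parameters concrete: alpha weights the pheromone, beta weights the heuristic 1/distance, and rho is the evaporation rate. The instance (six cities on a circle) and all settings below are illustrative assumptions:

```python
import math
import random

def aco_tsp(dist, n_ants=10, iters=50, alpha=1.0, beta=3.0, rho=0.5, seed=3):
    # Basic Ant System: alpha weights pheromone, beta weights the
    # 1/distance heuristic, rho is the pheromone evaporation rate.
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]       # initial pheromone
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:                  # roulette-wheel city choice
                i = tour[-1]
                weights = [(j, (tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta))
                           for j in unvisited]
                r, acc = rng.uniform(0, sum(w for _, w in weights)), 0.0
                for j, w in weights:
                    acc += w
                    if acc >= r:
                        tour.append(j)
                        unvisited.remove(j)
                        break
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        for i in range(n):                    # evaporation
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:            # deposit, better tours deposit more
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += 1.0 / length
                tau[j][i] += 1.0 / length
    return best_tour, best_len

# six cities on a unit circle; the optimal tour is the circle itself
pts = [(math.cos(2 * math.pi * k / 6), math.sin(2 * math.pi * k / 6))
       for k in range(6)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
tour, length = aco_tsp(dist)
```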

  15. Optimal Parameter and Uncertainty Estimation of a Land Surface Model: Sensitivity to Parameter Ranges and Model Complexities

    Institute of Scientific and Technical Information of China (English)

    Youlong XIA; Zong-Liang YANG; Paul L. STOFFA; Mrinal K. SEN

    2005-01-01

    Most previous land-surface model calibration studies have defined global ranges for their parameters to search for optimal parameter sets. Little work has been conducted to study the impacts of realistic versus global ranges, as well as model complexities, on the calibration and uncertainty estimates. The primary purpose of this paper is to investigate these impacts by applying Bayesian Stochastic Inversion (BSI) to the Chameleon Surface Model (CHASM). CHASM was designed to explore the general aspects of land-surface energy balance representation within a common modeling framework that can be run from a simple energy balance formulation to a complex mosaic-type structure. BSI is an uncertainty estimation technique based on Bayes theorem, importance sampling, and very fast simulated annealing. The model forcing data and surface flux data were collected at seven sites representing a wide range of climate and vegetation conditions. For each site, four experiments were performed with simple and complex CHASM formulations as well as realistic and global parameter ranges. Twenty-eight experiments were conducted and 50 000 parameter sets were used for each run. The results show that the use of global and realistic ranges gives similar simulations for both modes at most sites, but the global ranges tend to produce some unreasonable optimal parameter values. Comparison of simple and complex modes shows that the simple mode has more parameters with unreasonable optimal values. The parameter ranges and model complexities have significant impacts on the frequency distributions of parameters, the marginal posterior probability density functions, and the estimates of uncertainty of the simulated sensible and latent heat fluxes. Comparison between model complexity and parameter ranges shows that the former has the more significant impact on parameter and uncertainty estimation.

  16. Air conditioning with methane: Efficiency and economics optimization parameters

    International Nuclear Information System (INIS)

    This paper presents an efficiency and economics evaluation method for methane-fired cooling systems. Focus is on direct-flame, two-stage absorption systems and alternative engine-driven compressor sets. Comparisons are made with conventional vapour compression plants powered by electricity supplied by the national grid. A first and second law based thermodynamics analysis is made in which fuel use coefficients and exergy yields are determined. The economics analysis establishes annual energy savings, unit cooling energy production costs, payback periods and economics/efficiency optimization curves useful for preliminary feasibility studies.

  17. Optimization of process parameters for osmotic dehydration of papaya cubes

    OpenAIRE

    S.K. Jain; R. C. Verma; Murdia, L. K.; Jain, H. K.; Sharma, G. P.

    2010-01-01

    Process temperature (30, 40 and 50 °C), syrup concentration (50, 60 and 70 °Brix) and process time (4, 5 and 6 h) for osmotic dehydration of papaya (Carica papaya) cubes were optimized for maximum water loss and optimum sugar gain by using response surface methodology. The peeled and pre-processed papaya cubes of 1 cm size were immersed in sugar syrup in a constant-temperature water bath with a syrup-to-papaya-cubes ratio of 4:1 (w/w). The cubes were removed from the bath at pre-decided time, r...

  18. Optimization of Process Parameters of Tool Wear in Turning Operation

    Directory of Open Access Journals (Sweden)

    Manik Barman

    2015-04-01

    Full Text Available Tool wear is of great concern in the machining industries since it affects the surface quality, dimensional accuracy and production cost of materials/components. In the present study, twenty-seven experiments were conducted as per a 3-parameter, 3-level full factorial design for the turning of a mild steel specimen with a high speed steel (HSS) cutting tool. An experimental investigation of cutting tool wear and a mathematical model for tool wear estimation are reported in this paper; the model was simulated by computer programming, and it has been found that the model is capable of estimating the wear rate of the cutting tool and provides an optimum set of process parameters for minimum tool wear.
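    The 3-parameter, 3-level full factorial layout (27 runs) can be sketched as follows; the factor names, levels, and the power-law wear model are hypothetical illustrations, not the paper's data or fitted coefficients:

```python
from itertools import product

# Hypothetical factor levels for a turning experiment (assumed values):
# cutting speed (m/min), feed (mm/rev), depth of cut (mm)
speeds = [20, 30, 40]
feeds = [0.1, 0.2, 0.3]
depths = [0.5, 1.0, 1.5]

runs = list(product(speeds, feeds, depths))   # 3^3 = 27 experimental runs

def wear(v, f, d):
    # toy power-law wear model of the kind fitted to such data;
    # the coefficients are made up for illustration
    return 1e-3 * (v ** 1.2) * (f ** 0.8) * (d ** 0.5)

# pick the setting the fitted model predicts to minimize tool wear
best = min(runs, key=lambda r: wear(*r))
```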

  19. On optimal detection and estimation of the FCN parameters

    Science.gov (United States)

    Yatskiv, Y.

    2009-09-01

    A statistical approach for detection and estimation of parameters of short-term quasi-periodic processes was used in order to investigate the Free Core Nutation (FCN) signal in the Celestial Pole Offset (CPO). The results show that this signal is very unstable and that it disappeared in the year 2000. The amplitude of the oscillation with a period of about 435 days is larger for dX than for dY.

  20. GEANT4 for breast dosimetry: parameters optimization study

    Science.gov (United States)

    Fedon, C.; Longo, F.; Mettivier, G.; Longo, R.

    2015-08-01

    Mean glandular dose (MGD) is the main dosimetric quantity in mammography. MGD evaluation is obtained by multiplying the entrance skin air kerma (ESAK) by normalized glandular dose (DgN) coefficients. While ESAK is an empirical quantity, DgN coefficients can only be estimated with Monte Carlo (MC) methods. Thus, a benchmark of MC parameters is needed for effectively evaluating DgN coefficients. GEANT4 is a MC toolkit suitable for medical purposes that offers users several computational choices. In this work we investigate the GEANT4 performance by testing the main PhysicsLists for medical applications. Four electromagnetic PhysicsLists were implemented: the linear attenuation coefficients were calculated for breast glandularities of 0%, 50% and 100% in the energy range 8-50 keV, and DgN coefficients were evaluated. The results were compared with published data. Fit equations are proposed for the estimation of the G-factor parameter, introduced in the literature for converting the dose delivered in the heterogeneous medium to that in the glandular tissue, and the application of this parameter interaction-by-interaction or retrospectively is discussed. G4EmLivermorePhysicsList shows the best agreement of the linear attenuation coefficients with both theoretical values and published data. Moreover, an excellent correlation factor (r² > 0.99) is found for the DgN coefficients with respect to the literature. The final goal of this study is to identify, for the first time, a benchmark of parameters that could be useful for future breast dosimetry studies with GEANT4.
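    The dosimetric relation stated at the start of the abstract, MGD = ESAK × DgN, is a one-line computation once a DgN coefficient is available from Monte Carlo tables; the numerical values below are illustrative only, not tabulated coefficients:

```python
def mean_glandular_dose(esak_mgy, dgn):
    # MGD = ESAK * DgN: entrance skin air kerma (measured, mGy) times
    # the normalized glandular dose coefficient (from MC tables)
    return esak_mgy * dgn

mgd = mean_glandular_dose(5.0, 0.22)   # illustrative values, result in mGy
```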

  1. Parameter selection for the SSC trade-offs and optimization

    International Nuclear Information System (INIS)

    In November of 1988, a site was selected in the state of Texas for the SSC. In January of 1989, the SSC Laboratory was established in Texas to adapt the design of the collider to the site and to manage the construction of the project. This paper describes the evolution of the SSC design since site selection, notes the increased concentration on the injector system, and addresses the rationale for choice of parameters

  2. Multidimensional Optimization of Signal Space Distance Parameters in WLAN Positioning

    OpenAIRE

    Milenko Brković; Mirjana Simić

    2014-01-01

    Accurate indoor localization of mobile users is one of the challenging problems of the last decade. Besides delivering high speed Internet, Wireless Local Area Network (WLAN) can be used as an effective indoor positioning system, being competitive both in terms of accuracy and cost. Among the localization algorithms, nearest neighbor fingerprinting algorithms based on Received Signal Strength (RSS) parameter have been extensively studied as an inexpensive solution for delivering indoor Locati...

  3. Relationships among various parameters for decision tree optimization

    KAUST Repository

    Hussain, Shahid

    2014-01-14

    In this chapter, we study in detail the relationships between various pairs of cost functions, and between uncertainty measures and cost functions, for decision tree optimization. We provide new tools (algorithms) to compute relationship functions, and provide experimental results on decision tables acquired from the UCI ML Repository. The algorithms presented in this chapter have already been implemented and are now a part of Dagger, a software system for the construction and optimization of decision trees and decision rules. The main results presented in this chapter deal with two types of algorithms for computing relationships. First, we discuss the case where we construct approximate decision trees and are interested in relationships between a certain cost function, such as the depth or number of nodes of a decision tree, and an uncertainty measure, such as the misclassification error (accuracy) of the decision tree. Secondly, relationships between two different cost functions are discussed, for example, the number of misclassifications of a decision tree versus the number of nodes in the tree. The results of experiments presented in the chapter provide further insight. © 2014 Springer International Publishing Switzerland.

  4. Optimization of Nano-Process Deposition Parameters Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Norlina Mohd Sabri

    2016-06-01

    Full Text Available This research focuses on the radio frequency (RF) magnetron sputtering process, a physical vapor deposition technique which is widely used in thin film production. This process requires an optimized combination of deposition parameters in order to obtain the desired thin film. The conventional method for optimizing the deposition parameters has been reported to be costly and time consuming due to its trial and error nature. Thus, the gravitational search algorithm (GSA) technique is proposed to solve this nano-process parameter optimization problem. In this research, the optimized parameter combination was expected to produce the desired electrical and optical properties of the thin film. The performance of GSA was compared with that of Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Immune System (AIS) and Ant Colony Optimization (ACO). Based on the overall results, the GSA-optimized parameter combination generated the best electrical and acceptable optical properties of the thin film compared to the others. This computational experiment is expected to overcome the problem of having to conduct repetitive laboratory experiments to obtain the most optimized parameter combination. Based on this initial experiment, the adaptation of GSA to this problem could offer a more efficient and productive way of depositing quality thin films in the fabrication process.
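    A minimal sketch of the gravitational search idea, assuming a generic benchmark objective rather than the sputtering-process model: agents attract one another with forces scaled by fitness-derived masses and a decaying gravitational constant:

```python
import random

def gsa(loss, bounds, n_agents=20, iters=100, g0=100.0, seed=4):
    # Minimal GSA: better agents get larger masses, all agents are
    # pulled toward heavier ones, and the gravitational constant
    # decays so the swarm settles (illustrative sketch only).
    rng = random.Random(seed)
    dim = len(bounds)
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_agents)]
    v = [[0.0] * dim for _ in range(n_agents)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [loss(a) for a in x]
        for i, f in enumerate(fit):
            if f < best_f:
                best_x, best_f = x[i][:], f
        worst, best = max(fit), min(fit)
        span = (worst - best) or 1e-12
        raw = [(worst - f) / span for f in fit]   # better fitness -> larger mass
        total = sum(raw) or 1e-12
        mass = [m / total for m in raw]
        g = g0 * (1.0 - t / iters)                # decaying gravitational constant
        for i in range(n_agents):
            acc = [0.0] * dim
            for j in range(n_agents):
                if i == j:
                    continue
                r = sum((x[j][d] - x[i][d]) ** 2 for d in range(dim)) ** 0.5 + 1e-12
                for d in range(dim):
                    acc[d] += rng.random() * g * mass[j] * (x[j][d] - x[i][d]) / r
            for d in range(dim):
                v[i][d] = rng.random() * v[i][d] + acc[d]
                lo, hi = bounds[d]                # clamp to the search box
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
    return best_x, best_f

bx, bf = gsa(lambda p: sum(c * c for c in p), [(-5.0, 5.0)] * 2)
```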

  5. Optimization of automated radiosynthesis of [18F]AV-45: a new PET imaging agent for Alzheimer's disease

    International Nuclear Information System (INIS)

    Introduction: Accumulation of β-amyloid (Aβ) aggregates in the brain is linked to the pathogenesis of Alzheimer's disease (AD). Imaging probes targeting these Aβ aggregates in the brain may provide a useful tool to facilitate the diagnosis of AD. Recently, [18F]AV-45 ([18F]5) demonstrated high binding to Aβ aggregates in AD patients. To improve the availability of this agent for widespread clinical application, a rapid, fully automated, high-yield, cGMP-compliant radiosynthesis was necessary for production of this probe. We report herein optimal [18F]fluorination and deprotection conditions and the fully automated radiosynthesis of [18F]AV-45 ([18F]5) on a radiosynthesis module (BNU F-A2). Methods: The preparation of [18F]AV-45 ([18F]5) was evaluated under different conditions, specifically by employing different precursors (-OTs and -Br as the leaving group), different reagents (K222/K2CO3 vs. tributylammonium bicarbonate) and deprotection in different acids. With the conditions optimized in these experiments, the automated synthesis of [18F]AV-45 ([18F]5) was accomplished using a computer-programmed standard operating procedure, and the product was purified on an on-line solid-phase cartridge (Oasis HLB). Results: The optimized reaction conditions were successfully implemented on an automated nucleophilic fluorination module. The radiochemical purity of [18F]AV-45 ([18F]5) was >95%, and the automated synthesis yield was 33.6±5.2% (not decay-corrected, n=4) or 50.1±7.9% (decay-corrected) in 50 min at quantities of 10-100 mCi (370-3700 MBq). Autoradiography studies of [18F]AV-45 ([18F]5) using postmortem AD brain and Tg mouse brain sections in the presence of different concentrations of 'cold' AV-136 showed relatively low inhibition of the in vitro binding of [18F]AV-45 ([18F]5) to Aβ plaques (IC50=1-4 μM, a concentration several orders of magnitude higher than the expected pseudo-carrier concentration in the brain). Conclusions: Solid-phase extraction purification and

  6. Estimability analysis for optimization of hysteretic soil hydraulic parameters using data of a field irrigation experiment

    Science.gov (United States)

    Ngo, Viet V.; Gerke, Horst H.; Badorreck, Annika

    2014-05-01

    Estimability analysis has been proposed to improve the quality of parameter optimization. For field data, wetting and drying processes may complicate optimization of soil hydraulic parameters. The objectives of this study were to apply estimability analysis for improving optimization of soil hydraulic parameters and to compare models with and without hysteresis. Soil water pressure head data from a field irrigation experiment were used. The one-dimensional vertical water movement in variably-saturated soil was described with the Richards equation using the HYDRUS-1D code. Estimability of the unimodal van Genuchten-Mualem hydraulic model parameters, as well as of the hysteretic parameter model of Parker and Lenhard, was classified according to a sensitivity coefficient matrix. The matrix was obtained by sequentially calculating the effects of initial parameter variations on changes in the simulated pressure head values. Optimization was carried out by means of the Levenberg-Marquardt method as implemented in the HYDRUS-1D code. The parameters α, Ks, θs, and n in the nonhysteretic model were found to be sensitive, and θs was strongly correlated with n. When assuming hysteresis, the estimability was highest for αw; it decreased with soil depth for Ks and αd, and increased for θs and n. The hysteretic model could approximate the pressure heads in the soil when parameters from wetting and drying periods were considered separately as initial estimates. The inverse optimization could be carried out more efficiently with the most estimable parameters. Despite the weaknesses of the local optimization algorithm and the inflexibility of the unimodal van Genuchten model, the results suggest that estimability analysis can serve as guidance for better defining optimization scenarios and thereby improve the determination of soil hydraulic parameters.
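    The sensitivity coefficient matrix at the heart of such an estimability classification can be illustrated with a short numeric sketch. Assumptions: a toy exponential model stands in for the HYDRUS-1D simulation, sensitivities are approximated by forward finite differences and scaled by the parameter value, and parameters are simply ranked by the norm of their sensitivity columns (the full sequential estimability procedure is omitted):

```python
import numpy as np

def sensitivity_matrix(model, theta, rel_step=0.01):
    """Scaled sensitivity coefficients (dy/dtheta_j) * theta_j,
    approximated by forward finite differences on each parameter."""
    y0 = model(theta)
    S = np.empty((y0.size, theta.size))
    for j, tj in enumerate(theta):
        t = theta.copy()
        t[j] = tj * (1.0 + rel_step)
        S[:, j] = (model(t) - y0) / (tj * rel_step) * tj
    return S

# Toy stand-in for the hydraulic simulator: y(t) = a*exp(-b*t) + c
times = np.linspace(0.0, 5.0, 50)
def model(theta):
    a, b, c = theta
    return a * np.exp(-b * times) + c

S = sensitivity_matrix(model, np.array([2.0, 0.5, 1.0]))
# Rank parameters by the magnitude of their sensitivity columns
ranking = np.argsort(np.linalg.norm(S, axis=0))[::-1]
```

Parameters whose columns are nearly zero (or nearly collinear with another column) are the ones a local optimizer such as Levenberg-Marquardt will struggle to estimate.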

  7. The Study of the Optimal Parameter Settings in a Hospital Supply Chain System in Taiwan

    Directory of Open Access Journals (Sweden)

    Hung-Chang Liao

    2014-01-01

    Full Text Available This study proposed the optimal parameter settings for a hospital supply chain system (HSCS) when either the total system cost (TSC) or the patient safety level (PSL), or both simultaneously, was considered as the measure of the HSCS’s performance. Four parameters were considered in the HSCS: safety stock, maximum inventory level, transportation capacity, and the reliability of the HSCS. A full-factorial experimental design was used to simulate an HSCS for the purpose of collecting data. The response surface method (RSM) was used to construct the regression model, and a genetic algorithm (GA) was applied to obtain the optimal parameter settings for the HSCS. The results show that the best way to obtain the optimal parameter settings for the HSCS is to consider both the TSC and the PSL simultaneously in measuring performance. Also, the results of sensitivity analysis based on the optimal parameter settings were used to derive adjustable strategies for the decision-makers.

  8. Teaching-learning-based Optimization Algorithm for Parameter Identification in the Design of IIR Filters

    Science.gov (United States)

    Singh, R.; Verma, H. K.

    2013-12-01

    This paper presents a teaching-learning-based optimization (TLBO) algorithm to solve parameter identification problems in the design of digital infinite impulse response (IIR) filters. TLBO-based filter modelling is applied to calculate the parameters of an unknown plant in simulations. Unlike other heuristic search algorithms, the TLBO algorithm is free of algorithm-specific parameters. In this paper, big bang-big crunch (BB-BC) optimization and PSO algorithms are also applied to the filter design for comparison. The unknown filter parameters are treated as a vector to be optimized by these algorithms. MATLAB programming is used for implementation of the proposed algorithms. Experimental results show that TLBO estimates the filter parameters more accurately than the BB-BC optimization algorithm and converges faster than the PSO algorithm. TLBO is therefore preferable where accuracy is more essential than convergence speed.
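    The "parameter-less" character of TLBO that the abstract highlights can be seen in a minimal sketch: apart from population size and iteration count, the teacher and learner phases need no tuning constants. This is a generic TLBO (not the authors' implementation), and a sphere function stands in for the IIR modelling error:

```python
import numpy as np

def tlbo(f, bounds, pop=20, iters=100, seed=0):
    """Minimal teaching-learning-based optimization (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    X = rng.uniform(lo, hi, (pop, lo.size))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        # Teacher phase: move learners toward the current best solution
        teacher = X[fit.argmin()]
        TF = rng.integers(1, 3)           # teaching factor, 1 or 2
        mean = X.mean(axis=0)
        for i in range(pop):
            cand = np.clip(X[i] + rng.random(lo.size) * (teacher - TF * mean), lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                X[i], fit[i] = cand, fc
        # Learner phase: learn from a randomly chosen partner
        for i in range(pop):
            j = rng.integers(pop - 1)
            j += j >= i                    # ensure partner j != i
            d = X[i] - X[j] if fit[i] < fit[j] else X[j] - X[i]
            cand = np.clip(X[i] + rng.random(lo.size) * d, lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                X[i], fit[i] = cand, fc
    return X[fit.argmin()], fit.min()

best, fbest = tlbo(lambda x: np.sum(x**2), ([-5, -5], [5, 5]))
```

For a real IIR identification run, `f` would instead return the mean squared error between the unknown plant's output and the candidate filter's output on a common input signal.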

  9. Parameter Identification of Anaerobic Wastewater Treatment Bioprocesses Using Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Dorin Sendrescu

    2013-01-01

    Full Text Available This paper deals with offline parameter identification for a class of wastewater treatment bioprocesses using particle swarm optimization (PSO) techniques. Particle swarm optimization is a relatively new heuristic method that has produced promising results for solving complex optimization problems. Several variants of the PSO algorithm are used for parameter estimation of an anaerobic wastewater treatment process, a complex biotechnological system. The identification scheme is based on a high-dimensional multimodal numerical optimization problem. The performance of the method is analyzed by numerical simulations.
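    The identification scheme described above amounts to wrapping a simulation-versus-data error inside a PSO loop. A minimal sketch, with a plain global-best PSO (a generic variant; the paper's variants may differ) and a toy first-order process standing in for the anaerobic bioprocess model:

```python
import numpy as np

def pso(obj, bounds, n=30, iters=150, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Plain global-best PSO for minimization."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, bounds)
    x = rng.uniform(lo, hi, (n, lo.size))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.apply_along_axis(obj, 1, x)
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(obj, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# Recover the parameters (k1, k2) of a toy first-order response
# y(t) = k1*(1 - exp(-k2*t)) from noise-free "measurements"
t = np.linspace(0, 10, 40)
true = np.array([2.0, 0.8])
data = true[0] * (1 - np.exp(-true[1] * t))
obj = lambda p: np.sum((p[0] * (1 - np.exp(-p[1] * t)) - data) ** 2)
est, err = pso(obj, ([0.1, 0.1], [5.0, 5.0]))
```

In the bioprocess setting, `obj` would integrate the process ODEs for the candidate parameter vector and return the squared deviation from the measured outputs.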

  10. Optimal measurement locations for parameter estimation of non linear distributed parameter systems

    Directory of Open Access Journals (Sweden)

    J. E. Alaña

    2010-12-01

    Full Text Available A sensor placement approach for the purpose of accurately estimating unknown parameters of a distributed parameter system is discussed. The idea is to convert the sensor location problem into a classical experimental design problem. The technique consists of analysing the extreme values of the sensitivity coefficients derived from the system and their corresponding spatial positions. This information is used to formulate an efficient computational optimum experiment design on discrete domains. The scheme is verified by a numerical example concerning the chemical reaction in a tubular reactor for two possible scenarios: stable and unstable operating conditions. The resulting approach is easy to implement, and good estimates of the system parameters are obtained. This study shows that measurement location plays an essential role in the parameter estimation procedure.

  11. Optimization of Experimental Model Parameter Identification for Energy Storage Systems

    Directory of Open Access Journals (Sweden)

    Rosario Morello

    2013-09-01

    Full Text Available The smart grid approach is envisioned to take advantage of all available modern technologies in transforming the current power system to provide benefits to all stakeholders in the fields of efficient energy utilisation and of wide integration of renewable sources. Energy storage systems could help to solve some issues that stem from renewable energy usage in terms of stabilizing the intermittent energy production, power quality and power peak mitigation. With the integration of energy storage systems into the smart grids, their accurate modeling becomes a necessity, in order to gain robust real-time control on the network, in terms of stability and energy supply forecasting. In this framework, this paper proposes a procedure to identify the values of the battery model parameters in order to best fit experimental data and integrate it, along with models of energy sources and electrical loads, in a complete framework which represents a real time smart grid management system. The proposed method is based on a hybrid optimisation technique, which makes combined use of a stochastic and a deterministic algorithm, with low computational burden and can therefore be repeated over time in order to account for parameter variations due to the battery’s age and usage.

  12. Optimal choice of trapezoidal shaping parameters in digital nuclear spectrometer system

    International Nuclear Information System (INIS)

    Trapezoidal shaping is widely applied to pulse amplitude extraction in digital nuclear spectrometer systems; optimal selection of the shaping parameters can improve the energy resolution and pulse counting rate. In this paper, the optimal selection of the trapezoidal shaping parameters is studied from the viewpoints of noise, ballistic deficit compensation, and pulse pile-up. According to the theoretical analysis and experimental verification, the optimal trapezoid is close to a triangle: the rise time is longer and the flat-top width is shorter. (authors)
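    The trapezoidal weighting discussed above can be sketched as a textbook moving-sum recursion (exponential-decay/pole-zero correction is omitted, and the rise time `k` and top width `l-k` below are illustrative values, not the paper's optimum):

```python
import numpy as np

def trapezoidal_shaper(v, k, l):
    """Recursive trapezoidal shaper for step-like pulses:
    rise time k samples, flat-top width l-k samples."""
    n = v.size
    d = np.zeros(n)
    for i in range(n):
        d[i] = (v[i]
                - (v[i - k] if i >= k else 0.0)
                - (v[i - l] if i >= l else 0.0)
                + (v[i - k - l] if i >= k + l else 0.0))
    # Running sum of the difference signal yields the trapezoid;
    # dividing by k makes the flat top equal the input step height.
    return np.cumsum(d) / k

# A unit step at sample 10 stands in for an ideal preamplifier pulse
v = np.zeros(200)
v[10:] = 1.0
out = trapezoidal_shaper(v, k=20, l=50)
```

A longer `k` averages more samples and suppresses series noise, while a shorter flat top reduces pile-up, which is consistent with the abstract's conclusion that the optimum approaches a triangle.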

  13. Parameter estimation for chaotic systems with a Drift Particle Swarm Optimization method

    International Nuclear Information System (INIS)

    Inspired by the motion of electrons in metal conductors in an electric field, we propose a variant of Particle Swarm Optimization (PSO), called Drift Particle Swarm Optimization (DPSO) algorithm, and apply it in estimating the unknown parameters of chaotic dynamic systems. The principle and procedure of DPSO are presented, and the algorithm is used to identify Lorenz system and Chen system. The experiment results show that for the given parameter configurations, DPSO can identify the parameters of the systems accurately and effectively, and it may be a promising tool for chaotic system identification as well as other numerical optimization problems in physics.

  14. Feature Selection and Parameter Optimization of Support Vector Machines Based on Modified Cat Swarm Optimization

    OpenAIRE

    Kuan-Cheng Lin; Yi-Hung Huang; Jason C. Hung; Yung-Tso Lin

    2015-01-01

    Recently, applications of Internet of Things create enormous volumes of data, which are available for classification and prediction. Classification of big data needs an effective and efficient metaheuristic search algorithm to find the optimal feature subset. Cat swarm optimization (CSO) is a novel metaheuristic for evolutionary optimization algorithms based on swarm intelligence. CSO imitates the behavior of cats through two submodes: seeking and tracing. Previous studies have indicated that...

  15. Steam condenser optimization using Real-parameter Genetic Algorithm for Prototype Fast Breeder Reactor

    International Nuclear Information System (INIS)

    Highlights: → We model design optimization of a vital reactor component using Genetic Algorithm. → Real-parameter Genetic Algorithm is used for steam condenser optimization study. → Comparison analysis done with various Genetic Algorithm related mechanisms. → The results obtained are validated with the reference study results. - Abstract: This work explores the use of a Real-parameter Genetic Algorithm and analyses its performance in the steam condenser (or Circulating Water System) optimization study of a 500 MW fast breeder nuclear reactor. The choice of optimum condenser design parameters for a power plant from among a large number of technically viable combinations is a complex task. This is primarily due to the conflicting nature of the economic implications of the different system parameters for maximizing the capitalized profit. In order to find the optimum design parameters, a Real-parameter Genetic Algorithm model is developed and applied. The results obtained are validated with the reference study results.
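    A real-parameter GA of the kind described encodes each design directly as a vector of continuous variables rather than a bit string. A toy sketch (not the paper's operators or rates: tournament selection, BLX-0.5 blend crossover, and Gaussian mutation are common generic choices, and the test function stands in for the condenser cost model):

```python
import numpy as np

def real_ga(f, bounds, pop=40, gens=80, cx=0.9, mut=0.1, seed=2):
    """Toy real-parameter GA (minimization) with elitism."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, bounds)
    X = rng.uniform(lo, hi, (pop, lo.size))
    for _ in range(gens):
        fit = np.apply_along_axis(f, 1, X)
        # binary tournament selection
        a, b = rng.integers(pop, size=(2, pop))
        parents = X[np.where(fit[a] < fit[b], a, b)]
        children = parents.copy()
        for i in range(0, pop - 1, 2):
            if rng.random() < cx:      # BLX-0.5 blend crossover
                p, q = parents[i], parents[i + 1]
                d = np.abs(p - q)
                lo_c = np.minimum(p, q) - 0.5 * d
                hi_c = np.maximum(p, q) + 0.5 * d
                children[i] = rng.uniform(lo_c, hi_c)
                children[i + 1] = rng.uniform(lo_c, hi_c)
        mask = rng.random(children.shape) < mut
        children[mask] += rng.normal(0.0, 0.1, mask.sum())  # Gaussian mutation
        children = np.clip(children, lo, hi)
        children[0] = X[fit.argmin()]  # elitism: keep the incumbent best
        X = children
    fit = np.apply_along_axis(f, 1, X)
    return X[fit.argmin()], fit.min()

best, fbest = real_ga(lambda x: np.sum((x - 1.0) ** 2), ([-5, -5], [5, 5]))
```

For the condenser study, `f` would return the (negative) capitalized profit as a function of the continuous design parameters, with bound constraints enforcing technical viability.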

  16. Measuring Digital PCR Quality: Performance Parameters and Their Optimization.

    Science.gov (United States)

    Lievens, A; Jacchia, S; Kagkli, D; Savini, C; Querci, M

    2016-01-01

    Digital PCR is rapidly being adopted in the field of DNA-based food analysis. The direct, absolute quantification it offers makes it an attractive technology for routine analysis of food and feed samples for their composition, possible GMO content, and compliance with labelling requirements. However, assessing the performance of dPCR assays is not yet well established. This article introduces three straightforward parameters based on statistical principles that allow users to evaluate if their assays are robust. In addition, we present post-run evaluation criteria to check if quantification was accurate. Finally, we evaluate the usefulness of Poisson confidence intervals and present an alternative strategy to better capture the variability in the analytical chain. PMID:27149415
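    The absolute quantification that makes dPCR attractive rests on a small Poisson calculation: from the fraction of positive partitions, the mean number of copies per partition is λ = -ln(1 - k/n). A sketch under stated assumptions (the 0.85 nl partition volume and the delta-method normal confidence interval are illustrative choices, not the article's criteria):

```python
import math

def dpcr_concentration(positives, partitions, volume_nl=0.85):
    """Absolute dPCR quantification via Poisson statistics.
    Returns (copies_per_ul, ci95_lo, ci95_hi)."""
    p = positives / partitions
    lam = -math.log(1.0 - p)                 # mean copies per partition
    # delta-method standard error of lambda, with a normal 95% CI
    se = math.sqrt(p / (partitions * (1.0 - p)))
    to_conc = 1.0 / (volume_nl * 1e-3)       # per-partition -> copies/ul
    return (lam * to_conc,
            (lam - 1.96 * se) * to_conc,
            (lam + 1.96 * se) * to_conc)

conc, lo, hi = dpcr_concentration(positives=5000, partitions=20000)
```

Such a Poisson interval reflects only partition-counting noise; the article's point is that the full analytical chain (subsampling, pipetting, partition-volume variation) adds variability beyond it.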

  17. Enhancing parameter precision of optimal quantum estimation by quantum screening

    Science.gov (United States)

    Jiang, Huang; You-Neng, Guo; Qin, Xie

    2016-02-01

    We propose a scheme of quantum screening to enhance the parameter-estimation precision in open quantum systems by means of the dynamics of quantum Fisher information. The principle of quantum screening is to use an auxiliary system to inhibit the decoherence processes and drive the excited state toward the ground state. Compared with the case without quantum screening, the results show that the quantum Fisher information with quantum screening maintains a larger value during the evolution. Project supported by the National Natural Science Foundation of China (Grant No. 11374096), the Natural Science Foundation of Guangdong Province, China (Grant No. 2015A030310354), and the Project of Enhancing School with Innovation of Guangdong Ocean University (Grant Nos. GDOU2014050251 and GDOU2014050252).

  18. 1000 MW ultra-supercritical turbine steam parameter optimization

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The 2 × 1000 MW ultra-supercritical steam turbine of the Shanghai Waigaoqiao Phase Ⅲ project, which uses grid frequency regulation and overload control through an overload valve, is manufactured by Shanghai Turbine Company using Siemens technology. Through optimization, the steam pressure is taken as the criterion for switching between constant-pressure and sliding-pressure operation. At high circulating water temperature, the turbine overload valve is kept closed when the unit load is lower than 1000 MW, while at other circulating water temperatures the turbine can run in sliding-pressure operation when the unit load is higher than 1000 MW and the pressure is lower than 27 MPa. This increases the unit operation efficiency. The 3D bending technology in the critical piping helps to reduce the project investment and minimize the reheat system pressure drop, which improves the unit operation efficiency and safety. By choosing a lower circulating water design temperature and by setting an individual condenser for the Boiler Feedwater Turbine to reduce the exhaust steam flow and the heat load to the main condenser, the unit average back pressure and the terminal temperature difference are minimized. Therefore, the unit heat efficiency is increased.

  19. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of the CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject, and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy in comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  20. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.

  1. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    International Nuclear Information System (INIS)

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of the CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject, and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy in comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  2. Rethinking design parameters in the search for optimal dynamic seating.

    Science.gov (United States)

    Pynt, Jennifer

    2015-04-01

    Dynamic seating design purports to lessen damage incurred during sedentary occupations by increasing sitter movement while modifying muscle activity. Dynamic sitting is currently defined by O'Sullivan et al. (2013a) as relating to 'the increased motion in sitting which is facilitated by the use of specific chairs or equipment' (p. 628). Yet the evidence that dynamic seating creates variation in the sitter's lumbar posture or muscle activity is conflicting, with the overall consensus being that current dynamic seating design fails to fulfill its goals. Research is needed to determine if a new generation of chairs requiring active sitter involvement fulfills the goals of dynamic seating and aids cardio/metabolic health. This paper summarises the pursuit of knowledge regarding optimal seated spinal posture and seating design. Four new forms of dynamic seating encouraging active sitting are discussed: 1) the Core-flex, with a split seat pan to facilitate a walking action while seated; 2) the Duo balans, requiring body action to create rocking; 3) the Back App; and 4) the Locus pedestal stool, the latter two using the sitter's legs to drive movement. Unsubstantiated claims made by the designers of these new forms of dynamic seating are outlined. Avenues of research are suggested to validate designer claims and investigate whether these designs fulfill the goals of dynamic seating and assist cardio/metabolic health. Should these claims prove efficacious, a new definition of dynamic sitting is suggested: 'Sitting in which the action is provided by the sitter, while the dynamic mechanism of the chair accommodates that action'. PMID:25892386

  3. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    Science.gov (United States)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
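    The finite-field (FF) treatment that both ELG-FF records rely on is, at its core, numerical differentiation of the energy with respect to an applied field, which is also why the abstract's caveat about the numerical limitation of the FF method for γ matters. A toy check (the model energy expansion, its coefficients, and the field step are illustrative, not the paper's values):

```python
# Model energy in a static field F, truncated Taylor expansion:
# E(F) = E0 - mu*F - (alpha/2)*F^2 - (beta/6)*F^3 - (gamma/24)*F^4
E0, mu, alpha, beta, gamma = 0.0, 1.0, 2.0, 3.0, 4.0

def E(F):
    return E0 - mu*F - alpha*F**2/2 - beta*F**3/6 - gamma*F**4/24

# Field step: too small amplifies round-off, too large adds truncation
# error -- the trade-off behind the FF method's numerical limits.
h = 0.01

# Central third difference recovers beta = -d^3E/dF^3 at F = 0
beta_ff = -(E(2*h) - 2*E(h) + 2*E(-h) - E(-2*h)) / (2*h**3)
```

Recovering γ requires a fourth difference of even smaller energy differences, which is why its FF values are far more sensitive to numerical noise than β, as the abstract notes.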

  4. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA.

    Science.gov (United States)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account. PMID:27421397

  5. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    Science.gov (United States)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used

  6. Optimal Control of Distributed Parameter Systems with Application to Transient Thermoelectric Cooling

    Directory of Open Access Journals (Sweden)

    KOTSUR, M.

    2015-05-01

    Full Text Available We give a solution of the optimal control problem for distributed parameter systems described by nonlinear partial differential equations with nonstandard boundary conditions. The variational method is used to obtain the general form of the necessary conditions of optimality. A suitable algorithm based on the numerical method of successive approximations has been constructed for computing the optimal control functions. The results are applied to the optimization of a transient thermoelectric cooling process. Optimal dependences of current on time have been calculated for the thermoelectric cooler power supply, with the purpose of minimizing the cooling temperature within a preset time interval.

  7. User's manual for an aerodynamic optimization scheme that updates flow variables and design parameters simultaneously

    Science.gov (United States)

    Rizk, Magdi H.

    1988-01-01

    This user's manual is presented for an aerodynamic optimization program that updates flow variables and design parameters simultaneously. The program was developed for solving constrained optimization problems in which the objective function and the constraint function are dependent on the solution of the nonlinear flow equations. The program was tested by applying it to the problem of optimizing propeller designs. Some reference to this particular application is therefore made in the manual. However, the optimization scheme is suitable for application to general aerodynamic design problems. A description of the approach used in the optimization scheme is first presented, followed by a description of the use of the program.

  8. CH4 parameter estimation in CLM4.5bgc using surrogate global optimization

    Science.gov (United States)

    Müller, J.; Paudel, R.; Shoemaker, C. A.; Woodbury, J.; Wang, Y.; Mahowald, N.

    2015-10-01

    Over the Anthropocene, methane has increased dramatically. Wetlands are one of the major sources of methane to the atmosphere, but the role of changes in wetland emissions is not well understood. The Community Land Model (CLM) of the Community Earth System Models contains a module to estimate methane emissions from natural wetlands and rice paddies. Our comparison of CH4 emission observations at 16 sites around the planet reveals, however, that there are large discrepancies between the CLM predictions and the observations. The goal of our study is to adjust the model parameters in order to minimize the root mean squared error (RMSE) between model predictions and observations. These parameters have been selected based on a sensitivity analysis. Because of the cost associated with running the CLM simulation (15 to 30 min on the Yellowstone Supercomputing Facility), only relatively few simulations can be allowed in order to find a near-optimal solution within an acceptable time. Our results indicate that the parameter estimation problem has multiple local minima. Hence, we use a computationally efficient global optimization algorithm that uses a radial basis function (RBF) surrogate model to approximate the objective function. We use the information from the RBF to select parameter values that are most promising with respect to improving the objective function value. We show with pseudo data that our optimization algorithm is able to make excellent progress with respect to decreasing the RMSE. Using the true CH4 emission observations for optimizing the parameters, we are able to significantly reduce the overall RMSE between observations and model predictions by about 50%. The methane emission predictions of the CLM using the optimized parameters agree better with the observed methane emission data in northern and tropical latitudes. With the optimized parameters, the methane emission predictions are higher in northern latitudes than when the default parameters are
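The surrogate-based loop described above (fit an RBF to the points evaluated so far, then use it to choose the next expensive simulation) can be sketched in a few lines. This is a minimal illustration, not the authors' algorithm: a toy quadratic stands in for the expensive CLM RMSE objective, and the bounds, sample counts, and cubic kernel are illustrative assumptions.

```python
import numpy as np

def rbf_surrogate_minimize(f, lb, ub, n_init=6, n_iter=20, seed=0):
    """Minimize an expensive black-box f on a box using a cubic RBF surrogate."""
    rng = np.random.default_rng(seed)
    d = len(lb)
    X = rng.uniform(lb, ub, size=(n_init, d))       # initial space-filling design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        # Fit cubic RBF interpolant s(x) = sum_i w_i * ||x - x_i||^3
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        w = np.linalg.solve(D**3 + 1e-10 * np.eye(len(X)), y)
        # Score many cheap candidates on the surrogate; run f only on the best one
        C = rng.uniform(lb, ub, size=(500, d))
        s = (np.linalg.norm(C[:, None, :] - X[None, :, :], axis=-1)**3) @ w
        x_new = C[np.argmin(s)]
        X = np.vstack([X, x_new])
        y = np.append(y, f(x_new))
    i = np.argmin(y)
    return X[i], y[i]

# Toy stand-in for the CLM RMSE objective (hypothetical)
best_x, best_y = rbf_surrogate_minimize(lambda x: np.sum((x - 0.3)**2),
                                        lb=np.array([0.0, 0.0]),
                                        ub=np.array([1.0, 1.0]))
```

The point of the surrogate is that the 500 candidate evaluations per iteration are nearly free, while the true simulator is run only 26 times in total.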

  9. Parameter optimization of controllable local degree of freedom for reducing vibration of flexible manipulator

    Institute of Scientific and Technical Information of China (English)

    Bian Yushu; Gao Zhihui

    2013-01-01

    Parameter optimization of the controllable local degree of freedom is studied for reducing vibration of the flexible manipulator at the lowest possible cost. The controllable local degrees of freedom are suggested and introduced to the topological structure of the flexible manipulator, and used as an effective way to alleviate vibration through dynamic coupling. Parameters introduced by the controllable local degrees of freedom are analyzed and their influences on vibration reduction are investigated. A strategy to optimize these parameters is put forward and the corresponding optimization method is suggested based on Particle Swarm Optimization (PSO). Simulations are conducted and results of case studies confirm that the proposed optimization method is effective in reducing vibration of the flexible manipulator at the lowest possible cost.
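The PSO machinery behind this record follows the standard particle swarm update. A minimal sketch, with the sphere function standing in for the vibration objective; the inertia and acceleration coefficients are conventional textbook values, not the paper's settings:

```python
import numpy as np

def pso_minimize(f, lb, ub, n_particles=20, n_iter=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Basic particle swarm optimization over a box-constrained parameter space."""
    rng = np.random.default_rng(seed)
    d = len(lb)
    x = rng.uniform(lb, ub, (n_particles, d))        # positions
    v = np.zeros((n_particles, d))                   # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()             # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        # velocity update: inertia + pull toward personal best + pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Sphere function as a stand-in for the vibration-reduction objective
g, fg = pso_minimize(lambda p: np.sum(p**2), np.full(3, -5.0), np.full(3, 5.0))
```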

  10. Optimization of experimental designs and model parameters exemplified by sedimentation in salt marshes

    Directory of Open Access Journals (Sweden)

    J. Reimer

    2014-09-01

    Full Text Available The weighted least squares estimator for model parameters was presented together with its asymptotic properties. A popular approach to optimizing experimental designs, called locally optimal experimental designs, was described together with a lesser-known approach that takes into account a potential nonlinearity of the model parameters. These two approaches were combined with two different methods to solve their underlying discrete optimization problem. All presented methods were implemented in an open-source MATLAB toolbox called the Optimal Experimental Design Toolbox, whose structure and handling were described. In numerical experiments, the model parameters and experimental design were optimized using this toolbox. Two models for sediment concentration in seawater of different complexity served as application examples. The advantages and disadvantages of the different approaches were compared, and an evaluation of the approaches was performed.

  11. Optimization of experimental designs and model parameters exemplified by sedimentation in salt marshes

    Science.gov (United States)

    Reimer, J.; Schürch, M.; Slawig, T.

    2014-09-01

    The weighted least squares estimator for model parameters was presented together with its asymptotic properties. A popular approach to optimizing experimental designs, called locally optimal experimental designs, was described together with a lesser-known approach that takes into account a potential nonlinearity of the model parameters. These two approaches were combined with two different methods to solve their underlying discrete optimization problem. All presented methods were implemented in an open-source MATLAB toolbox called the Optimal Experimental Design Toolbox, whose structure and handling were described. In numerical experiments, the model parameters and experimental design were optimized using this toolbox. Two models for sediment concentration in seawater of different complexity served as application examples. The advantages and disadvantages of the different approaches were compared, and an evaluation of the approaches was performed.
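The weighted least squares estimator referenced in both records has the closed form (XᵀWX)⁻¹XᵀWy. A minimal sketch on synthetic heteroscedastic data; the model, noise levels, and weights below are illustrative assumptions, not the sedimentation models of the paper:

```python
import numpy as np

def weighted_least_squares(X, y, w):
    """Closed-form WLS estimate: minimizes sum_i w_i * (y_i - X_i @ beta)^2."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])   # intercept + one regressor
beta_true = np.array([2.0, 0.5])
sigma = 0.1 + 0.05 * X[:, 1]          # heteroscedastic noise grows with the regressor
y = X @ beta_true + rng.normal(0, sigma)
beta_hat = weighted_least_squares(X, y, 1.0 / sigma**2)      # weight = inverse variance
```

With inverse-variance weights the low-noise observations dominate the fit, which is exactly the asymptotic-efficiency argument the abstract alludes to.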

  12. SU-E-T-295: Simultaneous Beam Sampling and Aperture Shape Optimization for Station Parameter Optimized Radiation Therapy (SPORT)

    International Nuclear Information System (INIS)

    Purpose: Station Parameter Optimized Radiation Therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital LINACs, in which the station parameters of a delivery system (such as aperture shape and weight, couch position/angle, and gantry/collimator angle) are optimized altogether. SPORT promises to deliver unprecedented radiation dose distributions efficiently, yet no optimization algorithm exists to implement it. The purpose of this work is to propose an optimization algorithm to simultaneously optimize the beam sampling and aperture shapes. Methods: We build a mathematical model whose variables are beam angles (including non-coplanar and/or even nonisocentric beams) and aperture shapes. To solve the resulting large-scale optimization problem, we devise an exact, convergent and fast optimization algorithm by integrating three advanced optimization techniques: column generation, a gradient method, and pattern search. Column generation is used to find a good set of aperture shapes as an initial solution by adding apertures sequentially. Then we apply the gradient method to iteratively improve the current solution by reshaping the aperture shapes and updating the beam angles along the gradient direction. The algorithm then continues with a pattern search to explore the part of the search space that cannot be reached by the gradient method. Results: The proposed technique is applied to a series of patient cases and significantly improves the plan quality. In a head-and-neck case, for example, the left parotid gland mean-dose, brainstem max-dose, spinal cord max-dose, and mandible mean-dose are reduced by 10%, 7%, 24% and 12% respectively, compared to the conventional VMAT plan while maintaining the same PTV coverage. Conclusion: Combined use of column generation, gradient search and pattern search algorithms provides an effective way to simultaneously optimize the large collection of station parameters and significantly improves

  13. SU-E-T-295: Simultaneous Beam Sampling and Aperture Shape Optimization for Station Parameter Optimized Radiation Therapy (SPORT)

    Energy Technology Data Exchange (ETDEWEB)

    Zarepisheh, M; Li, R; Xing, L [Stanford University School of Medicine, Stanford, CA (United States); Ye, Y [Stanford Univ, Management Science and Engineering, Stanford, CA (United States); Boyd, S [Stanford University, Electrical Engineering, Stanford, CA (United States)

    2014-06-01

    Purpose: Station Parameter Optimized Radiation Therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital LINACs, in which the station parameters of a delivery system (such as aperture shape and weight, couch position/angle, and gantry/collimator angle) are optimized altogether. SPORT promises to deliver unprecedented radiation dose distributions efficiently, yet no optimization algorithm exists to implement it. The purpose of this work is to propose an optimization algorithm to simultaneously optimize the beam sampling and aperture shapes. Methods: We build a mathematical model whose variables are beam angles (including non-coplanar and/or even nonisocentric beams) and aperture shapes. To solve the resulting large-scale optimization problem, we devise an exact, convergent and fast optimization algorithm by integrating three advanced optimization techniques: column generation, a gradient method, and pattern search. Column generation is used to find a good set of aperture shapes as an initial solution by adding apertures sequentially. Then we apply the gradient method to iteratively improve the current solution by reshaping the aperture shapes and updating the beam angles along the gradient direction. The algorithm then continues with a pattern search to explore the part of the search space that cannot be reached by the gradient method. Results: The proposed technique is applied to a series of patient cases and significantly improves the plan quality. In a head-and-neck case, for example, the left parotid gland mean-dose, brainstem max-dose, spinal cord max-dose, and mandible mean-dose are reduced by 10%, 7%, 24% and 12% respectively, compared to the conventional VMAT plan while maintaining the same PTV coverage. Conclusion: Combined use of column generation, gradient search and pattern search algorithms provides an effective way to simultaneously optimize the large collection of station parameters and significantly improves
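Of the three techniques combined in these records, pattern search is the simplest to illustrate: poll a fixed stencil of points around the incumbent and shrink the mesh whenever no poll point improves. A minimal compass-search sketch on a toy quadratic (not the treatment-planning objective):

```python
import numpy as np

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Compass-style pattern search: poll +/- step along each coordinate,
    halve the step size whenever no poll point improves the incumbent."""
    x, fx = np.asarray(x0, float), f(x0)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for s in (+step, -step):
                trial = x.copy()
                trial[i] += s
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                  # refine the mesh
        it += 1
    return x, fx

# Toy smooth objective with minimum at (1, -2)
x_best, f_best = pattern_search(lambda z: (z[0] - 1)**2 + 3*(z[1] + 2)**2, [0.0, 0.0])
```

No derivatives are required, which is what lets pattern search reach regions where a gradient method stalls.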

  14. An Improved Bees Algorithm for Real Parameter Optimization

    Directory of Open Access Journals (Sweden)

    Wasim A. Hussein

    2015-10-01

    Full Text Available The Bees Algorithm (BA) is a bee swarm-based search algorithm inspired by the foraging behavior of a swarm of honeybees. BA can be divided into four parts: the parameter tuning part, the initialization part, the local search part, and the global search part. Recently, BA based on the Patch-Levy-based Initialization Algorithm (PLIA-BA) has been proposed. However, improving the initialization stage alone is not enough for more challenging problem classes with different properties. The local and global search capabilities also need to be enhanced to improve the quality of the final solution and the convergence speed of PLIA-BA on such problems. Consequently, in this paper, a new local search algorithm has been adopted based on Levy looping flights. Moreover, the mechanism of the global search has been enhanced to be closer to nature and based on the patch-Levy model adopted in the initialization algorithm (PLIA). The improvements in the local and global search parts are incorporated into PLIA-BA to devise a new version of BA that is called the Patch-Levy-based Bees Algorithm (PLBA). We investigate the performance of the proposed PLBA on a set of challenging benchmark functions. The results of the experiments indicate that PLBA significantly outperforms the other BA variants, including PLIA-BA, and produces results comparable with other state-of-the-art algorithms.
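The Levy-flight machinery behind PLBA's local search can be illustrated with Mantegna's algorithm for generating heavy-tailed step lengths, used here inside a simple greedy perturbation search. This is a generic sketch, not the PLBA local search itself; the objective, scale, and trial count are illustrative assumptions:

```python
import math
import numpy as np

def levy_steps(n, beta=1.5, seed=0):
    """Mantegna's algorithm: heavy-tailed step lengths for Levy-flight search."""
    rng = np.random.default_rng(seed)
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = rng.normal(0, sigma_u, n)
    v = rng.normal(0, 1, n)
    return u / np.abs(v) ** (1 / beta)   # mostly small steps, occasional long jumps

def levy_local_search(f, x, scale=0.1, n_trials=200, seed=0):
    """Greedy local search around x using Levy-distributed perturbations."""
    rng = np.random.default_rng(seed)
    fx = f(x)
    for step in levy_steps(n_trials, seed=seed):
        trial = x + scale * step * rng.standard_normal(len(x))
        ft = f(trial)
        if ft < fx:                       # keep only improving moves
            x, fx = trial, ft
    return x, fx

x, fx = levy_local_search(lambda z: np.sum((z - 1.0)**2), np.zeros(2))
```

The occasional long jump is what distinguishes a Levy flight from a Gaussian random walk: it lets the search escape shallow local basins without abandoning fine-grained exploitation.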

  15. Parameter optimization using GA in SVM to predict damage level of non-reshaped berm breakwater.

    Digital Repository Service at National Institute of Oceanography (India)

    Harish, N.; Lokesha.; Mandal, S.; Rao, S.; Patil, S.G.

    In the present study, Support Vector Machine (SVM) and hybrid Genetic Algorithm (GA)-SVM models are developed to predict the damage level of non-reshaped berm breakwaters. Optimal kernel parameters of the SVM are determined by using GA...
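A real-coded GA of the kind used here to tune SVM kernel parameters can be sketched generically. Since the SVM itself is out of scope for this record, a toy quadratic stands in for the cross-validated error over two hypothetical kernel parameters; the operators below (tournament selection, blend crossover, Gaussian mutation, elitism) are common textbook choices, not necessarily the authors':

```python
import numpy as np

def ga_optimize(fitness, lb, ub, pop=30, gens=40, pc=0.9, pm=0.1, seed=0):
    """Minimal real-coded GA minimizing `fitness` over the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(lb, ub, (pop, len(lb)))
    F = np.array([fitness(p) for p in P])
    for _ in range(gens):
        children = [P[np.argmin(F)].copy()]          # elitism: carry over the best
        while len(children) < pop:
            i, j = rng.integers(pop, size=2)         # binary tournament (parent a)
            a = P[i] if F[i] < F[j] else P[j]
            i, j = rng.integers(pop, size=2)         # binary tournament (parent b)
            b = P[i] if F[i] < F[j] else P[j]
            c = a + rng.random() * (b - a) if rng.random() < pc else a.copy()
            if rng.random() < pm:
                c = c + rng.normal(0.0, 0.1 * (ub - lb))   # Gaussian mutation
            children.append(np.clip(c, lb, ub))
        P = np.array(children)
        F = np.array([fitness(p) for p in P])
    k = np.argmin(F)
    return P[k], F[k]

# Hypothetical stand-in for cross-validated SVM error over two kernel parameters
obj = lambda p: (p[0] - 2.0)**2 + (p[1] + 3.0)**2
best, err = ga_optimize(obj, np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
```

In the hybrid setup of the abstract, each `fitness` call would train and cross-validate an SVM at the candidate kernel parameters.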

  16. Optimizing of some parameters in the phase of coarse flotation in "Sasa" mine, R. Macedonia

    OpenAIRE

    Golomeov, Blagoj; Krstev, Boris; Golomeova, Mirjana

    2002-01-01

    The optimization of some parameters during the coarse Pb-Zn flotation phase in the "Sasa" mine, R. Macedonia, is presented in this work, supported by modern investigations of mineral flotation.

  17. An adaptive image denoising method based on local parameters optimization

    Indian Academy of Sciences (India)

    Hari Om; Mantosh Biswas

    2014-08-01

    In image denoising algorithms, the noise is handled by modifying either term-by-term, i.e., individual pixels, or block-by-block, i.e., groups of pixels, using a suitable shrinkage factor and threshold function. The shrinkage factor is generally a function of the threshold and some other characteristics of the neighbouring pixels of the pixel to be thresholded (denoised). The threshold is determined in terms of the noise variance present in the image and its size. The VisuShrink, SureShrink, and NeighShrink methods are important denoising methods that provide good results. The first two, VisuShrink and SureShrink, follow the term-by-term approach, i.e., they modify individual pixels, while NeighShrink and its variants (ModiNeighShrink, IIDMWD, and IAWDMBMC) follow the block-by-block approach, i.e., they modify pixels in groups, in order to remove the noise. The VisuShrink, SureShrink, and NeighShrink methods, however, do not give very good visual quality because they remove too many coefficients due to their high threshold values. In this paper, we propose an image denoising method that uses the local parameters of the neighbouring coefficients of the pixel to be denoised in the noisy image. In our method, we propose two new shrinkage factors and a threshold at each decomposition level, which lead to better visual quality. We also establish the relationship between the two shrinkage factors. We compare the performance of our method with that of VisuShrink and NeighShrink, including various variants. Simulation results show that our proposed method yields a higher peak signal-to-noise ratio and better visual quality of the image as compared to the traditional methods: the Wiener filter, VisuShrink, SureShrink, NeighBlock, NeighShrink, ModiNeighShrink, LAWML, IIDMWT, and IAWDMBNC.
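The shrinkage-and-threshold idea common to these methods can be illustrated with soft thresholding at the VisuShrink universal threshold t = σ√(2 ln n). The sparse coefficient vector below is a synthetic stand-in for one wavelet detail subband; the proposed method's own shrinkage factors are not reproduced here:

```python
import numpy as np

def soft_threshold(c, t):
    """Soft shrinkage: pull coefficients toward zero by t, zeroing |c| <= t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def visushrink_denoise_coeffs(coeffs, sigma):
    """Apply the VisuShrink universal threshold t = sigma * sqrt(2 ln n)."""
    t = sigma * np.sqrt(2.0 * np.log(coeffs.size))
    return soft_threshold(coeffs, t)

rng = np.random.default_rng(0)
clean = np.zeros(1024)
clean[100], clean[500] = 8.0, -6.0            # two true (signal) coefficients
noisy = clean + rng.normal(0, 1.0, 1024)      # add unit-variance Gaussian noise
denoised = visushrink_denoise_coeffs(noisy, sigma=1.0)
```

Because the universal threshold is chosen to exceed the largest noise coefficient with high probability, nearly all pure-noise coefficients are zeroed; the abstract's complaint is precisely that such a high threshold also erodes signal coefficients, motivating locally adapted shrinkage.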

  18. A Method for Optimizing Technical Parameters of the Vacuum Freeze-Drying Process

    OpenAIRE

    Qian, X.-M.; Huang, W.-R.; Lou, P.-H.

    2014-01-01

    Vacuum freeze-drying is a technique that dehydrates a material at low temperature and low pressure, and it has many merits. A control system was designed and developed based on a certain freeze-drying machine. Tests with the control system were carried out to optimize the freeze-drying technical parameters. According to the test results, by the method of quadratic orthogonal experiments, the key parameters, including the duration, temperature and vacuum of freeze-drying, are analysed and optimized....

  19. Derivative-free optimization for parameter estimation in computational nuclear physics

    OpenAIRE

    Wild, Stefan M.; Sarich, Jason; Schunck, Nicolas

    2014-01-01

    We consider optimization problems that arise when estimating a set of unknown parameters from experimental data, particularly in the context of nuclear density functional theory. We examine the cost of not having derivatives of these functionals with respect to the parameters. We show that the POUNDERS code for local derivative-free optimization obtains consistent solutions on a variety of computationally expensive energy density functional calibration problems. We also provide a primer on th...

  20. Mechanism Analysis and Parameter Optimization of Mega-Sub-Isolation System

    OpenAIRE

    Xiangxiu Li; Ping Tan; Xiaojun Li; Aiwen Liu

    2016-01-01

    The equation of motion of mega-sub-isolation system is established. The working mechanism of the mega-sub-isolation system is obtained by systematically investigating its dynamic characteristics corresponding to various structural parameters. Considering the number and location of the isolated substructures, a procedure to optimally design the isolator parameters of the mega-sub-isolation system is put forward based on the genetic algorithm with base shear as the optimization objective. The i...

  1. Linearly Supporting Feature Extraction For Automated Estimation Of Stellar Atmospheric Parameters

    CERN Document Server

    Li, Xiangru; Comte, Georges; Luo, Ali; Zhao, Yongheng; Wang, Yongjun

    2015-01-01

    We describe a scheme to extract linearly supporting (LSU) features from stellar spectra to automatically estimate the atmospheric parameters $T_{eff}$, log$~g$, and [Fe/H]. "Linearly supporting" means that the atmospheric parameters can be accurately estimated from the extracted features through a linear model. The successive steps of the process are as follows: first, decompose the spectrum using a wavelet packet (WP) and represent it by the derived decomposition coefficients; second, detect representative spectral features from the decomposition coefficients using the proposed method Least Absolute Shrinkage and Selection Operator (LARS)$_{bs}$; third, estimate the atmospheric parameters $T_{eff}$, log$~g$, and [Fe/H] from the detected features using a linear regression method. One prominent characteristic of this scheme is its ability to evaluate quantitatively the contribution of each detected feature to the atmospheric parameter estimate and also to trace back the physical significance of that feature. Th...

  2. Automated Modal Parameter Estimation for Operational Modal Analysis of Large Systems

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Goursat, Maurice;

    2007-01-01

    In this paper the problems of doing automatic modal parameter extraction, and of how to account for the large amount of data to process, are considered. Two different approaches for obtaining the modal parameters automatically using OMA are presented: the Frequency Domain Decomposition (FDD) technique and a correlation-driven Stochastic Subspace Identification (SSI) technique. Special attention is given to the problem of data reduction, where many sensors are available. Finally, the techniques are demonstrated on real data.
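The FDD technique mentioned above identifies modes from the singular values of the output cross-spectral density matrix at each frequency line: peaks in the first singular value indicate modal frequencies. A minimal sketch on two synthetic sensor channels; the segment count, Hann windowing, and the 5 Hz test signal are illustrative assumptions:

```python
import numpy as np

def fdd_first_singular_values(Y, fs, nseg=8):
    """Frequency Domain Decomposition sketch: Welch-averaged cross-spectral
    matrix per frequency line, then an SVD at each line.
    Y: (n_sensors, n_samples). Returns (freqs, s1), s1 = first singular value."""
    n_sens, n = Y.shape
    L = n // nseg
    freqs = np.fft.rfftfreq(L, 1.0 / fs)
    G = np.zeros((len(freqs), n_sens, n_sens), complex)
    for k in range(nseg):                        # Welch-style segment averaging
        seg = Y[:, k*L:(k+1)*L] * np.hanning(L)
        F = np.fft.rfft(seg, axis=1)
        G += np.einsum('if,jf->fij', F, F.conj()) / nseg
    s1 = np.array([np.linalg.svd(Gf, compute_uv=False)[0] for Gf in G])
    return freqs, s1

# Two sensors observing a 5 Hz mode with different mode-shape amplitudes, plus noise
fs, t = 256.0, np.arange(4096) / 256.0
rng = np.random.default_rng(0)
Y = np.vstack([1.0 * np.sin(2*np.pi*5*t), 0.6 * np.sin(2*np.pi*5*t + 0.1)])
Y = Y + 0.1 * rng.standard_normal(Y.shape)
freqs, s1 = fdd_first_singular_values(Y, fs)
```

In a full implementation the singular vector at each peak approximates the mode shape; here only the peak-picking half of the method is shown.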

  3. Thermoeconomic optimization of triple pressure heat recovery steam generator operating parameters for combined cycle plants

    OpenAIRE

    Mohammd Mohammed S.; Petrović Milan V.

    2015-01-01

    The aim of this work is to develop a method for optimization of operating parameters of a triple pressure heat recovery steam generator. Two types of optimization were performed: (a) thermodynamic and (b) thermoeconomic. The purpose of the thermodynamic optimization is to maximize the efficiency of the plant. The selected objective for this purpose is minimization of the exergy destruction in the heat recovery steam generator (HRSG). The purpose of the ther...

  4. Feasibility Preserving Constraint-Handling Strategies for Real Parameter Evolutionary Optimization

    OpenAIRE

    Padhye, Nikhil; Mittal, Pulkit; Deb, Kalyanmoy

    2015-01-01

    Evolutionary Algorithms (EAs) are being routinely applied for a variety of optimization tasks, and real-parameter optimization in the presence of constraints is one such important area. During constrained optimization EAs often create solutions that fall outside the feasible region; hence a viable constraint- handling strategy is needed. This paper focuses on the class of constraint-handling strategies that repair infeasible solutions by bringing them back into the search space and explicitly...

  5. Determination of the optimal relaxation parameter in a numerical procedure for solitons propagation

    CERN Document Server

    Cirilo, Eliandro Rodrigues; Romeiro, Neyva Maria Lopes; Natti, Erica Regina Takano

    2010-01-01

    In this work, considering a numerical procedure developed to solve a system of coupled nonlinear complex differential equations, which describes the solitons propagation in dielectric optical fibers, we optimize the numerical processing time, in relation to the relaxation parameter of the procedure, for relevant groups of values of the dielectric variables of the optic fiber. Key-words: optical soliton, processing time, optimization.

  6. Optimal fidelity of teleportation with continuous variables using three tunable parameters in a realistic environment

    Science.gov (United States)

    Hu, Li-Yun; Liao, Zeyang; Ma, Shengli; Zubairy, M. Suhail

    2016-03-01

    We introduce three tunable parameters to optimize the fidelity of quantum teleportation with continuous variables in a nonideal scheme. By using the characteristic-function formalism, we present the condition that the teleportation fidelity is independent of the amplitude of input coherent states for any entangled resource. Then we investigate the effects of tunable parameters on the fidelity with or without the presence of the environment and imperfect measurements by analytically deriving the expression of fidelity for three different input coherent-state distributions. It is shown that, for the linear distribution, the optimization with three tunable parameters is the best one with respect to single- and two-parameter optimization. Our results reveal the usefulness of tunable parameters for improving the fidelity of teleportation and the ability against decoherence.

  7. Optimization of processing parameters for rheo-casting AZ91D magnesium alloy

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A design of experiment technique was used to optimize the microstructure of the AZ91D alloy produced by rheo-casting. The experimental design consists of four parameters (pouring temperature, shearing temperature, shearing time and shearing rate) with three levels. Grain size and shape factor measurements of primary α-Mg particles were conducted to characterize the microstructure. The contribution of each parameter shows that pouring temperature is the most significant parameter affecting the grain size, and the shape factor highly depends on the shearing temperature. The optimized rheo-casting processing parameters are 650 ℃ for pouring temperature, 585 ℃ for shearing temperature, 40 s for shearing time, and 600 r/min for shearing rate. Under the optimized processing parameters, the average grain size is 28.53 μm, and the shape factor is 0.591.
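A three-level, four-parameter design of experiments like the one above enumerates 3^4 = 81 runs. The sketch below builds that design grid; the level values and the toy "grain size" response are illustrative stand-ins (only the reported optimum settings and the 28.53 μm figure come from the abstract):

```python
from itertools import product

# Hypothetical level grids for the four rheo-casting parameters (values illustrative)
levels = {
    "pouring_temp":  [640, 650, 660],    # deg C
    "shearing_temp": [580, 585, 590],    # deg C
    "shearing_time": [20, 40, 60],       # s
    "shearing_rate": [400, 600, 800],    # r/min
}

# Toy stand-in response: pretend grain size is minimized at the reported optimum
def grain_size(p):
    target = {"pouring_temp": 650, "shearing_temp": 585,
              "shearing_time": 40, "shearing_rate": 600}
    return 28.53 + sum(abs(p[k] - target[k]) / 10.0 for k in p)

# Full factorial enumeration: every combination of one level per parameter
designs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
best = min(designs, key=grain_size)
print(len(designs), best)   # 81 runs in a 3^4 full factorial
```

In practice each of the 81 runs would be a physical casting experiment; the factorial structure is what lets the per-parameter contributions be separated afterward.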

  8. Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems

    Science.gov (United States)

    Rao, R. V.; Savsani, V. J.; Balic, J.

    2012-12-01

    An efficient optimization algorithm called teaching-learning-based optimization (TLBO) is proposed in this article to solve continuous unconstrained and constrained optimization problems. The proposed method is based on the effect of the influence of a teacher on the output of learners in a class. The basic philosophy of the method is explained in detail. The algorithm is tested on 25 different unconstrained benchmark functions and 35 constrained benchmark functions with different characteristics. For the constrained benchmark functions, TLBO is tested with different constraint handling techniques such as superiority of feasible solutions, self-adaptive penalty, ɛ-constraint, stochastic ranking and ensemble of constraints. The performance of the TLBO algorithm is compared with that of other optimization algorithms and the results show the better performance of the proposed algorithm.
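The teacher and learner phases of TLBO can be sketched directly from their standard update rules: learners move toward the best solution (the teacher) and away from the class mean, then learn pairwise from random peers. A minimal implementation on the sphere function; the population size and iteration count are illustrative, not the article's settings:

```python
import numpy as np

def tlbo_minimize(f, lb, ub, pop=20, iters=50, seed=0):
    """Teaching-learning-based optimization: teacher phase + learner phase."""
    rng = np.random.default_rng(seed)
    d = len(lb)
    X = rng.uniform(lb, ub, (pop, d))
    F = np.array([f(x) for x in X])
    for _ in range(iters):
        teacher = X[np.argmin(F)]
        mean = X.mean(axis=0)
        for i in range(pop):
            # Teacher phase: move toward the teacher, away from the class mean
            Tf = rng.integers(1, 3)                       # teaching factor in {1, 2}
            new = np.clip(X[i] + rng.random(d) * (teacher - Tf * mean), lb, ub)
            fn = f(new)
            if fn < F[i]:
                X[i], F[i] = new, fn
            # Learner phase: learn from a randomly chosen peer
            j = rng.integers(pop)
            if j != i:
                step = (X[i] - X[j]) if F[i] < F[j] else (X[j] - X[i])
                new = np.clip(X[i] + rng.random(d) * step, lb, ub)
                fn = f(new)
                if fn < F[i]:
                    X[i], F[i] = new, fn
    k = np.argmin(F)
    return X[k], F[k]

x_best, f_best = tlbo_minimize(lambda z: np.sum(z**2), np.full(2, -10.0), np.full(2, 10.0))
```

A notable property the article emphasizes is that, beyond population size and iteration count, TLBO has no algorithm-specific control parameters to tune.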

  9. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work will investigate the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow will be applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas-lift will be included as an explicit measure to improve production. An objective function will be formulated for the net present value of the integrated system including production revenue and facility costs. Facility and gas lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced with an impact on gas lift performance. 
Resulting variances on NPV are identified as a risk measure for the optimized system design. A

  10. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During the year of 1994, the engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min to 2 min for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process - the error tolerance. At the same time, the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  11. Derivative-free optimization for parameter estimation in computational nuclear physics

    Science.gov (United States)

    Wild, Stefan M.; Sarich, Jason; Schunck, Nicolas

    2015-03-01

    We consider optimization problems that arise when estimating a set of unknown parameters from experimental data, particularly in the context of nuclear density functional theory. We examine the cost of not having derivatives of these functionals with respect to the parameters. We show that the POUNDERS code for local derivative-free optimization obtains consistent solutions on a variety of computationally expensive energy density functional calibration problems. We also provide a primer on the operation of the POUNDERS software in the Toolkit for Advanced Optimization.

  12. Derivative-free optimization for parameter estimation in computational nuclear physics

    CERN Document Server

    Wild, Stefan M; Schunck, Nicolas

    2014-01-01

    We consider optimization problems that arise when estimating a set of unknown parameters from experimental data, particularly in the context of nuclear density functional theory. We examine the cost of not having derivatives of these functionals with respect to the parameters. We show that the POUNDERS code for local derivative-free optimization obtains consistent solutions on a variety of computationally expensive energy density functional calibration problems. We also provide a primer on the operation of the POUNDERS software in the Toolkit for Advanced Optimization.

  13. Application of Factorial Design for Gas Parameter Optimization in CO2 Laser Welding

    DEFF Research Database (Denmark)

    Gong, Hui; Dragsted, Birgitte; Olsen, Flemming Ove

    1997-01-01

    The effect of different gas process parameters involved in CO2 laser welding has been studied by applying two sets of three-level complete factorial designs. In this work, 5 gas parameters (gas type, gas flow rate, gas blowing angle, gas nozzle diameter, and gas blowing point-offset) are optimized...

  14. Experimental Verification of Statistically Optimized Parameters for Low-Pressure Cold Spray Coating of Titanium

    Directory of Open Access Journals (Sweden)

    Damilola Isaac Adebiyi

    2016-06-01

    Full Text Available The cold spray coating process involves many process parameters, which makes the process very complex and highly sensitive to small changes in these parameters. This results in a small operational window for the parameters. Consequently, mathematical optimization of the process parameters is key, not only to achieving deposition but also to improving coating quality. This study focuses on the mathematical identification and experimental justification of the optimum process parameters for cold spray coating of a titanium alloy with silicon carbide (SiC). The continuity, momentum and energy equations governing the flow through the low-pressure cold spray nozzle were solved by introducing a constitutive equation to close the system. This was used to calculate the critical velocity for the deposition of SiC. In order to determine the input temperature that yields the calculated velocity, the distributions of velocity, temperature, and pressure in the cold spray nozzle were analyzed, and the exit values were predicted using the meshing tool of SolidWorks. Coatings fabricated using the optimized parameters are compared with coatings fabricated using non-optimized parameters. The coating produced with the CFD-optimized parameters yielded lower porosity and higher hardness.

  15. Computer-aided method for automated selection of optimal imaging plane for measurement of total cerebral blood flow by MRI

    Science.gov (United States)

    Teng, Pang-yu; Bagci, Ahmet Murat; Alperin, Noam

    2009-02-01

    A computer-aided method for finding an optimal imaging plane for simultaneous measurement of the arterial blood inflow through the 4 vessels leading blood to the brain by phase contrast magnetic resonance imaging is presented. The method's performance is compared with manual selection by two observers. First, the skeletons of the 4 vessels are extracted and their centerlines generated. Then, a global direction of the relatively less curved internal carotid arteries is calculated to determine the main flow direction. This is used as a reference direction to identify segments of the vertebral arteries that strongly deviate from the main flow direction. These segments are then used to identify anatomical landmarks for improved consistency of the imaging plane selection. An optimal imaging plane is then identified by finding the plane with the smallest error value, defined as the sum of the angles between the plane's normal and the vessel centerlines' directions at the locations of the intersections. Error values obtained using the automated and the manual methods were then compared using 9 magnetic resonance angiography (MRA) data sets. The automated method considerably outperformed the manual selection. The mean error value with the automated method was significantly lower than with the manual method, 0.09+/-0.07 vs. 0.53+/-0.45, respectively (p<.0001, Student's t-test). Reproducibility of repeated measurements was analyzed using Bland and Altman's test; the mean 95% limits of agreement for the automated and manual methods were 0.01~0.02 and 0.43~0.55, respectively.
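    The error value described in this record (the sum of the angles between a candidate plane's normal and each vessel centerline direction at the intersections) is straightforward to sketch. The following is a minimal, hypothetical illustration; the function names and the coarse spherical grid search are ours, not the authors' (the paper works from extracted vessel skeletons, not ideal direction vectors):

```python
import numpy as np

def plane_error(normal, directions):
    """Sum of angles (radians) between the plane normal and each unit
    vessel-direction vector -- the error value described above."""
    n = normal / np.linalg.norm(normal)
    cos = np.clip(np.abs(directions @ n), 0.0, 1.0)
    return float(np.sum(np.arccos(cos)))

def best_normal(directions, steps=60):
    """Coarse grid search over the unit sphere for the normal with
    the smallest error value (a crude stand-in for the real search)."""
    best, best_e = None, np.inf
    for theta in np.linspace(0.0, np.pi, steps):
        for phi in np.linspace(0.0, 2.0 * np.pi, 2 * steps, endpoint=False):
            n = np.array([np.sin(theta) * np.cos(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(theta)])
            e = plane_error(n, directions)
            if e < best_e:
                best, best_e = n, e
    return best, best_e

# Toy check: four parallel vessels -> the best plane cuts them perpendicularly
dirs = np.tile(np.array([0.0, 0.0, 1.0]), (4, 1))
normal, err = best_normal(dirs)
```

For real data the four directions differ, and the minimizing normal is a compromise between them, exactly what the reported mean error of 0.09 radians reflects.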

  16. Optimization of plastic injection molding process parameters for manufacturing a brake booster valve body

    International Nuclear Information System (INIS)

    Highlights: • PIM process parameters have been optimized for a brake booster valve body. • The Taguchi method and computer-aided engineering have been integrated and used. • Seven key parameters of the PIM process have been considered. • A nearly 12% improvement has been found by using the optimal PIM process parameters. • The improvement can enhance the safety performance of a vehicle. - Abstract: The plastic injection molding (PIM) process parameters have been investigated for manufacturing a brake booster valve body. The optimal PIM process parameters are determined by applying computer-aided engineering integrated with the Taguchi method to improve the compressive property of the valve body. The parameters considered for optimization are the following: number of gates, gate size, molding temperature, resin temperature, switch-over by volume filled, switch-over by injection pressure, and curing time. An orthogonal array of L18 is created for the statistical design of experiments based on the Taguchi method. Then, Mold-Flow analyses are performed using the process parameters designed according to the L18 orthogonal array. The signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) are used to find the optimal PIM process parameters and to evaluate the impact of resin viscosity, curing percentage, and compressive strength on the brake booster valve body. When compared with the average compressive strength of the 18 design experiments, the compressive strength of the valve body produced using the optimal PIM process parameters showed a nearly 12% improvement
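    The S/N step named in this record is a small computation. A rough sketch for a "larger the better" characteristic (appropriate for compressive strength): compute the S/N ratio of the responses observed at each level of a factor and pick the level with the highest value. The numbers below are invented for illustration, not taken from the paper:

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio (dB) for a larger-the-better
    response: -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

# Hypothetical compressive-strength responses for one factor at three levels
levels = {1: [52.0, 55.0], 2: [60.0, 61.0], 3: [48.0, 50.0]}
sn_by_level = {lvl: sn_larger_is_better(v) for lvl, v in levels.items()}
best_level = max(sn_by_level, key=sn_by_level.get)
```

Repeating this per factor over the L18 responses yields the optimal setting for each of the seven parameters; ANOVA then ranks the factors by contribution.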

  17. Determination of photovoltaic modules parameters at different operating conditions using a novel bird mating optimizer approach

    International Nuclear Information System (INIS)

    Highlights: • A simplified bird mating optimizer (SBMO) approach is developed. • SBMO is used to estimate the electrical parameters of a PV module. • The performance of SBMO is promising. • The results are comparable with those of other methods. - Abstract: The main goal of this paper is to provide a framework to accurately estimate the electrical equivalent circuit parameters of photovoltaic arrays by means of an efficient heuristic technique. Owing to the non-linearity of the current vs. voltage (I–V) characteristics of PV modules, using a superior optimization technique helps to effectively find the real electrical parameters. Inspired by the mating process of different bird species, the bird mating optimizer (BMO) is a newly invented search technique which has shown superior performance in solving complex optimization problems. In this paper, the original BMO algorithm is simplified and used to estimate the electrical parameters of the module model for an amorphous silicon PV system at different operating conditions. The simplified BMO (SBMO) eliminates the tedious effort of parameter setting in the original BMO and also modifies some of its rules. The usefulness of the proposed algorithm is investigated by comparing the obtained results with those found by two particle swarm optimization (PSO) variants, two harmony search (HS) variants, as well as the seeker optimization algorithm (SOA). For the situations investigated in this paper, SBMO yields more accurate results than the other studied methods
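    The estimation task itself (fit equivalent-circuit parameters to measured I-V points) can be sketched independently of the particular heuristic. Below is a toy version under two loud simplifications: an ideal single-diode model (series and shunt resistances neglected) instead of the paper's full module model, and plain random search standing in for SBMO; all names and bounds are ours:

```python
import math, random

VT = 0.0259  # thermal voltage at ~300 K (assumed)

def diode_current(v, iph, i0, n):
    """Ideal single-diode PV model: I = Iph - I0*(exp(V/(n*Vt)) - 1)."""
    return iph - i0 * (math.exp(v / (n * VT)) - 1.0)

def rmse(params, data):
    iph, i0, n = params
    return math.sqrt(sum((diode_current(v, iph, i0, n) - i) ** 2
                         for v, i in data) / len(data))

def random_search(data, iters=20000, seed=0):
    """Toy stand-in for the SBMO heuristic: sample parameters uniformly
    from plausible bounds and keep the best-fitting triple."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(iters):
        p = (rng.uniform(1.0, 6.0),       # Iph [A]
             10 ** rng.uniform(-9, -5),   # I0  [A], log-uniform
             rng.uniform(1.0, 2.0))       # ideality factor n
        e = rmse(p, data)
        if e < best_err:
            best, best_err = p, e
    return best, best_err

# Synthetic I-V points generated from known parameters (Iph=5, I0=1e-7, n=1.3)
true = (5.0, 1e-7, 1.3)
data = [(v, diode_current(v, *true)) for v in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5)]
fit, err = random_search(data)
```

A real heuristic such as SBMO replaces the blind sampling loop with guided moves, which is what makes accurate fits feasible on the implicit five-parameter model.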

  18. Combustion Model and Control Parameter Optimization Methods for Single Cylinder Diesel Engine

    Directory of Open Access Journals (Sweden)

    Bambang Wahono

    2014-01-01

    Full Text Available This research presents a method to construct a combustion model and a method to optimize some control parameters of a diesel engine in order to develop a model-based control system. The purpose of the model is to appropriately manage the control parameters so as to obtain target values of fuel consumption and emissions as the engine output objectives. A stepwise method accounting for multicollinearity was applied to construct the combustion model in polynomial form. Using the experimental data of a single cylinder diesel engine, models of power, BSFC, NOx, and soot for a multiple-injection diesel engine were built. The proposed method successfully developed a model that relates the control parameters to the engine outputs. Although many control devices can be mounted on a diesel engine, an optimization technique is required to find optimal engine operating conditions efficiently, alongside the existing development of individual emission control methods. Particle swarm optimization (PSO) was used to calculate the control parameters that optimize fuel consumption and emissions based on the model. The proposed method is able to calculate control parameters efficiently to optimize the evaluation items based on the model. Finally, the model combined with PSO was compiled on a microcontroller.
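    The "search a fitted model with PSO" step recurs in several records on this page. A minimal global-best PSO sketch, applied to a hypothetical quadratic surrogate standing in for the fitted combustion model (the constants w=0.7, c1=c2=1.5 and the two control parameters are our assumptions, not the paper's):

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100, seed=1):
    """Minimal global-best particle swarm optimizer."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Hypothetical surrogate: "fuel consumption" vs two control parameters
# (e.g. injection timing, injection pressure), with a minimum at (3, 70)
model = lambda x: (x[0] - 3.0) ** 2 + 0.01 * (x[1] - 70.0) ** 2
best, best_f = pso_minimize(model, [(0.0, 10.0), (40.0, 100.0)])
```

Because the surrogate is cheap to evaluate, this inner search is fast enough to run on a microcontroller, as the record describes.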

  19. Aluminum-zinc alloy squeeze casting technological parameters optimization based on PSO and ANN

    Directory of Open Access Journals (Sweden)

    SHU Fu-hua

    2007-08-01

    Full Text Available This paper presents a ZA27 squeeze casting process parameter optimization method using an artificial neural network (ANN) combined with a particle swarm optimizer (PSO). Taking the test data as samples, a neural network is used to create a nonlinear mapping model between ZA27 squeeze casting process parameters and mechanical properties. PSO is then used to optimize the model and obtain the optimum values of the process parameters, making full use of the neural network's nonlinear mapping capability and PSO's global optimization capability. The network is a radial basis function neural network, and clustering and gradient methods are used for network learning in order to enhance the generalization ability of the network. PSO adopts dynamically changing inertia weights to accelerate convergence and avoid local minima.

  20. Function Optimization and Parameter Performance Analysis Based on Gravitation Search Algorithm

    Directory of Open Access Journals (Sweden)

    Jie-Sheng Wang

    2015-12-01

    Full Text Available The gravitational search algorithm (GSA) is a kind of swarm intelligence optimization algorithm based on the law of gravitation. The parameter initialization of all swarm intelligence optimization algorithms has an important influence on their global optimization ability. From the basic principle of GSA, the convergence rate of GSA is determined by the gravitational constant and the acceleration of the particles. The optimization performance is verified on six typical test functions by simulation experiments. The simulation results show that the convergence speed of the GSA algorithm is relatively sensitive to the setting of the algorithm parameters, and that the GSA parameters can be tuned flexibly to improve the algorithm's convergence velocity and the accuracy of its solutions.
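    The two quantities the record identifies, the gravitational constant and the resulting particle accelerations, appear explicitly in even a minimal GSA. The sketch below is a generic textbook-style GSA, not the paper's implementation; the initial constant g0 and the decay rate are exactly the tunable parameters whose sensitivity the paper studies (our values are arbitrary):

```python
import math, random

def gsa_minimize(f, bounds, n_agents=15, iters=120, g0=100.0, seed=2):
    """Minimal gravitational search algorithm: agents attract each other
    with forces proportional to fitness-derived masses, under a
    gravitational constant that decays with iteration."""
    rng = random.Random(seed)
    dim = len(bounds)
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_agents)]
    v = [[0.0] * dim for _ in range(n_agents)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [f(xi) for xi in x]
        worst, bst = max(fit), min(fit)
        if bst < best_f:
            best_f, best_x = bst, x[fit.index(bst)][:]
        # normalized masses: better fitness -> larger mass
        m = [(worst - fi) / (worst - bst + 1e-12) for fi in fit]
        s = sum(m) + 1e-12
        m = [mi / s for mi in m]
        g = g0 * math.exp(-20.0 * t / iters)  # decaying gravitational constant
        for i in range(n_agents):
            for d in range(dim):
                acc = 0.0  # acceleration; the agent's own mass cancels (a = F/m_i)
                for j in range(n_agents):
                    if j != i:
                        dist = math.dist(x[i], x[j]) + 1e-12
                        acc += rng.random() * g * m[j] * (x[j][d] - x[i][d]) / dist
                v[i][d] = rng.random() * v[i][d] + acc
                x[i][d] = min(max(x[i][d] + v[i][d], bounds[d][0]), bounds[d][1])
    return best_x, best_f

sphere = lambda p: sum(c * c for c in p)  # classic benchmark test function
best, val = gsa_minimize(sphere, [(-5.0, 5.0)] * 2)
```

Raising g0 or slowing the decay increases exploration; the paper's point is that this trade-off must be tuned per problem.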

  1. A New Chaotic Parameters Disturbance Annealing Neural Network for Solving Global Optimization Problems

    Institute of Scientific and Technical Information of China (English)

    MA Wei; WANG Zheng-Ou

    2003-01-01

    Since there were few chaotic neural networks applicable to global optimization, in this paper we propose a new neural network model: the chaotic parameters disturbance annealing (CPDA) network, which is superior to other existing neural networks, genetic algorithms, and simulated annealing algorithms in global optimization. In the present CPDA network, we add some chaotic parameters to the energy function, which makes the Hopfield neural network escape from the attraction of a local minimal solution; with the parameter p1 annealing, our model converges to the global optimal solutions quickly and steadily. The convergence ability and other characteristics are also analyzed in this paper. The benchmark examples show the present CPDA neural network's merits in nonlinear global optimization.

  2. Optimization of machining parameters of turning operations based on multi performance criteria

    Directory of Open Access Journals (Sweden)

    N.K.Mandal

    2013-01-01

    Full Text Available The selection of optimum machining parameters plays a significant role in ensuring product quality, reducing manufacturing cost, and increasing productivity in computer-controlled manufacturing processes. For many years, multi-objective optimization of turning, owing to the inherent complexity of the process, has been a competitive engineering issue. This study investigates multi-response optimization of the turning process for an optimal parametric combination yielding the minimum power consumption, surface roughness, and frequency of tool vibration, using Grey relational analysis (GRA). A confirmation test is conducted for the optimal machining parameters to validate the test result. Various turning parameters, such as spindle speed, feed, and depth of cut, are considered. Experiments are designed and conducted based on a full factorial design of experiments.
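    GRA reduces the three responses to a single grade per experimental run, which is what makes the multi-response problem tractable. A minimal sketch for smaller-the-better responses, with a standard distinguishing coefficient of 0.5 and invented example data (not the paper's):

```python
def grey_relational_grade(responses, weights=None):
    """Grey relational analysis for smaller-the-better responses.
    Returns one grade per experiment; the highest grade marks the
    best parameter combination."""
    n_exp, n_resp = len(responses), len(responses[0])
    weights = weights or [1.0 / n_resp] * n_resp
    grades = [0.0] * n_exp
    for j in range(n_resp):
        col = [responses[i][j] for i in range(n_exp)]
        lo, hi = min(col), max(col)
        # smaller-the-better normalization to [0, 1]
        norm = [(hi - v) / (hi - lo) if hi > lo else 1.0 for v in col]
        for i in range(n_exp):
            delta = 1.0 - norm[i]
            coef = 0.5 / (delta + 0.5)  # grey relational coefficient
            grades[i] += weights[j] * coef
    return grades

# Hypothetical runs: [power (kW), roughness (um), vibration amplitude]
runs = [[1.2, 3.1, 80.0], [0.9, 2.0, 60.0], [1.5, 4.0, 95.0]]
grades = grey_relational_grade(runs)
best_run = grades.index(max(grades))
```

In the full-factorial study, the run with the highest grade identifies the optimal spindle speed, feed, and depth of cut, which the confirmation test then validates.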

  3. Prediction Model of Battery State of Charge and Control Parameter Optimization for Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Bambang Wahono

    2015-07-01

    Full Text Available This paper presents the construction of a battery state of charge (SOC) prediction model and an optimization method for the said model to appropriately manage the control parameters with respect to the SOC as the battery output objective. The Research Centre for Electrical Power and Mechatronics, Indonesian Institute of Sciences, has tested its electric vehicle research prototype on the road, monitoring its voltage, current, temperature, time, vehicle velocity, motor speed, and SOC during operation. Using these experimental data, the prediction model of battery SOC was built. A stepwise method accounting for multicollinearity was able to efficiently develop the battery prediction model that describes the multiple control parameters in relation to characteristic values such as SOC. It was demonstrated that particle swarm optimization (PSO) successfully and efficiently calculated the optimal control parameters to optimize evaluation items such as SOC based on the model.

  4. Aluminum-zinc alloy squeeze casting technological parameters optimization based on PSO and ANN

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents a ZA27 squeeze casting process parameter optimization method using an artificial neural network (ANN) combined with a particle swarm optimizer (PSO). Taking the test data as samples, a neural network is used to create a nonlinear mapping model between ZA27 squeeze casting process parameters and mechanical properties. PSO is then used to optimize the model and obtain the optimum values of the process parameters, making full use of the neural network's nonlinear mapping capability and PSO's global optimization capability. The network is a radial basis function neural network, and clustering and gradient methods are used for network learning in order to enhance the generalization ability of the network. PSO adopts dynamically changing inertia weights to accelerate convergence and avoid local minima.

  5. Differential-Evolution Control Parameter Optimization for Unmanned Aerial Vehicle Path Planning.

    Science.gov (United States)

    Kok, Kai Yit; Rajendran, Parvathy

    2016-01-01

    The differential evolution algorithm has been widely applied to unmanned aerial vehicle (UAV) path planning. At present, the differential evolution algorithm has four tuning parameters, namely, population size, differential weight, crossover rate, and generation number. These tuning parameters are required, together with user settings for the weightage between path and computational cost. However, the optimum settings of these tuning parameters vary with the application. Instead of trial and error, this paper presents an optimization method for tuning the differential evolution parameters for UAV path planning. The parameters this research focuses on are population size, differential weight, crossover rate, and generation number. The developed algorithm enables the user to simply define the desired weightage between path and computational cost and to converge within the minimum number of generations required. In conclusion, the proposed optimization of tuning parameters in the differential evolution algorithm for UAV path planning expedites the search and improves the final output path and computational cost. PMID:26943630
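    The four tuning parameters named in the record map directly onto the arguments of a textbook DE/rand/1/bin loop. The sketch below is a generic DE, not the authors' planner; the two-variable "path cost" at the end is a made-up stand-in for a real path objective:

```python
import random

def differential_evolution(f, bounds, np_=20, f_w=0.8, cr=0.9,
                           gens=100, seed=3):
    """Minimal DE/rand/1/bin exposing the four tuning parameters:
    population size np_, differential weight f_w, crossover rate cr,
    and generation number gens."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [f(p) for p in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([k for k in range(np_) if k != i], 3)
            j_rand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = pop[i][:]
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f_w * (pop[b][j] - pop[c][j])
                    trial[j] = min(max(v, bounds[j][0]), bounds[j][1])
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(np_), key=lambda i: fit[i])
    return pop[best], fit[best]

# Toy "path cost": squared distance of a waypoint from a preferred corridor
cost = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
best, val = differential_evolution(cost, [(-5.0, 5.0), (-5.0, 5.0)])
```

The paper's contribution is choosing np_, f_w, cr, and gens themselves by an outer optimization, instead of the trial-and-error values used here.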

  6. Differential-Evolution Control Parameter Optimization for Unmanned Aerial Vehicle Path Planning

    Science.gov (United States)

    Kok, Kai Yit; Rajendran, Parvathy

    2016-01-01

    The differential evolution algorithm has been widely applied to unmanned aerial vehicle (UAV) path planning. At present, the differential evolution algorithm has four tuning parameters, namely, population size, differential weight, crossover rate, and generation number. These tuning parameters are required, together with user settings for the weightage between path and computational cost. However, the optimum settings of these tuning parameters vary with the application. Instead of trial and error, this paper presents an optimization method for tuning the differential evolution parameters for UAV path planning. The parameters this research focuses on are population size, differential weight, crossover rate, and generation number. The developed algorithm enables the user to simply define the desired weightage between path and computational cost and to converge within the minimum number of generations required. In conclusion, the proposed optimization of tuning parameters in the differential evolution algorithm for UAV path planning expedites the search and improves the final output path and computational cost. PMID:26943630

  7. Resonance parameters based analysis for metallic thickness optimization of a bimetallic plasmonic structure

    Science.gov (United States)

    Bera, Mahua; Banerjee, Jayeta; Ray, Mina

    2014-02-01

    Metallic film thickness optimization in mono- and bimetallic plasmonic structures has been carried out in order to determine the correct device parameters. Different resonance parameters, such as reflectivity, phase, field enhancement, and the complex amplitude reflectance Argand diagram (CARAD), have been investigated for the proposed optimization procedure. Comparison of mono- and bimetallic plasmonic structures has been carried out in the context of these resonance parameters with simultaneous angular and spectral interrogation. Differential phase analysis has also been performed and its application to sensing has been discussed along with a proposed interferometric set-up.

  8. Optimization of parameters for the inline-injection system at Brookhaven Accelerator Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Parsa, Z. [Brookhaven National Lab., Upton, NY (United States); Ko, S.K. [Ulsan Univ. (Korea, Republic of)

    1995-10-01

    We present some of our parameter optimization results, obtained with the code PARMLEA, for the ATF Inline-Injection System. The new Solenoid-Gun-Solenoid -- Drift-Linac scheme would improve the beam quality needed for FEL and other experiments at the ATF, as compared to the beam quality of the originally designed injection system. To optimize the gain in beam quality we have considered various parameters, including the accelerating field gradient on the photocathode, the solenoid field strengths, the separation between the gun and the entrance to the linac, as well as the type and size of the initial charge distributions. The effect of changes in these parameters on the beam emittance is also given.

  9. MyGIsFOS: an automated code for parameter determination and detailed abundance analysis in cool stars

    CERN Document Server

    Sbordone, L; Bonifacio, P; Duffau, S

    2013-01-01

    The current and planned high-resolution, high-multiplexity stellar spectroscopic surveys, as well as the swelling amount of under-utilized data present in public archives, have led to an increasing number of efforts to automate the crucial but slow process of retrieving stellar parameters and chemical abundances from spectra. We present MyGIsFOS, a code designed to derive atmospheric parameters and detailed stellar abundances from medium - high resolution spectra of cool (FGK) stars. We describe the general structure and workings of the code, present analyses of a number of well studied stars representative of the parameter space MyGIsFOS is designed to cover, and give examples of exploiting MyGIsFOS's very fast analysis to assess uncertainties through Monte Carlo tests. MyGIsFOS aims to reproduce a "traditional" manual analysis by fitting spectral features for different elements against a precomputed grid of synthetic spectra. Fe I and Fe II lines can be employed to determine temperature, gravity, microturbu...

  10. On the Non-Linear Optimization of Projective Motion Using Minimal Parameters

    OpenAIRE

    Bartoli, Adrien

    2002-01-01

    I address the problem of optimizing projective motion over a minimal set of parameters. Most of the existing works overparameterize the problem. While this can simplify the estimation process and may ensure well-conditioning of the parameters, this also increases the computational cost since more unknowns than necessary are involved. I propose a method whose key feature is that the number of parameters employed is minimal. The method requires singular value decomposition and minor algebraic m...

  11. Nonlinear Time Series Prediction Using LS-SVM with Chaotic Mutation Evolutionary Programming for Parameter Optimization

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Nonlinear time series prediction is studied using an improved least squares support vector machine (LS-SVM) regression based on a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) of the LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted using this LS-SVM regression based on CMEP, and satisfactory results are obtained.
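    The LS-SVM underlying this record is attractive for parameter studies because training reduces to one linear solve, so each (σ, γ) candidate is cheap to evaluate. A minimal sketch with an RBF kernel, fitted to a toy smooth target rather than real stock data (the data and the particular σ, γ values are ours):

```python
import numpy as np

def lssvm_fit(x, y, sigma, gamma):
    """LS-SVM regression with an RBF kernel. Training solves the
    KKT system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(x)
    k = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * sigma ** 2))
    a = np.zeros((n + 1, n + 1))
    a[0, 1:] = 1.0
    a[1:, 0] = 1.0
    a[1:, 1:] = k + np.eye(n) / gamma
    sol = np.linalg.solve(a, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    def predict(xq):
        kq = np.exp(-(xq[:, None] - x[None, :]) ** 2 / (2.0 * sigma ** 2))
        return kq @ alpha + b

    return predict

# Toy regression standing in for one-step-ahead series prediction
x_tr = np.linspace(0.0, 2.0 * np.pi, 50)
y_tr = np.sin(x_tr)
predict = lssvm_fit(x_tr, y_tr, sigma=1.0, gamma=1000.0)
x_te = np.linspace(0.1, 2.0 * np.pi - 0.1, 101)
mse = float(np.mean((predict(x_te) - np.sin(x_te)) ** 2))
```

CMEP's role in the paper is to search over (σ, γ): a wider σ smooths the kernel, and a larger γ weakens the regularization, and the validation error surface over these two is what the evolutionary search explores.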

  12. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    Science.gov (United States)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and consequently they are based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source parameters for location by several hundred percent (normalized by the

  13. Effect of experimental parameters on optimal reflection of light from opaque media

    CERN Document Server

    Anderson, Benjamin R; Eilers, Hergen

    2016-01-01

    Previously we considered the effect of experimental parameters on optimized transmission through opaque media using spatial light modulator (SLM)-based wavefront shaping. In this study we consider the opposite geometry, in which we optimize reflection from an opaque surface such that the backscattered light is focused onto a spot on an imaging detector. By systematically varying different experimental parameters (genetic algorithm iterations, bin size, SLM active area, target area, spot size, and sample angle with respect to the optical axis) and optimizing the reflected light we determine how each parameter affects the intensity enhancement. We find that the effects of the experimental parameters on the enhancement are similar to those measured for a transmissive geometry, but with the exact functional forms changed due to the different geometry and the use of a genetic algorithm instead of an iterative algorithm. Additionally, we find preliminary evidence of greater enhancements than predicted by random mat...

  14. Optimizing reliability, maintainability and testability parameters of equipment based on GSPN

    Institute of Scientific and Technical Information of China (English)

    Yongcheng Xu

    2015-01-01

    Reliability, maintainability and testability (RMT) are important properties of equipment, since they have an important influence on operational availability and life cycle costs (LCC). Therefore, weighting and optimizing the three properties are of great significance. A new approach for optimization of RMT parameters is proposed. First of all, the model for the equipment operation process is established based on the generalized stochastic Petri nets (GSPN) theory. Then, by solving the GSPN model, the quantitative relationship between operational availability and RMT parameters is obtained. Afterwards, taking history data of similar equipment and the operation process into consideration, a cost model of design, manufacture and maintenance is developed. Based on operational availability, the cost model and parameter ranges, an optimization model of RMT parameters is built. Finally, the effectiveness and practicability of this approach are validated through an example.

  15. A shot parameter specification subsystem for automated control of PBFA II accelerator shots

    International Nuclear Information System (INIS)

    The author reports on the shot parameter specification subsystem (SPSS), an integral part of the automatic control system developed for the Particle Beam Fusion Accelerator II (PBFA II). This system has been designed to fully utilize the accelerator by tailoring shot parameters to the needs of the experimenters. The SPSS is the key to this flexibility. Automatic systems will be required on many pulsed power machines for the fastest turnaround, the highest reliability, and most cost effective operation. These systems will require the flexibility and the ease of use that is part of the SPSS. The author discusses how the PBFA II control system has proved to be an effective modular system, flexible enough to meet the demands of both the fast track construction of PBFA II and the control needs of Hermes III. This system is expected to meet the demands of most future machine changes

  16. Components for automated microscopy

    Science.gov (United States)

    Determann, H.; Hartmann, H.; Schade, K. H.; Stankewitz, H. W.

    1980-12-01

    A number of devices, aiming at automated analysis of microscopic objects as regards their morphometrical parameters or their photometrical values, were developed. These comprise: (1) a device for automatic focusing tuned on maximum contrast; (2) a feedback system for automatic optimization of microscope illumination; and (3) microscope lenses with adjustable pupil distances for usage in the two previous devices. An extensive test program on histological and zytological applications proves the wide application possibilities of the autofocusing device.

  17. Automated Method for Estimating Nutation Time Constant Model Parameters for Spacecraft Spinning on Axis

    Science.gov (United States)

    2008-01-01

    Calculating an accurate nutation time constant (NTC), or nutation rate of growth, for a spinning upper stage is important for ensuring mission success. Spacecraft nutation, or wobble, is caused by energy dissipation anywhere in the system. Propellant slosh in the spacecraft fuel tanks is the primary source for this dissipation and, if it is in a state of resonance, the NTC can become short enough to violate mission constraints. The Spinning Slosh Test Rig (SSTR) is a forced-motion spin table where fluid dynamic effects in full-scale fuel tanks can be tested in order to obtain key parameters used to calculate the NTC. We accomplish this by independently varying nutation frequency versus the spin rate and measuring force and torque responses on the tank. This method was used to predict parameters for the Genesis, Contour, and Stereo missions, whose tanks were mounted outboard from the spin axis. These parameters are incorporated into a mathematical model that uses mechanical analogs, such as pendulums and rotors, to simulate the force and torque resonances associated with fluid slosh.

  18. Global parameter optimization of Mather type plasma focus in the framework of the Gratton-Vargas two-dimensional snowplow model

    CERN Document Server

    Auluck, S K H

    2014-01-01

    Dense Plasma Focus (DPF) is known to produce highly energetic ions, electrons and plasma environment which can be used for breeding of short-lived isotopes, plasma nanotechnology and other material processing applications. Commercial utilization of DPF in such areas would need a design tool which can be deployed in an automatic search for the best possible device configuration for a given application. The recently revisited [S K H Auluck, Physics of Plasmas 20, 112501 (2013)] Gratton-Vargas (GV) two-dimensional analytical snowplow model of plasma focus provides a numerical formula for dynamic inductance of a Mather type plasma focus fitted to thousands of automated computations, which enables construction of such design tool. This inductance formula is utilized in the present work to explore global optimization, based on first-principles optimality criteria, in a 4-dimensional parameter-subspace of the zero-resistance GV model. The optimization process is shown to reproduce the empirically observed constancy ...

  19. Optimizing TiO2-based phosphopeptide enrichment for automated multidimensional liquid chromatography coupled to tandem mass spectrometry

    OpenAIRE

    Cantin, Greg T.; Shock, Teresa R.; Park, Sung Kyu; Madhani, Hiten D.; Yates, John R.

    2007-01-01

    An automated online multidimensional liquid chromatography system coupled to ESI-based tandem mass spectrometry was used to assess the effectiveness of TiO2 in the enrichment of phosphopeptides from tryptic digests of protein mixtures. By monitoring the enrichment of phosphopeptides, an optimized set of loading, wash, and elution conditions were realized for TiO2. A comparison of TiO2 with other resins used for phosphopeptide enrichment, Fe(III)-IMAC and ZrO2, was also carried out using trypt...

  20. Analysis of Process Parameters for Optimization of Plastic Extrusion in Pipe Manufacturing

    Directory of Open Access Journals (Sweden)

    Mr. Sandip S. Gadekar

    2015-05-01

    Full Text Available The objective of this paper is to study the defects in plastic pipe and to optimize the plastic pipe manufacturing process. It is essential to understand the process parameters and the defects in the plastic pipe manufacturing process in order to optimize it. The Taguchi technique is used for the optimization in this paper. For the research work, the Shivraj HY-Tech Drip Irrigation pipe manufacturing company was selected. This paper is specifically designed for the optimization of the current process. The experiment was analyzed using commercial Minitab 16 software, interpretations were made, and optimized factor settings were chosen. After prediction of the result, the quality loss is calculated and compared with that before the implementation of DOE. The research work has improved production and quality and optimized the process.
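    The quality-loss comparison mentioned in the record follows Taguchi's quadratic loss function. A minimal sketch for a nominal-the-best characteristic such as pipe wall thickness; the sample values and the loss coefficient k are invented for illustration:

```python
def taguchi_loss_nominal(values, target, k=1.0):
    """Taguchi quadratic quality loss for a nominal-the-best
    characteristic: L = k * mean((y - m)^2)."""
    return k * sum((y - target) ** 2 for y in values) / len(values)

# Hypothetical wall-thickness samples (mm) before and after optimization
before = [2.10, 1.85, 2.20, 1.90]
after = [2.02, 1.98, 2.05, 1.96]
loss_before = taguchi_loss_nominal(before, target=2.0)
loss_after = taguchi_loss_nominal(after, target=2.0)
```

The drop in loss after applying the optimized factor settings is the quantity the paper reports when comparing against the pre-DOE process.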

  1. Chaotic invasive weed optimization algorithm with application to parameter estimation of chaotic systems

    International Nuclear Information System (INIS)

    Highlights: ► A new meta-heuristic optimization algorithm. ► Integration of invasive weed optimization and chaotic search methods. ► A novel parameter identification scheme for chaotic systems. - Abstract: This paper introduces a novel hybrid optimization algorithm that takes advantage of the stochastic properties of chaotic search and the invasive weed optimization (IWO) method. In order to deal with the weaknesses associated with the conventional method, the proposed chaotic invasive weed optimization (CIWO) algorithm incorporates the capabilities of chaotic search methods. The functionality of the proposed optimization algorithm is investigated through several benchmark multi-dimensional functions. Furthermore, an identification technique for chaotic systems based on the CIWO algorithm is outlined and validated on several examples. The results obtained with the proposed scheme demonstrate superior performance with respect to other conventional methods.
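    The "chaotic search" ingredient typically means replacing a uniform random number stream with a deterministic chaotic sequence such as the logistic map. A minimal one-dimensional sketch of that ingredient alone (not the full CIWO hybrid; the center/radius local-search framing and all constants are our assumptions):

```python
def logistic_map_sequence(x0=0.7, n=200, r=4.0):
    """Chaotic sequence from the logistic map x <- r*x*(1-x); at r=4 the
    iterates fill (0, 1) and can stand in for a random stream."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def chaotic_local_search(f, center, radius, n=200, x0=0.7):
    """Perturb a candidate solution with logistic-map offsets and keep
    the best point found -- the chaotic-search step hybridized with
    IWO in the record."""
    best, best_f = center, f(center)
    for c in logistic_map_sequence(x0, n):
        cand = center + radius * (2.0 * c - 1.0)  # map (0,1) -> (-radius, radius)
        fc = f(cand)
        if fc < best_f:
            best, best_f = cand, fc
    return best, best_f

quad = lambda x: (x - 1.0) ** 2  # toy objective with minimum at x = 1
best, val = chaotic_local_search(quad, center=0.5, radius=1.0)
```

The appeal over plain random perturbation is that the chaotic orbit is ergodic and non-repeating, which helps the hybrid escape the local minima that trap conventional IWO.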

  2. Adjoint Parameter Sensitivity Analysis for the Hydrodynamic Lattice Boltzmann Method with Applications to Design Optimization

    DEFF Research Database (Denmark)

    Pingen, Georg; Evgrafov, Anton; Maute, Kurt

    2009-01-01

    We present an adjoint parameter sensitivity analysis formulation and solution strategy for the lattice Boltzmann method (LBM). The focus is on design optimization applications, in particular topology optimization. The lattice Boltzmann method is briefly described with an in-depth discussion of...... generalized geometry optimization formulation and derive the corresponding sensitivity analysis for the single relaxation LBM for both topology and shape optimization applications. Using numerical examples, we verify the accuracy of the analytical sensitivity analysis through a comparison with finite...... differences. In addition, we show that for fluidic topology optimization a scaled volume constraint should be used to obtain the desired "0-1" optimal solutions. (C) 2008 Elsevier Ltd. All rights reserved....

  3. Assessing FPAR Source and Parameter Optimization Scheme in Application of a Diagnostic Carbon Flux Model

    Energy Technology Data Exchange (ETDEWEB)

    Turner, D P; Ritts, W D; Wharton, S; Thomas, C; Monson, R; Black, T A

    2009-02-26

    The combination of satellite remote sensing and carbon cycle models provides an opportunity for regional to global scale monitoring of terrestrial gross primary production, ecosystem respiration, and net ecosystem production. FPAR (the fraction of photosynthetically active radiation absorbed by the plant canopy) is a critical input to diagnostic models; however, little is known about the relative effectiveness of FPAR products from different satellite sensors or about the sensitivity of flux estimates to different parameterization approaches. In this study, we used multiyear observations of carbon flux at four eddy covariance flux tower sites within the conifer biome to evaluate these factors. FPAR products from the MODIS and SeaWiFS sensors, and the effects of single-site vs. cross-site parameter optimization, were tested with the CFLUX model. The SeaWiFS FPAR product showed greater dynamic range across sites and resulted in slightly reduced flux estimation errors relative to the MODIS product when using cross-site optimization. With site-specific parameter optimization, the flux model was effective in capturing seasonal and interannual variation in the carbon fluxes at these sites. The cross-site prediction errors were lower when using parameters from a cross-site optimization compared to parameter sets from optimization at single sites. These results support the practice of multisite optimization within a biome for parameterization of diagnostic carbon flux models.

  4. Optimization of machining parameters and tool selection in 2.5D milling using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Puneet Tandon

    2011-08-01

    Full Text Available Optimization of machining parameters to improve machining efficiency has become important as high-capital-cost NC machines are employed for high-precision, efficient machining. The strategy is to minimize production time and cost by optimizing feed per tooth, speed, width of cut, depth of cut and tool diameter while satisfying all the constraints, such as maximum machine power, maximum cutting force, maximum machining speed, feed rate, tool life and required surface roughness. The optimal end milling cutter diameter and radial depth of cut (step-over) are also key issues for minimizing total production cost. Therefore, in this paper an attempt has been made to include all major parameters, such as feed per tooth, speed, width of cut (step-over) and depth of cut, along with tool diameter, to minimize time and production cost during 2.5D milling. A mathematical model has been developed, and a Genetic Algorithm (GA) has been proposed to solve the problem. Optimal values of machining parameters have been calculated for benchmark problems and compared with handbook recommendations. It has been found that approximately 13% of production cost can be saved by choosing the optimal cutter diameter and width of cut. Beyond this, a 50% reduction in cost per unit volume and a 61% increase in material removal rate over the handbook recommendations have also been reported by selecting optimal cutting parameters.
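The paper's cost model and GA operators are not given in the record, so the sketch below is a generic real-coded GA over two machining variables. The cost function is a hypothetical smooth surrogate standing in for the paper's time/cost model, and the bounds, selection, crossover and mutation choices are all assumptions:

```python
import random

random.seed(1)

def cost(x):
    """Hypothetical production-cost surrogate over (cutting speed m/min,
    feed mm/tooth); NOT the paper's model, just a smooth stand-in."""
    speed, feed = x
    return (speed - 180.0) ** 2 / 100.0 + (feed - 0.12) ** 2 * 1e4

BOUNDS = [(60.0, 300.0), (0.05, 0.3)]

def ga(pop_size=30, gens=60, mut_rate=0.2):
    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]                # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [(u + v) / 2.0 for u, v in zip(a, b)]   # arithmetic crossover
            if random.random() < mut_rate:                  # clamped Gaussian mutation
                i = random.randrange(len(child))
                lo, hi = BOUNDS[i]
                child[i] = min(hi, max(lo, child[i] + random.gauss(0, (hi - lo) * 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
```

A real application would replace `cost` with the constrained time/cost model (machine power, cutting force, tool life, roughness) described in the abstract.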

  5. Improving flood forecasting capability of physically based distributed hydrological model by parameter optimization

    Science.gov (United States)

    Chen, Y.; Li, J.; Xu, H.

    2015-10-01

    Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded as having the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters; unfortunately, the uncertainties associated with this derivation of model parameters are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting the linearly decreasing inertia weight strategy to change the inertia weight, and the arccosine function strategy to adjust the acceleration coefficients.
This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm could be

  6. Optimization of hydrological parameters of a distributed runoff model based on multiple flood events

    Science.gov (United States)

    Miyamoto, Mamoru; Matsumoto, Kazuhiro; Tsuda, Morimasa; Yamakage, Yuzuru; Iwami, Yoichi; Anai, Hirokazu

    2015-04-01

    The error sources of flood forecasting by a runoff model commonly include input data, model structures, and parameter settings. This study focused on a calibration procedure to minimize errors due to parameter settings. Although many studies have been done on hydrological parameter optimization, they are mostly individual optimization cases applying a specific optimization technique to a specific flood. Consequently, it is difficult to determine the most appropriate parameter set to make forecasts of future floods, because optimized parameter sets vary by flood type. Thus, this study aimed to develop a comprehensive method for optimizing hydrological parameters of a distributed runoff model for future flood forecasting. A distributed runoff model, PWRI-DHM, was applied to the Gokase River basin of 1,820 km2 in Japan in this study. The model, with gridded two-layer tanks for the entire target river basin, includes hydrological parameters such as hydraulic conductivity, surface roughness and runoff coefficient, which are set according to land-use and soil-type distributions. Global data sets, e.g., Global Map and DSMW (Digital Soil Map of the World), were employed as input data such as elevation, land use and soil type. Thirteen optimization algorithms such as GA, PSO and DEA were carefully selected from seventy-four open-source algorithms available for public use. These algorithms were used with three error assessment functions to calibrate the parameters of the model to each of fifteen past floods in the predetermined search range. Fifteen optimized parameter sets corresponding to the fifteen past floods were determined by selecting the best sets from the calibration results in terms of reproducible accuracy. This process helped eliminate bias due to the type of optimization algorithm. Although the calibration results of each parameter were widely distributed in the search range, statistical significance was found in comparisons between the optimized parameters
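The record does not name the three error assessment functions used in the calibration, but the Nash–Sutcliffe efficiency (NSE) is a standard choice for comparing simulated and observed flood hydrographs, and a calibration loop would score each candidate parameter set with it. A small sketch with hypothetical discharge data:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches predicting
    the observed mean, and negative values are worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [10.0, 35.0, 80.0, 55.0, 20.0]   # hypothetical flood hydrograph (m^3/s)
sim = [12.0, 30.0, 75.0, 60.0, 18.0]   # hypothetical model output
print(round(nse(obs, sim), 3))          # prints 0.974
```

An optimizer such as GA or PSO would then maximize `nse` (or minimize `1 - nse`) over the hydrological parameter search range for each flood event.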

  7. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases {sup 15}N–{sup 1}H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  8. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    International Nuclear Information System (INIS)

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases 15N–1H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta

  9. Application of an Evolutionary Algorithm for Parameter Optimization in a Gully Erosion Model

    Energy Technology Data Exchange (ETDEWEB)

    Rengers, Francis; Lunacek, Monte; Tucker, Gregory

    2016-06-01

    Herein we demonstrate how to use model optimization to determine a set of best-fit parameters for a landform model simulating gully incision and headcut retreat. To achieve this result we employed the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), an iterative process in which samples are created based on a distribution of parameter values that evolve over time to better fit an objective function. CMA-ES efficiently finds optimal parameters, even with high-dimensional objective functions that are non-convex, multimodal, and non-separable. We ran model instances in parallel on a high-performance cluster, and from hundreds of model runs we obtained the best parameter choices. This method is far superior to brute-force search algorithms, and has great potential for many applications in earth science modeling. We found that parameters representing boundary conditions tended to converge toward an optimal single value, whereas parameters controlling geomorphic processes are defined by a range of optimal values.
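Full CMA-ES adapts a covariance matrix and step size from the ranked samples; the sketch below keeps only the core sample-rank-reweight loop of an evolution strategy, with an isotropic sampling distribution and a crude fixed step-size decay. All settings and the objective are illustrative assumptions, not the gully-model calibration itself:

```python
import random
import statistics

random.seed(2)

def simple_es(f, x0, sigma=1.0, lam=12, iters=80):
    """(mu/mu, lambda) evolution strategy: sample lam candidates around the
    current mean, keep the best half, move the mean to their average.
    CMA-ES additionally adapts a full covariance matrix and the step size
    from the same ranked samples."""
    mu = lam // 2
    mean = list(x0)
    for _ in range(iters):
        pop = [[m + random.gauss(0.0, sigma) for m in mean] for _ in range(lam)]
        pop.sort(key=f)                       # rank by objective value
        elite = pop[:mu]
        mean = [statistics.fmean(col) for col in zip(*elite)]
        sigma *= 0.95                         # fixed decay instead of CMA step-size control
    return mean, f(mean)

# Illustrative 2-D objective with minimum at (2, -1).
best, val = simple_es(lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2, [0.0, 0.0])
```

As in the abstract, each candidate evaluation is independent, so the `pop` loop parallelizes naturally across cluster nodes.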

  10. Automated mapping of linear dunefield morphometric parameters from remotely-sensed data

    Science.gov (United States)

    Telfer, M. W.; Fyfe, R. M.; Lewin, S.

    2015-12-01

    Linear dunes are among the world's most common desert dune types, and typically occur in dunefields arranged in remarkably organized patterns extending over hundreds of kilometers. The causes of the patterns, formed by dunes merging, bifurcating and terminating, are still poorly understood, although it is widely accepted that they are emergent properties of the complex system of interactions between the boundary layer and an often-vegetated erodible substrate. Where such dunefields are vegetated, they are typically used as extensive rangeland, yet it is evident that many currently stabilized dunefields have been reactivated repeatedly during the late Quaternary. It has been suggested that dunefield patterning and the temporal evolution of dunefields are related, and thus there is considerable interest in better understanding the boundary conditions controlling dune patterning, especially given the possibility of reactivation of currently-stabilized dunefields under 21st century climate change. However, the time-consuming process of manual dune mapping has hampered attempts at quantitative description of dunefield patterning. This study aims to develop and test methods for delineating linear dune trendlines automatically from freely-available remotely sensed datasets. The highest resolution free global topographic data presently available (Aster GDEM v2) proved to be of marginal use, as the topographic expression of the dunes is of the same order as the vertical precision of the dataset (∼10 m), but in regions with relatively simple patterning it defined dune trends adequately. Analysis of spectral data (panchromatic Landsat 8 data) proved more promising in five of the six test sites, and despite poor panchromatic signal/noise ratios for the sixth site, the reflectance in the deep blue/violet (Landsat 8 Band 1) offers an alternative method of delineating dune pattern. 
A new edge detection algorithm (LInear Dune Optimized edge detection; LIDO) is proposed, based on

  11. Thermoeconomic optimization of triple pressure heat recovery steam generator operating parameters for combined cycle plants

    Directory of Open Access Journals (Sweden)

    Mohammd Mohammed S.

    2015-01-01

    Full Text Available The aim of this work is to develop a method for the optimization of the operating parameters of a triple pressure heat recovery steam generator. Two types of optimization were performed: (a) thermodynamic and (b) thermoeconomic. The purpose of the thermodynamic optimization is to maximize the efficiency of the plant; the objective selected for this purpose is minimization of the exergy destruction in the heat recovery steam generator (HRSG). The purpose of the thermoeconomic optimization is to decrease the production cost of electricity; here, the total annual cost of the HRSG, defined as the sum of the annual values of the capital costs and the cost of the exergy destruction, is selected as the objective function. The optimal values of the most influential variables are obtained by minimizing the objective function while satisfying a group of constraints. The optimization algorithm is developed and tested on a case of a CCGT plant with a complex configuration. Six operating parameters were the subject of optimization: the pressures and pinch point temperatures of each of the three (high, intermediate and low pressure) steam streams in the HRSG. The influence of these variables on the objective function and production cost is investigated in detail. The differences between the results of the thermodynamic and the thermoeconomic optimization are discussed.

  12. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected and 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which is getting close to, but is still below, the acquisition rate, which is estimated at 50 km/h.

  13. Automated detection of sleep apnea from electrocardiogram signals using nonlinear parameters

    International Nuclear Information System (INIS)

    Sleep apnoea is a very common sleep disorder which can cause symptoms such as daytime sleepiness, irritability and poor concentration. To monitor patients with this sleeping disorder we measured the electrical activity of the heart. The resulting electrocardiography (ECG) signals are both non-stationary and nonlinear. Therefore, we used nonlinear parameters such as approximate entropy, fractal dimension, correlation dimension, largest Lyapunov exponent and Hurst exponent to extract physiological information. This information was used to train an artificial neural network (ANN) classifier to categorize ECG signal segments into one of the following groups: apnoea, hypopnoea and normal breathing. ANN classification tests produced an average classification accuracy of 90%; specificity and sensitivity were 100% and 95%, respectively. We have also proposed unique recurrence plots for the normal, hypopnoea and apnoea classes. Detecting sleep apnoea with this level of accuracy can potentially reduce the need for polysomnography (PSG). This brings advantages to patients, because the proposed system is less cumbersome when compared to PSG
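Of the nonlinear parameters listed, approximate entropy is the most compact to illustrate. Below is the standard Pincus formulation; the embedding dimension `m`, tolerance `r`, and the toy signals are assumed values, not those used in the study:

```python
import math
import random

def approximate_entropy(series, m=2, r=0.2):
    """Approximate entropy (ApEn): a regularity statistic. Lower values
    indicate a more self-similar (more regular) signal; r is the matching
    tolerance, often chosen as a fraction of the signal's std. dev."""
    n = len(series)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        logsum = 0.0
        for t1 in templates:
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            logsum += math.log(matches / len(templates))
        return logsum / len(templates)

    return phi(m) - phi(m + 1)

random.seed(3)
periodic = [float(i % 2) for i in range(60)]   # perfectly regular 0,1,0,1,...
noisy = [random.random() for _ in range(60)]   # irregular surrogate signal
# A regular signal scores markedly lower ApEn than an irregular one.
```

In the study, such features would be computed per ECG segment and fed to the ANN classifier alongside the other nonlinear parameters.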

  14. GENPLAT: an automated platform for biomass enzyme discovery and cocktail optimization.

    Science.gov (United States)

    Walton, Jonathan; Banerjee, Goutami; Car, Suzana

    2011-01-01

    The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex lattice (fractional factorial) mixture models are designed using commercial Design of Experiment statistical software. Enzyme mixtures of high complexity are constructed using robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations. GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such

  15. Improving flood forecasting capability of physically based distributed hydrological models by parameter optimization

    Science.gov (United States)

    Chen, Y.; Li, J.; Xu, H.

    2016-01-01

    Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded as having the potential to improve the catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters. Unfortunately, however, the uncertainties associated with this model derivation are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the particle swarm optimization (PSO) algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved PSO algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of the linearly decreasing inertia weight strategy to change the inertia weight and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show
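The two improvements named in records 5 and 15 (a linearly decreasing inertia weight and arccosine-scheduled acceleration coefficients) can be sketched in a minimal PSO loop. The exact schedules, coefficient ranges, and the test objective below are assumptions; the paper's own formulas may differ:

```python
import math
import random

random.seed(0)

def improved_pso(f, bounds, n=20, iters=100,
                 w_max=0.9, w_min=0.4, c_start=2.5, c_end=0.5):
    """PSO with (1) linearly decreasing inertia weight and (2) an
    arccosine-style schedule for the acceleration coefficients."""
    dim = len(bounds)
    xs = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / (iters - 1)          # linear decrease
        s = math.acos(-1.0 + 2.0 * t / (iters - 1)) / math.pi  # 1 -> 0, nonlinearly
        c1 = c_end + (c_start - c_end) * s                     # cognitive: large -> small
        c2 = c_start + c_end - c1                              # social: small -> large
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                lo, hi = bounds[d]
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            fx = f(xs[i])
            if fx < pval[i]:
                pbest[i], pval[i] = xs[i][:], fx
                if fx < gval:
                    gbest, gval = xs[i][:], fx
    return gbest, gval

# Illustrative run on a 3-D sphere function in place of the Liuxihe model error.
best, val = improved_pso(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

In the flood-forecasting setting, `f` would wrap a full Liuxihe model run and return a hydrograph error measure, which is why each evaluation is expensive and the swarm size matters.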

  16. GEMSFITS: Code package for optimization of geochemical model parameters and inverse modeling

    International Nuclear Information System (INIS)

    Highlights: • Tool for generating consistent parameters against various types of experiments. • Handles a large number of experimental data and parameters (is parallelized). • Has a graphical interface and can perform statistical analysis on the parameters. • Tested on fitting the standard state Gibbs free energies of aqueous Al species. • Example on fitting interaction parameters of mixing models and thermobarometry. - Abstract: GEMSFITS is a new code package for fitting internally consistent input parameters of GEM (Gibbs Energy Minimization) geochemical–thermodynamic models against various types of experimental or geochemical data, and for performing inverse modeling tasks. It consists of the gemsfit2 (parameter optimizer) and gfshell2 (graphical user interface) programs both accessing a NoSQL database, all developed with flexibility, generality, efficiency, and user friendliness in mind. The parameter optimizer gemsfit2 includes the GEMS3K chemical speciation solver (http://gems.web.psi.ch/GEMS3K), which features a comprehensive suite of non-ideal activity- and equation-of-state models of solution phases (aqueous electrolyte, gas and fluid mixtures, solid solutions, (ad)sorption). The gemsfit2 code uses the robust open-source NLopt library for parameter fitting, which provides a selection between several nonlinear optimization algorithms (global, local, gradient-based), and supports large-scale parallelization. The gemsfit2 code can also perform comprehensive statistical analysis of the fitted parameters (basic statistics, sensitivity, Monte Carlo confidence intervals), thus supporting the user with powerful tools for evaluating the quality of the fits and the physical significance of the model parameters. The gfshell2 code provides menu-driven setup of optimization options (data selection, properties to fit and their constraints, measured properties to compare with computed counterparts, and statistics). The practical utility, efficiency, and

  17. Comparative Study for Different Statistical Models to Optimize Cutting Parameters of CNC End Milling Machines

    International Nuclear Information System (INIS)

    In machining operations, the quality of the surface finish is an important requirement for many work pieces. Thus, it is very important to optimize cutting parameters to control the required manufacturing quality. The surface roughness parameter (Ra) of mechanical parts depends on the turning parameters during the turning process. In the development of predictive models, the cutting parameters of feed, cutting speed and depth of cut are considered as model variables. For this purpose, this study focuses on comparing various machining experiments using a CNC vertical machining center; the work piece material was aluminum 6061. Multiple regression models are used to predict the surface roughness in the different experiments.
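The record does not give the regression coefficients or data, but the multiple regression it describes amounts to least squares on Ra against feed, speed and depth of cut. A self-contained sketch via the normal equations, with hypothetical cutting data; the predictor units and values are illustrative assumptions:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_multiple_regression(X, y):
    """Least-squares coefficients via the normal equations (X^T X) b = X^T y,
    with an intercept column prepended; returns [b0, b1, ..., bk]."""
    Xa = [[1.0] + row for row in X]
    p = len(Xa[0])
    XtX = [[sum(r[a] * r[b] for r in Xa) for b in range(p)] for a in range(p)]
    Xty = [sum(Xa[i][a] * y[i] for i in range(len(Xa))) for a in range(p)]
    return solve(XtX, Xty)

# Hypothetical (feed mm/rev, speed m/min, depth mm) -> Ra (um) measurements.
X = [[0.1, 100, 0.5], [0.2, 100, 0.5], [0.1, 200, 0.5],
     [0.1, 100, 1.0], [0.2, 200, 1.0], [0.15, 150, 0.75]]
y = [1.2, 2.1, 0.9, 1.4, 1.9, 1.5]
coef = fit_multiple_regression(X, y)   # [intercept, b_feed, b_speed, b_depth]
```

Comparing such fitted models across experimental setups is essentially what the study's Minitab-style comparison does at larger scale.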

  18. Comparative study for different statistical models to optimize cutting parameters of CNC end milling machines

    International Nuclear Information System (INIS)

    In machining operations, the quality of the surface finish is an important requirement for many work pieces. Thus, it is very important to optimize cutting parameters to control the required manufacturing quality. The surface roughness parameter (Ra) of mechanical parts depends on the turning parameters during the turning process. In the development of predictive models, the cutting parameters of feed, cutting speed and depth of cut are considered as model variables. For this purpose, this study focuses on comparing various machining experiments using a CNC vertical machining center; the work piece material was aluminum 6061. Multiple regression models are used to predict the surface roughness in the different experiments.

  19. Optimization of TRPO Process Parameters for Americium Extraction from High Level Waste

    Institute of Scientific and Technical Information of China (English)

    CHEN Jing; WANG Jianchen; SONG Chongli

    2001-01-01

    The numerical calculations for Am multistage fractional extraction by trialkyl phosphine oxide (TRPO) were verified by a hot test. 1750 L/t-U high level waste (HLW) was used as the feed to the TRPO process. The analysis used the simple objective function to minimize the total waste content in the TRPO process streams. Some process parameters were optimized after other parameters were selected. The optimal process parameters for Am extraction by TRPO are: 10 stages for extraction and 2 stages for scrubbing; a flow rate ratio of 0.931 for extraction and 4.42 for scrubbing; nitric acid concentration of 1.35 mol/L for the feed and 0.5 mol/L for the scrubbing solution. Finally, the nitric acid and Am concentration profiles in the optimal TRPO extraction process are given.

  20. Thermo-mechanical simulation and parameters optimization for beam blank continuous casting

    International Nuclear Information System (INIS)

    The objective of this work is to optimize the process parameters of beam blank continuous casting in order to ensure high quality and productivity. A transient thermo-mechanical finite element model is developed to compute the temperature and stress profiles in beam blank continuous casting. By comparing the calculated data with the metallurgical constraints, the key factors causing defects in the beam blank can be identified. Then, based on the subproblem approximation method, an optimization program is developed to search for the optimum cooling parameters. These optimum parameters make it possible to run the caster at maximum productivity and minimum cost, and to reduce defects. Online verification of this optimization has now been put into practice, proving it very useful for controlling actual production

  1. Optimal Step-wise Parameter Optimization of a FOREX Trading Strategy

    OpenAIRE

    Alberto De Santis; Umberto Dellepiane; Stefano Lucidi; Stefania Renzi

    2014-01-01

    The goal of trading simply consists in gaining profit by buying/selling a security: the difference between the entry and the exit price of a position determines the profit or loss of that trade. A trading strategy is used to identify proper conditions to trade a security. The role of optimization consists in finding the best conditions to open a trade, maximizing the profit. In this general scenario, the strategy is trained on a chosen batch of data (training set) and applied on the next ba...

  2. EXERGOECONOMIC OPTIMIZATION OF GAS TURBINE POWER PLANTS OPERATING PARAMETERS USING GENETIC ALGORITHMS: A CASE STUDY

    OpenAIRE

    Mofid Gorji-Bandpy; Hamed Goodarzian

    2011-01-01

    Exergoeconomic analysis helps designers to find ways to improve the performance of a system in a cost effective way. This can play a vital role in the analysis, design and optimization of thermal systems. Thermoeconomic optimization is a powerful and effective tool in finding the best solutions between the two competing objectives, minimizing economic costs and maximizing exergetic efficiency. In this paper, operating parameters of a gas turbine power plant that produce 140MW of electricity w...

  3. OPTIMIZATION OF CUTTING PARAMETERS ON THE BASIS OF SEMANTIC NETWORK USAGE

    Directory of Open Access Journals (Sweden)

    V. M. Pashkevich

    2011-01-01

    Full Text Available The paper considers problems of accuracy assurance in machine component cutting with edge tools. An approach based on artificial intelligence technologies, in particular functional semantic networks, is proposed. The paper analyzes the possibility of applying functional semantic networks to the optimization of cutting parameters. An intellectual system intended for the solution of applied problems is described, its structure is revealed, and an example of setting the optimal cutting speed is cited.

  4. Parameters identification of unknown delayed genetic regulatory networks by a switching particle swarm optimization algorithm

    OpenAIRE

    Tang, Y.; Wang, Z; J. Fang

    2011-01-01

    This paper presents a novel particle swarm optimization (PSO) algorithm based on Markov chains and a competitive penalized method. The algorithm is developed to solve global optimization problems with applications in identifying unknown parameters of a class of genetic regulatory networks (GRNs). By using an evolutionary factor, a new switching PSO (SPSO) algorithm is first proposed and analyzed, where the velocity updating e...

  5. Taguchi Optimization of Process Parameters on the Hardness and Impact Energy of Aluminium Alloy Sand Castings

    Directory of Open Access Journals (Sweden)

    John O. OJI

    2013-11-01

    Full Text Available An optimization technique for sand casting process parameters based on the Taguchi method is reported in this paper. While keeping other casting parameters constant, aluminium alloy castings were prepared by the sand casting technique using three different parameters, namely the mould temperature, pouring temperature and runner size. Hardness and impact energy tests were performed on the resulting castings. The settings of the parameters were determined by using the Taguchi experimental design method. The level of importance of the parameters on the hardness and impact energy was determined using the analysis of variance (ANOVA). The optimum parameter combination was obtained by using the analysis of the signal-to-noise (S/N) ratio. Analysis of the results shows that a 100°C mould temperature and a 700°C pouring temperature are the optimal values for hardness and impact energy, while 200 mm2 and 285 mm2 runner sizes are the optimal values for hardness and impact energy respectively. The mould temperature was the most influential parameter on the hardness and impact energy of the castings.
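
The S/N analysis used above can be sketched in a few lines. For larger-is-better responses such as hardness and impact energy, the Taguchi signal-to-noise ratio is SN = -10 log10(mean(1/y^2)); the hardness readings below are hypothetical, not the paper's data.

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a larger-is-better response
    (e.g. hardness, impact energy): SN = -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in values) / len(values))

# Hypothetical hardness readings (HB) for two settings of one factor;
# the setting with the higher S/N ratio is the more robust choice.
sn_a = sn_larger_is_better([62.0, 64.0, 63.0])
sn_b = sn_larger_is_better([55.0, 70.0, 58.0])
```

Setting A has both a higher mean and less scatter, so it yields the larger S/N ratio and would be preferred.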

  6. Determination of the Johnson-Cook Constitutive Model Parameters of Materials by Cluster Global Optimization Algorithm

    Science.gov (United States)

    Huang, Zhipeng; Gao, Lihong; Wang, Yangwei; Wang, Fuchi

    2016-06-01

    The Johnson-Cook (J-C) constitutive model is widely used in finite element simulation, as it expresses the relationship between stress and strain in a simple way. In this paper, a cluster global optimization algorithm is proposed to determine the J-C constitutive model parameters of materials. A set of assumed parameters is used to verify the accuracy of the procedure, and the parameters of two materials (401 steel and 823 steel) are determined. Results show that the procedure is reliable and effective: the relative error between the optimized and assumed parameters is no more than 4.02%, and the relative error between the optimized and assumed stress is 0.2 × 10-5 %. The J-C constitutive parameters can be determined more precisely and quickly than with the traditional manual procedure. Furthermore, all the parameters can be determined simultaneously using several curves obtained under different experimental conditions. A strategy is also proposed to accurately determine the constitutive parameters.
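
For reference, the J-C flow stress is a closed-form product of a strain-hardening term, a strain-rate term and a thermal-softening term, which is what makes fitting its five parameters (A, B, n, C, m) a well-posed optimization target. The sketch below uses parameter values in the published range for a 4340-type steel purely as an illustration; they are not the 401/823 steel values determined in the paper.

```python
import math

def johnson_cook_stress(strain, strain_rate, temp,
                        A, B, n, C, m,
                        ref_rate=1.0, t_room=293.0, t_melt=1793.0):
    """Johnson-Cook flow stress:
    sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room) / (T_melt - T_room)."""
    hardening = A + B * strain ** n
    rate_term = 1.0 + C * math.log(strain_rate / ref_rate)
    t_star = (temp - t_room) / (t_melt - t_room)
    softening = 1.0 - t_star ** m
    return hardening * rate_term * softening

# Illustrative parameters for a 4340-type steel (an assumption for this
# sketch, NOT the paper's optimized values). At the reference strain rate
# and room temperature the model reduces to the hardening term alone.
sigma = johnson_cook_stress(strain=0.1, strain_rate=1.0, temp=293.0,
                            A=792.0, B=510.0, n=0.26, C=0.014, m=1.03)
```

A fitting procedure like the paper's would minimize the squared mismatch between this expression and measured stress-strain curves over (A, B, n, C, m).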

  7. Trafficability Analysis at Traffic Crossing and Parameters Optimization Based on Particle Swarm Optimization Method

    OpenAIRE

    Bin He; Qiang Lu

    2014-01-01

    In city traffic, it is important to improve transportation efficiency, and the spacing within a platoon should be shortened when crossing the street. The best way to deal with this problem is automatic control of vehicles. In this paper, a mathematical model is established for the platoon's longitudinal movement, and a systematic analysis of the longitudinal control law is presented for the platoon of vehicles. However, the parameter calibration for the platoon model is relatively difficult because the p...

  8. A multicriteria framework with voxel-dependent parameters for radiotherapy treatment plan optimization

    International Nuclear Information System (INIS)

    Purpose: To establish a new mathematical framework for radiotherapy treatment optimization with voxel-dependent optimization parameters. Methods: In the treatment plan optimization problem for radiotherapy, a clinically acceptable plan is usually generated by an optimization process with weighting factors or reference doses adjusted for a set of objective functions associated with the organs. Recent discoveries indicate that adjusting parameters associated with each voxel may lead to better plan quality, but the mathematical reasons behind this remain unclear. Furthermore, questions about objective function selection and parameter adjustment to ensure Pareto optimality, as well as the relationship between the optimal solutions obtained from the organ-based and voxel-based models, remain unanswered. To answer these questions, the authors establish in this work a new mathematical framework equipped with two theorems. Results: The new framework clarifies the different consequences of adjusting organ-dependent and voxel-dependent parameters for the treatment plan optimization of radiation therapy, as well as the impact of using different objective functions on plan quality and Pareto surfaces. 
The main discoveries are threefold: (1) While in the organ-based model the selection of the objective function has an impact on the quality of the optimized plans, this is no longer an issue for the voxel-based model since the Pareto surface is independent of the objective function selection and the entire Pareto surface could be generated as long as the objective function satisfies certain mathematical conditions; (2) All Pareto solutions generated by the organ-based model with different objective functions are parts of a unique Pareto surface generated by the voxel-based model with any appropriate objective function; (3) A much larger Pareto surface is explored by adjusting voxel-dependent parameters than by adjusting organ-dependent parameters, possibly

  9. A multicriteria framework with voxel-dependent parameters for radiotherapy treatment plan optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zarepisheh, Masoud; Uribe-Sanchez, Andres F.; Li, Nan; Jia, Xun; Jiang, Steve B., E-mail: Steve.Jiang@UTSouthwestern.edu [Center for Advanced Radiotherapy Technologies and Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92037-0843 (United States)

    2014-04-15

    Purpose: To establish a new mathematical framework for radiotherapy treatment optimization with voxel-dependent optimization parameters. Methods: In the treatment plan optimization problem for radiotherapy, a clinically acceptable plan is usually generated by an optimization process with weighting factors or reference doses adjusted for a set of objective functions associated with the organs. Recent discoveries indicate that adjusting parameters associated with each voxel may lead to better plan quality, but the mathematical reasons behind this remain unclear. Furthermore, questions about objective function selection and parameter adjustment to ensure Pareto optimality, as well as the relationship between the optimal solutions obtained from the organ-based and voxel-based models, remain unanswered. To answer these questions, the authors establish in this work a new mathematical framework equipped with two theorems. Results: The new framework clarifies the different consequences of adjusting organ-dependent and voxel-dependent parameters for the treatment plan optimization of radiation therapy, as well as the impact of using different objective functions on plan quality and Pareto surfaces. 
The main discoveries are threefold: (1) While in the organ-based model the selection of the objective function has an impact on the quality of the optimized plans, this is no longer an issue for the voxel-based model since the Pareto surface is independent of the objective function selection and the entire Pareto surface could be generated as long as the objective function satisfies certain mathematical conditions; (2) All Pareto solutions generated by the organ-based model with different objective functions are parts of a unique Pareto surface generated by the voxel-based model with any appropriate objective function; (3) A much larger Pareto surface is explored by adjusting voxel-dependent parameters than by adjusting organ-dependent parameters, possibly

  10. Automated criterion-based analysis for Cole parameters assessment from cerebral neonatal electrical bioimpedance spectroscopy measurements

    International Nuclear Information System (INIS)

    Hypothermia has been proven as an effective rescue therapy for infants with moderate or severe neonatal hypoxic ischemic encephalopathy. Hypoxia-ischemia alters the electrical impedance characteristics of the brain in neonates; therefore, spectroscopic analysis of the cerebral bioimpedance of the neonate may be useful for the detection of candidate neonates eligible for hypothermia treatment. Currently, in addition to the lack of reference bioimpedance data obtained from healthy neonates, there is no standardized approach established for bioimpedance spectroscopy data analysis. In this work, cerebral bioimpedance measurements (12 h postpartum) in a cross-section of 84 term and near-term healthy neonates were performed at the bedside in the post-natal ward. To characterize the impedance spectra, Cole parameters (R0, R∞, fC and α) were extracted from the obtained measurements using an analysis process based on a best measurement and highest likelihood selection process. The results obtained in this study complement previously reported work and provide a standardized criterion-based method for data analysis. The availability of electrical bioimpedance spectroscopy reference data and the automatic criterion-based analysis method might support the development of a non-invasive method for prompt selection of neonates eligible for cerebral hypothermic rescue therapy. (paper)

  11. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational efforts in multiple hill-climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits. PMID:26227212
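
One of the optimization protocols named above, simulated annealing, can be sketched generically: worse moves are accepted with probability exp(-delta/T), which lets the search escape local minima while the temperature decays. The toy objective below is purely illustrative and unrelated to the sensor-calibration task in the abstract.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=500, seed=0):
    """Minimal 1-D simulated annealing: downhill moves are always
    accepted; uphill moves are accepted with probability exp(-delta/T),
    with the temperature T decaying geometrically each iteration."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy multimodal objective (an assumption for this sketch): a parabola
# with a sinusoidal ripple, global minimum near x = 2.
def ripple(x):
    return (x - 2.0) ** 2 + 0.3 * math.sin(8.0 * x)

best_x, best_f = simulated_annealing(ripple, x0=-3.0)
```

Starting far from the optimum, the chain descends the parabola greedily early on, while the ripple's local minima are escaped through the probabilistic acceptance rule.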

  12. Parameter estimation for chaotic system with initial random noises by particle swarm optimization

    International Nuclear Information System (INIS)

    This paper is concerned with the unknown parameters and time-delays of nonlinear chaotic systems with random initial noises. A scheme based on particle swarm optimization (PSO) is introduced to solve the problem via nonnegative multi-modal nonlinear optimization, which finds the combination of parameters and time-delays that minimizes an objective function. Illustrative examples, for chaotic systems with and without time-delays, are given to demonstrate the effectiveness of the present method.
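
A minimal PSO of the kind used for such parameter estimation can be sketched as follows. The toy least-squares identification problem below is an illustration under assumed dynamics, not the chaotic system from the abstract.

```python
import random

def pso(objective, bounds, n_particles=20, iters=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (minimization): each particle
    remembers its personal best, and the shared global best steers the
    velocity updates; positions are clamped to the search bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy identification problem (not the chaotic system from the abstract):
# recover a = 2.0 and b = 0.5 from noiseless samples of a*t + b*t^2.
samples = [(t, 2.0 * t + 0.5 * t ** 2) for t in range(1, 6)]

def sse(params):
    a, b = params
    return sum((a * t + b * t ** 2 - y) ** 2 for t, y in samples)

best, err = pso(sse, bounds=[(0.0, 5.0), (0.0, 5.0)])
```

The same loop applies to the chaotic case by swapping `sse` for an objective that simulates the system and compares trajectories.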

  13. A Minimum Delta V Orbit Maintenance Strategy for Low-Altitude Missions Using Burn Parameter Optimization

    Science.gov (United States)

    Brown, Aaron J.

    2011-01-01

    Orbit maintenance is the series of burns performed during a mission to ensure the orbit satisfies mission constraints. Low-altitude missions often require non-trivial orbit maintenance Delta V due to sizable orbital perturbations and minimum altitude thresholds. A strategy is presented for minimizing this Delta V using impulsive burn parameter optimization. An initial estimate for the burn parameters is generated by considering a feasible solution to the orbit maintenance problem. A low-lunar-orbit example demonstrates the Delta V savings from the feasible solution to the optimal solution. The strategy's extensibility to more complex missions is discussed, as well as the limitations of its use.

  14. Optimization of the parameters of a virtual-cathode oscillator with an inhomogeneous magnetic field

    Science.gov (United States)

    Kurkin, S. A.; Koronovskii, A. A.; Khramov, A. E.; Kuraev, A. A.; Kolosov, S. V.

    2013-10-01

    A two-dimensional numerical model is used to study the generation of powerful microwave radiation in a vircator with an inhomogeneous magnetic field applied to focus a beam. The characteristics of the external inhomogeneous magnetic field are found to strongly affect the vircator generation characteristics. Mathematical optimization is used to search for the optimum parameters of the magnetic periodic focusing system of the oscillator in order to achieve the maximum power of the output microwave radiation. The dependences of the output vircator power on the characteristics of the external inhomogeneous magnetic field are studied near the optimum control parameters. The physical processes that occur in optimized virtual cathode oscillators are investigated.

  15. Identification of Dynamic Parameters Based on Pseudo-Parallel Ant Colony Optimization Algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHAO Feng-yao; MA Zhen-yue; ZHANG Yun-liang

    2007-01-01

    For the parameter identification of dynamic problems, a pseudo-parallel ant colony optimization (PPACO) algorithm based on the graph-based ant system (AS) was introduced. On the platform of ANSYS dynamic analysis, the PPACO algorithm was successfully applied to the identification of dynamic parameters. Using simulated force and displacement data, the elastic modulus E and damping ratio ξ were identified for a designed 3D finite element model, and the detailed identification steps are given. A mathematical example and a simulation example show that the proposed method has higher precision, faster convergence speed and stronger noise resistance compared with the standard genetic algorithm and ant colony optimization (ACO) algorithms.

  16. CH4 parameter estimation in CLM4.5bgc using surrogate global optimization

    Directory of Open Access Journals (Sweden)

    J. Müller

    2015-01-01

    Full Text Available Over the Anthropocene, atmospheric methane has increased dramatically. Wetlands are one of the major sources of methane to the atmosphere, but the role of changes in wetland emissions is not well understood. The Community Land Model (CLM) of the Community Earth System Models contains a module to estimate methane emissions from natural wetlands and rice paddies. Our comparison of CH4 emission observations at 16 sites around the planet reveals, however, that there are large discrepancies between the CLM predictions and the observations. The goal of our study is to adjust the model parameters in order to minimize the root mean squared error (RMSE) between model predictions and observations. These parameters were selected based on a sensitivity analysis. Because of the cost associated with running the CLM simulation (15 to 30 min on the Yellowstone Supercomputing Facility), only relatively few simulations are allowed if a near-optimal solution is to be found within an acceptable time. Our results indicate that the parameter estimation problem has multiple local minima. Hence, we use a computationally efficient global optimization algorithm that uses a radial basis function (RBF) surrogate model to approximate the objective function. We use the information from the RBF to select parameter values that are most promising with respect to improving the objective function value. We show with pseudo data that our optimization algorithm is able to make excellent progress in decreasing the RMSE. Using the true CH4 emission observations for optimizing the parameters, we are able to significantly reduce the overall RMSE between observations and model predictions by about 50%. 
The CLM predictions with the optimized parameters agree for northern and tropical latitudes more with the observed data than when using the default parameters and the emission predictions are higher than with default settings in northern latitudes and lower than default settings in the
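
The surrogate idea described above — interpolate the expensive objective with radial basis functions, then query the cheap interpolant to choose the next expensive evaluation — can be sketched in one dimension. This is a generic Gaussian-RBF sketch under assumed sample points, not the CLM4.5bgc setup.

```python
import math

def fit_rbf(xs, fs, eps=1.0):
    """Interpolate sampled objective values with Gaussian radial basis
    functions, solving the dense kernel system by Gaussian elimination
    with partial pivoting (fine for the small sample counts surrogate
    methods operate with). Returns the interpolant as a callable."""
    n = len(xs)
    A = [[math.exp(-((xs[i] - xs[j]) / eps) ** 2) for j in range(n)]
         for i in range(n)]
    w = list(fs)
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        w[col], w[piv] = w[piv], w[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            w[r] -= m * w[col]
    for r in range(n - 1, -1, -1):            # back substitution
        w[r] = (w[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return lambda x: sum(wi * math.exp(-((x - xi) / eps) ** 2)
                         for wi, xi in zip(w, xs))

# Hypothetical expensive objective, sampled at only three points.
def expensive_f(x):
    return (x - 0.6) ** 2

xs = [0.0, 0.5, 1.0]
fs = [expensive_f(x) for x in xs]
surrogate = fit_rbf(xs, fs)
# The next (expensive) evaluation goes where the cheap surrogate is lowest.
grid = [i / 100.0 for i in range(101)]
x_next = min(grid, key=surrogate)
```

Evaluating the surrogate on a dense grid costs almost nothing compared with one run of the real simulation, which is the whole point of the approach.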

  17. OPTIMIZATION OF MACHINING PARAMETERS IN TURNING PROCESS USING GENETIC ALGORITHM AND PARTICLE SWARM OPTIMIZATION WITH EXPERIMENTAL VERIFICATION

    Directory of Open Access Journals (Sweden)

    K.RAMESH KUMAR

    2011-02-01

    Full Text Available Optimization of cutting parameters is one of the most important elements in any process planning of metal parts, and the economy of the machining operation plays a key role in market competitiveness. CNC machines produce finished components from cylindrical bar stock. Finished profiles consist of straight turning, facing, taper and circular machining, and a finished profile is produced from a cylindrical bar in two stages: rough machining and finish machining. A number of passes are required for rough machining and a single pass for the finishing cut. The machining parameters in multipass turning are depth of cut, cutting speed and feed, and the machining performance is measured by the minimum production time. In this paper, the optimal machining parameters for continuous profile machining are determined with respect to the minimum production time, subject to a set of practical constraints: cutting force, power, dimensional accuracy and surface finish. Due to the complexity of this machining optimization problem, a genetic algorithm (GA) and particle swarm optimization (PSO) are applied to solve it, and the results obtained from GA and PSO are compared.

  18. Design Optimization of RFI Parameters by Manufacturing T-shaped Composite Panel

    Institute of Scientific and Technical Information of China (English)

    ZHANG Guo-li; HUANG Gu

    2005-01-01

    The aim of this project is to develop a novel approach for optimizing the design of resin film infusion (RFI) processing parameters by manufacturing a T-shaped composite panel. Dimensional accuracy was selected as the objective function. By investigating the rheological properties of the resin film, the compaction behavior of the fiber preform and the characteristics of the RFI process, an optimal mathematical model was established. The numerical results obtained from the RFICOMP program package show good consistency with the experimental results, and this optimization procedure can be applied to other composite manufacturing processes.

  19. Adaptive hybrid optimization strategy for calibration and parameter estimation of physical models

    CERN Document Server

    Vesselinov, Velimir V

    2011-01-01

    A new adaptive hybrid optimization strategy, entitled squads, is proposed for complex inverse analysis of computationally intensive physical models. The new strategy is designed to be computationally efficient and robust in identifying the global optimum (e.g. the maximum or minimum value of an objective function). It integrates a global Adaptive Particle Swarm Optimization (APSO) strategy with a local Levenberg-Marquardt (LM) optimization strategy using adaptive rules based on runtime performance. The global strategy optimizes the location of a set of solutions (particles) in the parameter space. The LM strategy is applied only to a subset of the particles at different stages of the optimization, based on the adaptive rules. After the LM adjustment of the subset of particle positions, the updated particles are returned to the APSO strategy. The advantages of coupling APSO and LM in the manner implemented in squads are demonstrated by comparisons of squads performance against Levenberg-Marquardt (LM), Particl...

  20. Cutting Parameters Optimization for Surface Roughness in Turning Operation of Polyethylene (PE Using Taguchi Method

    Directory of Open Access Journals (Sweden)

    D. Lazarevic

    2012-06-01

    Full Text Available In any machining process, it is most important to determine the optimal settings of machining parameters, aiming at reduction of production costs and achievement of the desired product quality. This paper discusses the use of the Taguchi method for minimizing surface roughness in turning polyethylene. The influence of four cutting parameters, cutting speed, feed rate, depth of cut, and tool nose radius, on average surface roughness (Ra) was analyzed on the basis of the standard L27 Taguchi orthogonal array. The experimental results were collected and analyzed with the help of the commercial software package MINITAB. Based on the analysis of means (ANOM) and analysis of variance (ANOVA), the optimal cutting parameter settings were determined, as well as the level of importance of each cutting parameter.
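
The ANOM step referred to above is simply the mean response at each factor level; the level with the best (here: lowest Ra) mean is taken as the optimal setting. The run data below are hypothetical, not the paper's L27 results.

```python
def analysis_of_means(levels, response):
    """Taguchi ANOM for one factor: mean response at each factor level."""
    sums, counts = {}, {}
    for lv, y in zip(levels, response):
        sums[lv] = sums.get(lv, 0.0) + y
        counts[lv] = counts.get(lv, 0) + 1
    return {lv: sums[lv] / counts[lv] for lv in sums}

# Hypothetical turning runs: cutting-speed level of each run and the
# measured average surface roughness Ra (um); lower Ra is better.
speed_level = [1, 1, 2, 2, 3, 3]
ra = [2.1, 1.9, 1.4, 1.6, 1.0, 1.2]
level_means = analysis_of_means(speed_level, ra)
best_level = min(level_means, key=level_means.get)
```

Repeating this per factor, and ranking factors by the spread of their level means, reproduces the ANOM half of the paper's analysis; ANOVA then attaches statistical significance to those spreads.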

  1. Process optimization and biocompatibility of cell carriers suitable for automated magnetic manipulation.

    OpenAIRE

    Krejci, I; Piana, C.; Howitz, S.; Wegener, T; Fiedler, S.; ZWANZIG, M.; Schmitt, D.; Daum, N; Meier, K.; Lehr, C. M.; Batista, U; Zemljic, S; Messerschmidt, J.; Franzke, J; M. Wirth

    2012-01-01

    There is increasing demand for automated cell reprogramming in the fields of cell biology, biotechnology and the biomedical sciences. Microfluidic-based platforms that provide unattended manipulation of adherent cells promise to be an appropriate basis for cell manipulation. In this study we developed a magnetically driven cell carrier to serve as a vehicle within an in vitro environment. To elucidate the impact of the carrier on cells, biocompatibility was estimated using the human adenocarc...

  2. Automated optimal glycaemic control using a physiology based pharmacokinetic, pharmacodynamic model

    OpenAIRE

    Schaller, Stephan

    2015-01-01

    After decades of research, Automated Glucose Control (AGC) is still out of reach for everyday control of blood glucose. The inter- and intra-individual variability of glucose dynamics largely arising from variability in insulin absorption, distribution, and action, and related physiological lag-times remain a core problem in the development of suitable control algorithms. Over the years, model predictive control (MPC) has established itself as the gold standard in AGC systems in research. Mod...

  3. Optimal solutions for protection, control and automation in hydroelectric power systems

    International Nuclear Information System (INIS)

    Fault statistics and a poll of the electricity network companies show that malfunctions of protection, control and automation equipment contribute substantially to undelivered energy. Yet there is little focus on performing fault analyses and registering such faults in FASIT (a Norwegian system for registration of faults and interruptions). This is especially true of the 1 - 22 kV distribution network, which is where the potential for reducing the amount of undelivered energy by introducing various automatic means is greatest.

  4. Strategic Optimization and Investigation Effect Of Process Parameters On Performance Of Wire Electric Discharge Machine (WEDM)

    OpenAIRE

    ATUL KUMAR; DR.D.K.SINGH

    2012-01-01

    Wire electrical discharge machining (WEDM) is widely used in machining of conductive materials when precision is of primary significance. Wire-cut electric discharge machining of SKD 61 alloy has been considered in the present work. Experimentation has been completed using Taguchi's L18 (2^1 x 3^7) orthogonal array under different parameter conditions. Optimal combinations of parameters were obtained by this technique. The study shows that with the minimum number of experiments the complete...

  5. Further Characterizations of Design Optimality and Admissibility for Partial Parameter Estimation in Linear Regression

    OpenAIRE

    Gaffke, Norbert

    1987-01-01

    The paper gives a contribution to the problem of finding optimal linear regression designs, when only $s$ out of $k$ regression parameters are to be estimated. Also, a treatment of design admissibility for the parameters of interest is included. Previous results of Kiefer and Wolfowitz (1959), Karlin and Studden (1966) and Atwood (1969) are generalized. In particular, a connection to Tchebycheff-type approximation of $\\mathbb{R}^s$-valued functions is found, which has been known in case $s = ...

  6. Parameter extraction of photovoltaic module for long-term prediction using Artificial Bee Colony optimization

    OpenAIRE

    Garoudja, Elyes; Kara, Kamel; Chouder, Aissa; Silvestre Bergés, Santiago

    2015-01-01

    In this paper, a heuristic optimization approach based on Artificial Bee Colony (ABC) algorithm is applied to the extraction of the five electrical parameters of a photovoltaic (PV) module. The proposed approach has several interesting features such as no prior knowledge of the physical system and its convergence is not dependent on the initial conditions. The extracted parameters have been tested against several static IV characteristics of different PV modules from diff...

  7. Optimization of Torque Sensor Input Parameters and Determination of Sensor Errors and Uncertainties

    OpenAIRE

    2006-01-01

    This paper introduces basic knowledge about a magnetoelastic torque sensor designed for non-contact measurements. The paper brings results of the institutional project “The development and realization of torque sensor with appropriate equipment.” The optimization of the torque sensor working conditions and sensor parameters is presented. The metrological parameters determined by testing showed the suitability of the sensor for use in outdoor applications. Another aim of the paper is the evaluation of measureme...

  8. Optimization of torque sensor input parameters and determination of sensor errors and uncertainties

    OpenAIRE

    Jozef Vojtko

    2006-01-01

    This paper introduces basic knowledge about a magnetoelastic torque sensor designed for non-contact measurements. The paper brings results of the institutional project “The development and realization of torque sensor with appropriate equipment.” The optimization of the torque sensor working conditions and sensor parameters is presented. The metrological parameters determined by testing showed the suitability of the sensor for use in outdoor applications. Another aim of the paper is the evaluation of measureme...

  9. OPTIMALITY CRITERIA FOR MEASUREMENT POSES SELECTION IN CALIBRATION OF ROBOT STIFFNESS PARAMETERS

    OpenAIRE

    Wu, Yier; Klimchik, Alexandr; Pashkevich, Anatol; Caro, Stéphane; Furet, Benoît

    2012-01-01

    The paper focuses on the accuracy improvement of industrial robots by means of elasto-static parameter calibration. It proposes a new optimality criterion for measurement pose selection in the calibration of robot stiffness parameters. This criterion is based on the concept of the manipulator test pose, which is defined by the user via the joint angles and the external force. The proposed approach essentially differs from the traditional ones and ensures the best complia...

  10. Optimizing advanced propeller designs by simultaneously updating flow variables and design parameters

    Science.gov (United States)

    Rizk, Magdi H.

    1988-01-01

    A scheme is developed for solving constrained optimization problems in which the objective function and the constraint function are dependent on the solution of the nonlinear flow equations. The scheme updates the design parameter iterative solutions and the flow variable iterative solutions simultaneously. It is applied to an advanced propeller design problem with the Euler equations used as the flow governing equations. The scheme's accuracy, efficiency and sensitivity to the computational parameters are tested.

  11. Longitudinal parameter identification of a small unmanned aerial vehicle based on modified particle swarm optimization

    OpenAIRE

    Jiang Tieying; Li Jie; Huang Kewei

    2015-01-01

    This paper describes a longitudinal parameter identification procedure for a small unmanned aerial vehicle (UAV) through modified particle swarm optimization (PSO). The procedure is demonstrated using a small UAV equipped with only a micro-electro-mechanical systems (MEMS) inertial measuring element and a global positioning system (GPS) receiver to provide test information. A small-UAV longitudinal parameter mathematical model is derived, and the modified method is proposed based on PSO with s...

  12. Aerodynamic optimization by simultaneously updating flow variables and design parameters with application to advanced propeller designs

    Science.gov (United States)

    Rizk, Magdi H.

    1988-01-01

    A scheme is developed for solving constrained optimization problems in which the objective function and the constraint function are dependent on the solution of the nonlinear flow equations. The scheme updates the design parameter iterative solutions and the flow variable iterative solutions simultaneously. It is applied to an advanced propeller design problem with the Euler equations used as the flow governing equations. The scheme's accuracy, efficiency and sensitivity to the computational parameters are tested.

  13. An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation

    OpenAIRE

    Jun Wang; Bihua Zhou; Shudao Zhou

    2016-01-01

    This paper proposes an improved cuckoo search (ICS) algorithm to establish the parameters of chaotic systems. In order to improve the optimization capability of the basic cuckoo search (CS) algorithm, the orthogonal design and simulated annealing operation are incorporated in the CS algorithm to enhance the exploitation search ability. Then the proposed algorithm is used to establish parameters of the Lorenz chaotic system and Chen chaotic system under the noiseless and noise condition, resp...

  14. Optimization of regularization parameter of inversion in particle sizing using light extinction method

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In particle sizing by light extinction method, the regularization parameter plays an important role in applying regularization to find the solution to ill-posed inverse problems. We combine the generalized cross-validation (GCV) and L-curve criteria with the Twomey-NNLS algorithm in parameter optimization. Numerical simulation and experimental validation show that the resistance of the newly developed algorithms to measurement errors can be improved leading to stable inversion results for unimodal particle size distribution.

  15. Quantifying dynamic sensitivity of optimization algorithm parameters to improve hydrological model calibration

    Science.gov (United States)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-02-01

It is widely recognized that optimization algorithm parameters have significant impacts on algorithm performance, but quantifying the influence is very complex and difficult due to high computational demands and the dynamic nature of search parameters. The overall aim of this paper is to develop a global sensitivity analysis based framework to dynamically quantify the individual and interactive influence of algorithm parameters on algorithm performance. A variance decomposition sensitivity analysis method, Analysis of Variance (ANOVA), is used for sensitivity quantification, because it is capable of handling small samples and is more computationally efficient compared with other approaches. The Shuffled Complex Evolution algorithm developed at the University of Arizona (SCE-UA) is selected as the optimization algorithm for investigation, and two criteria, i.e., convergence speed and success rate, are used to measure the performance of SCE-UA. Results show the proposed framework can effectively reveal the dynamic sensitivity of algorithm parameters in the search processes, including individual influences of parameters and their interactive impacts. Interactions between algorithm parameters have significant impacts on SCE-UA performance, which has not been reported in previous research. The proposed framework provides a means to understand the dynamics of algorithm parameter influence, and highlights the significance of considering interactive parameter influence to improve algorithm performance in the search processes.
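
The variance-decomposition idea behind ANOVA-based sensitivity can be illustrated on a hypothetical full factorial grid of two algorithm parameters; the performance values below are invented for the example, and the interaction share is taken as the variance left over after the two main effects.

```python
import numpy as np

# hypothetical performance samples on a full factorial grid of two algorithm
# parameters: rows = levels of parameter p1, columns = levels of parameter p2
perf = np.array([[0.9, 0.8, 0.7],
                 [0.6, 0.5, 0.4],
                 [0.3, 0.2, 0.1]])

grand = perf.mean()
# main effects: variance of the per-level means around the grand mean
v_p1 = ((perf.mean(axis=1) - grand) ** 2).mean()
v_p2 = ((perf.mean(axis=0) - grand) ** 2).mean()
v_tot = ((perf - grand) ** 2).mean()
v_int = v_tot - v_p1 - v_p2          # remainder attributed to interaction

# sensitivity indices: each source's share of the total variance
s1, s2, s12 = v_p1 / v_tot, v_p2 / v_tot, v_int / v_tot
```

For this purely additive table the interaction share comes out zero; real algorithm-performance surfaces generally leave a nonzero interaction remainder, which is exactly what the paper reports as significant for SCE-UA.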

  16. Optimization of the dressing parameters in cylindrical grinding based on a generalized utility function

    Science.gov (United States)

    Aleksandrova, Irina

    2016-01-01

    The existing studies, concerning the dressing process, focus on the major influence of the dressing conditions on the grinding response variables. However, the choice of the dressing conditions is often made, based on the experience of the qualified staff or using data from reference books. The optimal dressing parameters, which are only valid for the particular methods and dressing and grinding conditions, are also used. The paper presents a methodology for optimization of the dressing parameters in cylindrical grinding. The generalized utility function has been chosen as an optimization parameter. It is a complex indicator determining the economic, dynamic and manufacturing characteristics of the grinding process. The developed methodology is implemented for the dressing of aluminium oxide grinding wheels by using experimental diamond roller dressers with different grit sizes made of medium- and high-strength synthetic diamonds type ??32 and ??80. To solve the optimization problem, a model of the generalized utility function is created which reflects the complex impact of dressing parameters. The model is built based on the results from the conducted complex study and modeling of the grinding wheel lifetime, cutting ability, production rate and cutting forces during grinding. They are closely related to the dressing conditions (dressing speed ratio, radial in-feed of the diamond roller dresser and dress-out time), the diamond roller dresser grit size/grinding wheel grit size ratio, the type of synthetic diamonds and the direction of dressing. Some dressing parameters are determined for which the generalized utility function has a maximum and which guarantee an optimum combination of the following: the lifetime and cutting ability of the abrasive wheels, the tangential cutting force magnitude and the production rate of the grinding process. The results obtained prove the possibility of control and optimization of grinding by selecting particular dressing

  17. Automated design and optimization of flexible booster autopilots via linear programming. Volume 2: User's manual

    Science.gov (United States)

    Hauser, F. D.; Szollosi, G. D.; Lakin, W. S.

    1972-01-01

    COEBRA, the Computerized Optimization of Elastic Booster Autopilots, is an autopilot design program. The bulk of the design criteria is presented in the form of minimum allowed gain/phase stability margins. COEBRA has two optimization phases: (1) a phase to maximize stability margins; and (2) a phase to optimize structural bending moment load relief capability in the presence of minimum requirements on gain/phase stability margins.

  18. Research of Optimization Method of Swabbing Parameters of All Rods Pumping Wells in the Entire Oilfield

    Directory of Open Access Journals (Sweden)

    Zhang Xishun

    2013-03-01

Full Text Available Aiming at the drawbacks of existing optimization and design methods and at the practical production goal of least energy consumption, a new theory is proposed: the gas released from the layer in the lifting process contributes energy in two parts, dissolved-gas expansion energy and free-gas expansion energy. The motor’s input power of the rod pumping system is divided into hydraulic horsepower, gas expansion power, surface mechanical loss power and subsurface loss power. Using the theory of energy conservation, a simulation model of free-gas expansion power has been established, the simulation models of the motor’s input power based on the energy method have been improved and the simulation precision of system efficiency has been enhanced. Entire optimization design models have been set up in which the single-well output is taken as the design variable, the planned production of all oil wells in an overall oilfield as the constraint condition and the least input power of the overall oilfield as the objective. Synthesizing the optimization design results of the single well and the entire oilfield, the optimal output and the optimal swabbing parameters of all wells can be obtained. Actual optimizing examples show that the total power consumption designed by the entire optimization method is 12.95% less than that designed by the single-well optimization method.

  19. Optimization of injection molding parameters for poly(styrene-isobutylene-styrene) block copolymer

    Science.gov (United States)

    Fittipaldi, Mauro; Garcia, Carla; Rodriguez, Luis A.; Grace, Landon R.

    2016-03-01

    Poly(styrene-isobutylene-styrene) (SIBS) is a widely used thermoplastic elastomer in bioimplantable devices due to its inherent stability in vivo. However, the properties of the material are highly dependent on the fabrication conditions, molecular weight, and styrene content. An optimization method for injection molding is herein proposed which can be applied to varying SIBS formulations in order to maximize ultimate tensile strength, which is critical to certain load-bearing implantable applications. The number of injection molded samples required to ascertain the optimum conditions for maximum ultimate tensile strength is limited in order to minimize experimental time and effort. Injection molding parameters including nozzle temperature (three levels: 218, 246, and 274 °C), mold temperature (three levels: 50, 85, and 120 °C), injection speed (three levels: slow, medium and fast) and holding pressure time (three levels: 2, 6, and 10 seconds) were varied to fabricate dumbbell specimens for tensile testing. A three-level L9 Taguchi method utilizing orthogonal arrays was used in order to rank the importance of the different injection molding parameters and to find an optimal parameter setting to maximize the ultimate tensile strength of the thermoplastic elastomer. Based on the Taguchi design results, a Response Surface Methodology (RSM) was applied in order to build a model to predict the tensile strength of the material at different injection parameters. Finally, the model was optimized to find the injection molding parameters providing maximum ultimate tensile strength. Subsequently, the theoretically-optimum injection molding parameters were used to fabricate additional dumbbell specimens. The experimentally-determined ultimate tensile strength of these samples was found to be in close agreement (1.2%) with the theoretical results, successfully demonstrating the suitability of the Taguchi Method and RSM for optimizing injection molding parameters of SIBS.
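
A larger-is-better signal-to-noise ratio of the kind used in such Taguchi analyses (here, for maximizing ultimate tensile strength) can be sketched as follows; the strength values are hypothetical, not the paper's data.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio (dB) for a larger-is-better response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# hypothetical ultimate tensile strengths (MPa) from repeated runs
# at two different injection molding parameter settings
sn_a = sn_larger_is_better([18.0, 19.0, 20.0])
sn_b = sn_larger_is_better([24.0, 25.0, 26.0])

# in a Taguchi analysis the level with the higher mean S/N ratio is preferred
best = "B" if sn_b > sn_a else "A"
```

In the full L9 procedure this S/N ratio is averaged per factor level to rank parameter importance, and the highest-S/N level of each factor is combined into the predicted optimum, which is then validated experimentally as in the abstract.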

  20. A CLINICO-HEMATOLOGICAL STUDY IN CASES OF PANCYTOPENIA: CORRELATION OF AUTOMATED CELL COUNTER PARAMETERS IN VARIOUS ETIOLOGIES

    Directory of Open Access Journals (Sweden)

    Soma

    2013-05-01

    Full Text Available ORIGINAL ARTICLE Journal of Evolution of Medical and Dental Sciences / Volume 2 / Issue 22 / June 3, 2013, Page 4013. A CLINICO-HEMATOLOGICAL STUDY IN CASES OF PANCYTOPENIA: CORRELATION OF AUTOMATED CELL COUNTER PARAMETERS IN VARIOUS ETIOLOGIES. Soma Yadav 1, Rashmi Kushwaha 2, Kamal Aggrawal 3, A.K. Tripathi 4, U.S. Singh 5, Ashutosh Kumar 6. 1. Junior Resident, Department of Pathology, King George’s Medical University 2. Assistant Professor, Department of Pathology, King George’s Medical University 3. Professor, Department of Pathology, King George’s Medical University 4. Professor and Head, Department of Clinical Hematology, King George’s Medical University 5. Professor, Department of Pathology, King George’s Medical University 6. Professor and Officer in Charge, Lymphoma-Leukemia Lab, Department of Pathology, King George’s Medical University. CORRESPONDING AUTHOR: Dr. Rashmi Kushwaha, King George’s Medical University, Lucknow. E-mail: docrashmi27@yahoo.co.in

  1. Fast reactor parameter optimization taking into account changes in fuel charge type during reactor operation time

    International Nuclear Information System (INIS)

    The formulation and solution of optimization problem for parameters determining the layout of the central part of sodium cooled power reactor taking into account possible changes in fuel charge type during reactor operation time are performed. The losses under change of fuel composition type for two reactor modifications providing for minimum doubling time for oxide and carbide fuels respectively, are estimated

  2. Optimization of Polishing Parameters with Taguchi Method for LBO Crystal in CMP

    Institute of Scientific and Technical Information of China (English)

    Jun Li; Yongwei Zhu; Dunwen Zuo; Yong Zhu; Chuangtian Chen

    2009-01-01

Chemical mechanical polishing (CMP) was used to polish lithium triborate (LiB3O5 or LBO) crystal. The Taguchi method was applied for optimization of the polishing parameters. Material removal rate (MRR) and surface roughness are considered as criteria for the optimization. The polishing pressure, the abrasive concentration and the table velocity are important parameters which influence MRR and surface roughness in CMP of LBO crystal. Experiment results indicate that for MRR the polishing pressure is the most significant polishing parameter followed by table velocity, while for surface roughness the abrasive concentration is the most important one. For high MRR in CMP of LBO crystal the optimal conditions are: pressure 620 g/cm², concentration 5.0 wt pct, and velocity 60 r/min, respectively. For the best surface roughness the optimal conditions are: pressure 416 g/cm², concentration 5.0 wt pct, and velocity 40 r/min, respectively. The contributions of individual parameters for MRR and surface roughness were obtained.

  3. TO THE QUESTION OF SOLVING OF THE PROBLEM OF OPTIMIZING PARAMETERS OF TRAFFIC FLOW COORDINATED CONTROL

    OpenAIRE

    L. Abramova; Chernobaev, N.

    2007-01-01

    A short review of main methods of traffic flow control is represented, great attention is paid to methods of coordinated control and quality characteristics of traffic control. The problem of parameter optimization of traffic coordinated control on the basis of vehicle delay minimizing at highway intersections has been defined.

  4. Optimization of Temperature Schedule Parameters on Heat Supply in Power-and-Heat Supply Systems

    OpenAIRE

    V. A. Sednin; A. V. Sednin; M. L. Bogdanovich

    2014-01-01

The paper considers problems concerning optimization of a temperature schedule in district heating systems with steam-turbine thermal power stations having average initial steam parameters. It has been shown in the paper that maintaining an optimum network water temperature makes it possible to increase the energy efficiency of heat supply due to additional systematic fuel savings.

  5. Is transverse feedback necessary for the SSC emittance preservation? (Vibration noise analysis and feedback parameters optimization)

    International Nuclear Information System (INIS)

    The paper considers the Superconducting Super Collider (SSC) site ground motion measurements as well as data from accelerators worldwide about noises that worsen beam performance. Unacceptably fast emittance growth due to these noises is predicted for the SSC. A transverse feedback system was found to be the only satisfactory alternative to prevent emittance decay. Optimization of the primary feedback parameters was done

  6. Optimization of Temperature Schedule Parameters on Heat Supply in Power-and-Heat Supply Systems

    Directory of Open Access Journals (Sweden)

    V. A. Sednin

    2009-01-01

Full Text Available The paper considers problems concerning optimization of a temperature schedule in district heating systems with steam-turbine thermal power stations having average initial steam parameters. It has been shown in the paper that maintaining an optimum network water temperature makes it possible to increase the energy efficiency of heat supply due to additional systematic fuel savings.

  7. High-resolution MRI of the labyrinth. Optimization of scan parameters with 3D-FSE

    International Nuclear Information System (INIS)

    The aim of our study was to optimize the parameters of high-resolution MRI of the labyrinth with a 3D fast spin-echo (3D-FSE) sequence. We investigated repetition time (TR), echo time (TE), Matrix, field of view (FOV), and coil selection in terms of CNR (contrast-to-noise ratio) and SNR (signal-to-noise ratio) by comparing axial images and/or three-dimensional images. The optimal 3D-FSE sequence parameters were as follows: 1.5 Tesla MR unit (Signa LX, GE Medical Systems), 3D-FSE sequence, dual 3-inch surface coil, acquisition time=12.08 min, TR=5000 msec, TE=300 msec, 3 number of excitations (NEX), FOV=12 cm, matrix=256 x 256, slice thickness=0.5 mm/0.0 sp, echo train=64, bandwidth=±31.5 kHz. High-resolution MRI of the labyrinth using the optimized 3D-FSE sequence parameters permits visualization of important anatomic details (such as scala tympani and scala vestibuli), making it possible to determine inner ear anomalies and the patency of cochlear turns. To obtain excellent heavily T2-weighted axial and three-dimensional images in the labyrinth, high CNR, SNR, and spatial resolution are significant factors at the present time. Furthermore, it is important not only to optimize the scan parameters of 3D-FSE but also to select an appropriate coil for high-resolution MRI of the labyrinth. (author)

  8. Cellular Neural Networks: A genetic algorithm for parameters optimization in artificial vision applications

    Energy Technology Data Exchange (ETDEWEB)

    Taraglio, S. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione]; Zanela, A. [Rome Univ. 'La Sapienza' (Italy). Dipt. di Fisica]

    1997-03-01

An optimization method for some of the CNN's (Cellular Neural Network) parameters, based on evolutionary strategies, is proposed. The new class of feedback templates found is more effective in extracting features from the images that an autonomous vehicle acquires than those reported in the previous CNN literature.

  9. Cellular Neural Networks: A genetic algorithm for parameters optimization in artificial vision applications

    International Nuclear Information System (INIS)

An optimization method for some of the CNN's (Cellular Neural Network) parameters, based on evolutionary strategies, is proposed. The new class of feedback templates found is more effective in extracting features from the images that an autonomous vehicle acquires than those reported in the previous CNN literature

  10. Optimization of the Process Parameters for Controlling Residual Stress and Distortion in Friction Stir Welding

    DEFF Research Database (Denmark)

    Tutum, Cem Celal; Schmidt, Henrik Nikolaj Blicher; Hattel, Jesper Henri

    2008-01-01

    In the present paper, numerical optimization of the process parameters, i.e. tool rotation speed and traverse speed, aiming minimization of the two conflicting objectives, i.e. the residual stresses and welding time, subjected to process-specific thermal constraints in friction stir welding, is i...

  11. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Fitzek, Frank

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant...

  12. On Optimization Control Parameters in an Adaptive Error-Control Scheme in Satellite Networks

    Directory of Open Access Journals (Sweden)

    Ranko Vojinović

    2011-09-01

Full Text Available This paper presents a method for optimization of the control parameters of an adaptive GBN scheme in an error-prone satellite channel. The method is based on a three-state channel model in which the channel has a variable noise level.

  13. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    NARCIS (Netherlands)

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; Etten, van Wim C.; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models t

  14. Optimization of EDM Process Parameters on Titanium Super Alloys Based on the Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    J. Laxman, Dr. K. Guru Raj

    2014-05-01

Full Text Available Electrical discharge machining (EDM) is an unconventional machining process for machining complex shapes and hard materials that are difficult to machine by conventional processes. This paper deals with the optimization of EDM process parameters using grey relational analysis (GRA) based on an orthogonal array for a multi-response process. The experiments are conducted on titanium super alloys with a copper electrode based on the Taguchi L27 orthogonal array design of experiments, choosing parameters such as peak current, pulse on time, pulse off time and tool lift time, to obtain multiple process responses, namely metal removal rate (MRR) and tool wear rate (TWR). The combination of the Taguchi method with GRA makes it possible to determine the optimal parameters for a multi-response process. Grey relational analysis is used to obtain a performance index called the grey relational grade to optimize the EDM process with higher MRR and lower TWR. It is clearly found that the performance of the EDM process is greatly improved by optimizing the responses. The influence of individual machining parameters is also investigated using analysis of variance on the grey relational grade.
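
The grey relational grade computation used in such analyses can be sketched as follows, assuming the common distinguishing coefficient ζ = 0.5 and equal response weights; the MRR and TWR values are invented for illustration.

```python
import numpy as np

# hypothetical responses per experiment: MRR (larger-is-better), TWR (smaller-is-better)
mrr = np.array([10.0, 14.0, 12.0, 9.0])
twr = np.array([0.8, 0.6, 0.9, 0.5])

def normalize(x, larger_is_better):
    """Map a response onto [0, 1] so that 1 is always the ideal value."""
    span = x.max() - x.min()
    return (x - x.min()) / span if larger_is_better else (x.max() - x) / span

def grey_coeff(z, zeta=0.5):
    """Grey relational coefficient of a normalized sequence vs. the ideal (all ones)."""
    delta = 1.0 - z                       # deviation from the ideal sequence
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# grey relational grade: equally weighted average of the per-response coefficients
grade = (grey_coeff(normalize(mrr, True)) + grey_coeff(normalize(twr, False))) / 2
best_run = int(grade.argmax())            # experiment closest to the multi-response ideal
```

Ranking the experiments by this single grade is what lets a multi-response problem (high MRR, low TWR) be handled with ordinary Taguchi level averaging and ANOVA.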

  15. Optimization of Cutting Parameters for Face Milling Titanium Alloy Using MQL

    Institute of Scientific and Technical Information of China (English)

    AHMED Hassan; YAO Zhen-qiang

    2005-01-01

When using MQL as a cooling technique, many parameters have to be adjusted. The Taguchi method was used in this study to investigate the cutting characteristics of face milling of titanium alloys using PVD-coated inserts. To find the optimal volume removed and surface roughness, an orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed. The optimum cutting parameters were obtained. Throughout this study, it was found that the feed rate is the most influential cutting parameter in the face milling of titanium alloys.

  16. Optimization of process parameters in explosive cladding of titanium/stainless steel 304L plates

    International Nuclear Information System (INIS)

    Explosive cladding is a solid state welding process best suited for joining incompatible metals. The selection of process parameters viz., explosive mass ratio, stand off distance and initial angle of inclination dictate the nature of the cladding. Optimization of process parameters in explosive cladding of titanium-stainless steel 304L plates, based on two level three factorial design, is attempted to establish the influencing parameters. Analysis of variance was employed to find the linear, regression and interaction values. Mathematical models to estimate the responses-amplitude and wavelength were developed. The microstructure of the Ti-SS304L explosive clad interface reveals characteristic undulations concurrent with design expectations. (orig.)

  17. Taguchi Orthogonal Array Based Parameter Optimization of Biodiesel Production from Fish Oil using Ultrasonic Energy

    Directory of Open Access Journals (Sweden)

    P. Arul Franco

    2014-02-01

Full Text Available The aim of this study is to investigate the application of an ultrasonic energy-assisted biodiesel production process from fish oil catalyzed by KOH at different conditions. The optimization of reaction parameters, such as molar ratio of methanol to oil, catalyst concentration and reaction time, on the transesterification for the production of fish oil methyl ester has been studied. The Taguchi method with an L9 orthogonal array was implemented to optimize the reaction parameters. The optimal experimental conditions obtained from this study were a molar ratio of 9:1, a catalyst concentration of 1.5% by weight and a reaction time of 30 min. According to the Taguchi method, the molar ratio played the most important role in the yield of fish oil methyl ester. Validation of the model was done by conducting laboratory experiments. The ultrasonic energy-assisted biodiesel production process was proved to be an energy-efficient and economically feasible process to produce biodiesel.

  18. Optimization of FIR Digital Filters Using a Real Parameter Parallel Genetic Algorithm and Implementations.

    Science.gov (United States)

    Xu, Dexiang

    This dissertation presents a novel method of designing finite word length Finite Impulse Response (FIR) digital filters using a Real Parameter Parallel Genetic Algorithm (RPPGA). This algorithm is derived from basic Genetic Algorithms which are inspired by natural genetics principles. Both experimental results and theoretical studies in this work reveal that the RPPGA is a suitable method for determining the optimal or near optimal discrete coefficients of finite word length FIR digital filters. Performance of RPPGA is evaluated by comparing specifications of filters designed by other methods with filters designed by RPPGA. The parallel and spatial structures of the algorithm result in faster and more robust optimization than basic genetic algorithms. A filter designed by RPPGA is implemented in hardware to attenuate high frequency noise in a data acquisition system for collecting seismic signals. These studies may lead to more applications of the Real Parameter Parallel Genetic Algorithms in Electrical Engineering.
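
A minimal sketch of the underlying idea, not the RPPGA itself: a plain (non-parallel) genetic algorithm searching integer, finite word length FIR coefficients against an ideal floating-point prototype's magnitude response. The filter length, word length, window choice and GA settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# target: 15-tap windowed-sinc lowpass prototype, to be quantized to 8-bit coefficients
ideal = np.sinc(np.arange(-7, 8) * 0.25) * np.hamming(15)
ideal /= ideal.sum()                              # unity DC gain
SCALE = 2 ** 7                                    # assumed fixed-point scale (Q0.7)

def fitness(q):
    """Negative worst-case magnitude-response deviation from the ideal prototype."""
    w = np.linspace(0.0, np.pi, 128)
    H = lambda h: np.abs(np.exp(-1j * np.outer(w, np.arange(15))) @ h)
    return -np.max(np.abs(H(q / SCALE) - H(ideal)))

# initial population: rounded coefficients perturbed by a few LSBs
pop = np.round(ideal * SCALE) + rng.integers(-2, 3, (40, 15))
for _ in range(60):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(f)[-20:]]                          # truncation selection
    cut = rng.integers(1, 14, 20)
    kids = np.array([np.r_[parents[i][:c], parents[(i + 1) % 20][c:]]
                     for i, c in enumerate(cut)])               # one-point crossover
    mut = rng.random(kids.shape) < 0.05                         # mutation: +/- 1 LSB
    kids = kids + mut * rng.choice([-1, 1], kids.shape)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]            # integer coefficient set
```

The parallel, spatially structured variant in the dissertation distributes such subpopulations and exchanges individuals between them, which is what buys the reported speed and robustness over a basic GA.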

  19. Optimization of the blade trailing edge geometric parameters for a small scale ORC turbine

    International Nuclear Information System (INIS)

In general, the method proposed by Whitfield and Baines is adopted for the turbine preliminary design. In this design procedure for the turbine blade trailing edge geometry, two assumptions (ideal gas and zero discharge swirl) and two experience values (WR and γ) are used to get the three blade trailing edge geometric parameters: the relative exit flow angle β6, the exit tip radius R6t and hub radius R6h, for the purpose of maximizing the rotor total-to-static isentropic efficiency. The method above is established based on the experience and results of testing using air as the working fluid, so it neither provides a mathematical optimal solution to guide the optimization of the geometric parameters nor considers the real gas effects of the organic working fluid, which must be taken into consideration for the ORC turbine design procedure. In this paper, a new preliminary design and optimization method is established for the purpose of reducing the exit kinetic energy loss to improve the turbine efficiency ηts, and the blade trailing edge geometric parameters for a small scale ORC turbine with working fluid R123 are optimized based on this method. The mathematical optimal solution to minimize the exit kinetic energy is deduced, which can be used to design and optimize the exit shroud/hub radius and exit blade angle. Then, the influence of the blade trailing edge geometric parameters on turbine efficiency ηts is analysed and the optimal working ranges of these parameters for the equations are recommended in consideration of working fluid R123. This method is used to reduce the exit kinetic energy loss of an existing ORC turbine from 11.7% to 7%, which indicates the effectiveness of the method. However, the internal passage loss increases from 7.9% to 9.4%, so the only way to consider the influence of geometric parameters on internal passage loss is to give the empirical ranges of these parameters, such as the recommended ranges that the value of γ is at 0.3 to 0.4, and the value

  20. Protein standardization III: Method optimization basic principles for quantitative determination of human serum proteins on automated instruments based on turbidimetry or nephelometry.

    Science.gov (United States)

    Blirup-Jensen, S

    2001-11-01

Quantitative protein determinations in routine laboratories are today most often carried out using automated instruments. However, slight variations in the assay principle, in the programming of the instrument or in the reagents may lead to different results. This makes method optimization and standardization a prerequisite. The basic principles of turbidimetry and nephelometry are discussed. The different reading principles are illustrated and investigated. Various problems are identified and a suggestion is made for an integrated, fast and convenient test system for the determination of a number of different proteins on the same instrument. An optimized test system for turbidimetry and nephelometry should comprise high-quality antibodies, calibrators, controls, and buffers and a protocol with detailed parameter settings in order to program the instrument correctly. A good user program takes full advantage of the optimal reading principles for the different instruments. This implies--for all suitable instruments--sample preincubation followed by real sample blanking, which automatically corrects for initial turbidity in the sample. Likewise it is recommended to measure the reagent blank, which represents any turbidity caused by the antibody itself. By correcting all signals with these two blank values the best possible signal is obtained for the specific analyte. An optimized test system should preferably offer a wide measuring range combined with a wide security range, which for the user means few re-runs and maximum security against antigen excess. A non-linear calibration curve based on six standards is obtained using a suitable mathematical fitting model, which normally is part of the instrument software. PMID:11831625
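
The double blank correction described above amounts to subtracting both blank readings from the raw reaction signal; the absorbance values below are invented for illustration.

```python
# hypothetical absorbance readings from a turbidimetric protein assay
raw_signal    = 0.842   # reading after the antigen-antibody reaction
sample_blank  = 0.035   # initial turbidity of the sample (pre-incubation read)
reagent_blank = 0.012   # turbidity contributed by the antibody reagent itself

# corrected analyte signal: both blanks removed from the raw reading
corrected = raw_signal - sample_blank - reagent_blank
```

The corrected value, not the raw reading, is what is then mapped through the non-linear six-standard calibration curve mentioned in the abstract.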

  1. Improving flash flood forecasting with distributed hydrological model by parameter optimization

    Science.gov (United States)

    Chen, Yangbo

    2016-04-01

In China, flash flood usually refers to floods occurring in small and medium sized watersheds with a drainage area of less than 200 km2, mainly induced by heavy rain in areas where hydrological observations are lacking. Flash floods are widely observed in China and are the floods causing the most casualties nowadays in China. Due to hydrological data scarcity, lumped hydrological models are difficult to employ for flash flood forecasting, which requires lots of observed hydrological data to calibrate model parameters. A physically based distributed hydrological model discretizes the terrain of the whole watershed into a number of grid cells at fine resolution, assimilates different terrain data and precipitation to different cells, and derives model parameters from the terrain properties, thus having the potential to be used in flash flood forecasting and improving flash flood prediction capability. In this study, the Liuxihe Model, a physically based distributed hydrological model mainly proposed for watershed flood forecasting, is employed to simulate flash floods in the Ganzhou area in southeast China, and models have been set up in 5 watersheds. Model parameters have been derived from the terrain properties including the DEM, the soil type and land use type, but the result shows that the flood simulation uncertainty is high, which may be caused by parameter uncertainty, and some kind of uncertainty control is needed before the model can be used in real-time flash flood forecasting. Considering that currently many Chinese small and medium sized watersheds have set up hydrological observation networks, and a few flood events could be collected, these may be used for model parameter optimization. For this reason, an automatic model parameter optimization algorithm using Particle Swarm Optimization (PSO) is developed to optimize the model parameters, and it has been found that model parameters optimized with even only one observed flood event could largely reduce the flood

  2. An Automated Fixed-Point Optimization Tool in MATLAB XSG/SynDSP Environment

    OpenAIRE

    Wang, Cheng C.; Changchun Shi; Robert W. Brodersen; Dejan Marković

    2011-01-01

    This paper presents an automated tool for floating-point to fixed-point conversion. The tool is based on previous work that was built in MATLAB/Simulink environment and Xilinx System Generator support. The tool is now extended to include Synplify DSP blocksets in a seamless way from the users' view point. In addition to FPGA area estimation, the tool now also includes ASIC area estimation for end-users who choose the ASIC flow. The tool minimizes hardware cost subject to mean-squared quantiza...

  3. Strategic Optimization and Investigation of the Effect of Process Parameters on the Performance of a Wire Electric Discharge Machine (WEDM)

    Directory of Open Access Journals (Sweden)

    ATUL KUMAR

    2012-06-01

    Full Text Available Wire electrical discharge machining (WEDM) is widely used in the machining of conductive materials when precision is of primary significance. Wire-cut electric discharge machining of SKD 61 alloy has been considered in the present work. Experimentation was carried out using Taguchi's L18 (2¹×3⁷) orthogonal array under different parameter conditions, and optimal combinations of parameters were obtained by this technique. The study shows that the complete problem can be solved with a minimum number of experiments compared to a full factorial design. Experimental results demonstrate that the machining model is appropriate and that Taguchi's method satisfies the practical conditions. The results obtained are analyzed for the selection of an optimal combination of WEDM parameters for proper machining of SKD 61 alloy to achieve better surface finish. Different analyses were performed on the data obtained from the experiments.

  4. Parameter Optimization for Nadaraya-Watson Kernel Regression Method with Small Samples

    Directory of Open Access Journals (Sweden)

    Li Fengping

    2016-10-01

    Full Text Available Many current regression algorithms have unsatisfactory prediction accuracy with small samples. To solve this problem, a regression algorithm based on Nadaraya-Watson kernel regression (NWKR) is proposed. The proposed method advocates selecting the parameter directly from the standard deviation of the training data, optimized with leave-one-out cross-validation (LOO-CV). Good generalization performance of the proposed parameter selection is demonstrated empirically using small-sample regression problems with Gaussian noise. The results show that the proposed parameter optimization method is more robust and accurate than other methods for different noise levels and different sample sizes, and indicate the importance of Vapnik's ε-insensitive loss for regression problems with small samples.
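    The core of the approach, selecting a bandwidth scaled from the training data's standard deviation via LOO-CV, can be sketched as follows. This is an illustrative reconstruction under assumed details (a Gaussian kernel, a synthetic noisy sine target, and a hypothetical grid of scale factors), not the paper's exact procedure.

```python
import numpy as np

def nw_predict(x_train, y_train, x_query, h):
    """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def loo_cv_error(x, y, h):
    """Mean squared leave-one-out prediction error for bandwidth h."""
    err = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        pred = nw_predict(x[mask], y[mask], x[i:i + 1], h)[0]
        err += (pred - y[i]) ** 2
    return err / len(x)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 2 * np.pi, 25))           # small sample
y = np.sin(x) + rng.normal(0, 0.1, size=x.size)      # noisy target

# Candidate bandwidths scaled by the training data's standard deviation
candidates = np.std(x) * np.array([0.05, 0.1, 0.2, 0.4, 0.8])
best_h = min(candidates, key=lambda h: loo_cv_error(x, y, h))
```

    With only 25 points, the full LOO loop is cheap; for larger samples the hat-matrix shortcut for kernel smoothers would avoid refitting per left-out point.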

  5. Impact of Parameter Variations and Optimization on DG-PNIN Tunnel FET

    Directory of Open Access Journals (Sweden)

    Priya Jhalani

    2014-04-01

    Full Text Available The downscaling of conventional MOSFETs has reached its fundamental limits. TFETs are very attractive devices for low power applications because of their low off-current and potential for a smaller subthreshold slope. In this paper, the impact of various parameter variations on the performance of a DG-PNIN tunnel field effect transistor is investigated. Variations in gate oxide material, source doping, channel doping, drain doping, pocket doping and body thickness are studied, and all these parameters are optimized as performance boosters to give better current characteristics. After optimization with all these performance boosters, the device shows improved performance with increased on-current, reduced threshold voltage, and an Ion/Ioff ratio > 10⁶.

  6. Process parameters optimization for friction stir welding of RDE-40 aluminium alloy using Taguchi technique

    Institute of Scientific and Technical Information of China (English)

    A.K.LAKSHMINARAYANAN; V.BALASUBRAMANIAN

    2008-01-01

    The Taguchi approach was applied to determine the most influential control factors yielding better tensile strength of friction stir welded RDE-40 aluminium alloy joints. To evaluate the effect of process parameters such as tool rotational speed, traverse speed and axial force on the tensile strength of friction stir welded RDE-40 aluminium alloy, the Taguchi parametric design and optimization approach was used, and the optimum levels of the process parameters were determined. The results indicate that rotational speed, welding speed and axial force are the significant parameters in deciding the tensile strength of the joint. The predicted optimal value of tensile strength of friction stir welded RDE-40 aluminium alloy is 303 MPa. The results were confirmed by further experiments.
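    The Taguchi analysis behind records like this one reduces, per factor, to computing a signal-to-noise ratio for each run and averaging it per factor level; for tensile strength the larger-the-better form S/N = -10·log₁₀(mean(1/y²)) applies. The sketch below uses a hypothetical four-run fragment with invented strength values, not the paper's actual L-array data.

```python
import numpy as np

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio in dB."""
    values = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / values ** 2))

# Hypothetical tensile strengths (MPa) from a small illustrative fragment:
# each run records factor levels and repeated strength measurements
runs = [
    {"speed": 1, "feed": 1, "strength": [270, 275]},
    {"speed": 1, "feed": 2, "strength": [280, 284]},
    {"speed": 2, "feed": 1, "strength": [295, 301]},
    {"speed": 2, "feed": 2, "strength": [288, 290]},
]

def mean_sn(factor, level):
    """Average S/N over all runs at a given level of a factor."""
    sns = [sn_larger_is_better(r["strength"]) for r in runs if r[factor] == level]
    return sum(sns) / len(sns)

# The level with the highest mean S/N is the Taguchi-optimal setting
best_speed = max([1, 2], key=lambda lv: mean_sn("speed", lv))
```

    Repeating the per-level averaging for every factor gives the optimum level combination; the range of mean S/N across levels ranks the factors by influence.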

  7. An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation.

    Science.gov (United States)

    Wang, Jun; Zhou, Bihua; Zhou, Shudao

    2016-01-01

    This paper proposes an improved cuckoo search (ICS) algorithm to establish the parameters of chaotic systems. In order to improve the optimization capability of the basic cuckoo search (CS) algorithm, the orthogonal design and simulated annealing operation are incorporated in the CS algorithm to enhance the exploitation search ability. Then the proposed algorithm is used to establish parameters of the Lorenz chaotic system and Chen chaotic system under the noiseless and noise condition, respectively. The numerical results demonstrate that the algorithm can estimate parameters with high accuracy and reliability. Finally, the results are compared with the CS algorithm, genetic algorithm, and particle swarm optimization algorithm, and the compared results demonstrate the method is energy-efficient and superior. PMID:26880874

  8. An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2016-01-01

    Full Text Available This paper proposes an improved cuckoo search (ICS) algorithm to establish the parameters of chaotic systems. In order to improve the optimization capability of the basic cuckoo search (CS) algorithm, the orthogonal design and simulated annealing operation are incorporated in the CS algorithm to enhance the exploitation search ability. Then the proposed algorithm is used to establish parameters of the Lorenz chaotic system and Chen chaotic system under noiseless and noise conditions, respectively. The numerical results demonstrate that the algorithm can estimate parameters with high accuracy and reliability. Finally, the results are compared with the CS algorithm, genetic algorithm, and particle swarm optimization algorithm, and the compared results demonstrate the method is energy-efficient and superior.
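    The baseline CS algorithm that this record improves upon can be sketched as below. This is the plain Lévy-flight cuckoo search, without the paper's orthogonal-design and simulated-annealing enhancements, and a quadratic misfit with the known Lorenz parameters (σ=10, ρ=28, β=8/3) stands in for the actual trajectory-fitting objective; all control settings are assumed.

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(2)

def levy_step(dim, beta=1.5):
    """Lévy-distributed step via Mantegna's algorithm."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, lo, hi, n_nests=20, n_iter=400, pa=0.25, alpha=0.05):
    dim = len(lo)
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(n_iter):
        best = nests[np.argmin(fit)]
        # New solutions by Lévy flights scaled toward the current best nest
        for i in range(n_nests):
            cand = np.clip(nests[i] + alpha * levy_step(dim) * (nests[i] - best), lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                nests[i], fit[i] = cand, fc
        # Abandon a fraction pa of the worst nests (random restarts)
        n_bad = int(pa * n_nests)
        worst = np.argsort(fit)[-n_bad:]
        nests[worst] = rng.uniform(lo, hi, (n_bad, dim))
        fit[worst] = [f(x) for x in nests[worst]]
    i = np.argmin(fit)
    return nests[i], fit[i]

true = np.array([10.0, 28.0, 8.0 / 3.0])   # Lorenz (sigma, rho, beta)
best, err = cuckoo_search(lambda p: np.sum((p - true) ** 2),
                          lo=np.zeros(3), hi=np.array([50.0, 50.0, 10.0]))
```

    In the parameter-estimation setting, the objective would integrate the candidate chaotic system and measure the discrepancy against the observed trajectory rather than compare parameters directly.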

  9. Estimating stellar effective temperatures and detected angular parameters using stochastic particle swarm optimization

    CERN Document Server

    Zhang, Chuan-Xin; Zhang, Hao-Wei; Shuai, Yong; Tan, He-Ping

    2016-01-01

    Considering features of stellar spectral radiation and survey explorers, we established a computational model for stellar effective temperatures, detected angular parameters, and gray rates. Using known stellar flux data in some band, we estimated stellar effective temperatures and detected angular parameters using stochastic particle swarm optimization (SPSO). We first verified the reliability of SPSO, and then determined reasonable parameters that produced highly accurate estimates under certain gray deviation levels. Finally, we calculated 177,860 stellar effective temperatures and detected angular parameters using the Midcourse Space Experiment (MSX) catalog data. These derived stellar effective temperatures were accurate when we compared them to known values from the literature. This research made full use of catalog data and presented an original technique for studying stellar characteristics. It proposed a novel method for calculating stellar effective temperatures and detected angular parameters, and pro...

  10. Computer-Assisted Optimization of Electrodeposited Hydroxyapatite Coating Parameters on Medical Alloys

    Science.gov (United States)

    Coşkun, M. İbrahim; Karahan, İsmail H.; Yücel, Yasin; Golden, Teresa D.

    2016-04-01

    CoCrMo bio-metallic alloys were coated with a hydroxyapatite (HA) film by electrodeposition using various electrochemical parameters. Response surface methodology and central composite design were used to optimize deposition parameters such as electrolyte pH, deposition potential, and deposition time. The effects of the coating parameters were evaluated within the limits of solution pH (3.66 to 5.34), deposition potential (-1.13 to -1.97 V), and deposition time (6.36 to 73.64 minutes). A 5-level-3-factor experimental plan was used to determine ideal deposition parameters. Optimum conditions for the deposition parameters of the HA coating with high in vitro corrosion performance were determined as electrolyte pH of 5.00, deposition potential of -1.8 V, and deposition time of 20 minutes.
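    The response surface methodology with a central composite design used here amounts to fitting a second-order polynomial to the measured responses over the coded factor ranges and solving for the stationary point of the fitted quadratic. The sketch below illustrates this on synthetic two-factor data with a known optimum; the data, noise level, and 5×5 grid are assumptions for illustration, not the paper's three-factor design.

```python
import numpy as np

# Hypothetical coded data: response vs two coded factors (x1 ~ pH, x2 ~ potential),
# generated from a known quadratic with its optimum at x1 = 0.5, x2 = -0.3
rng = np.random.default_rng(3)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 10 - 2 * (x1 - 0.5) ** 2 - 3 * (x2 + 0.3) ** 2 + rng.normal(0, 0.05, x1.size)

# Second-order response surface:
#   y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: solve grad(y) = 0 for the fitted quadratic
H = np.array([[2 * b[3], b[5]],
              [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, -b[1:3])
```

    Checking the sign of the Hessian's eigenvalues distinguishes a maximum (all negative, as here) from a saddle; the coded optimum is then mapped back to physical units of pH, potential, and time.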

  11. Optimization of Soil Hydraulic Model Parameters Using Synthetic Aperture Radar Data: An Integrated Multidisciplinary Approach

    DEFF Research Database (Denmark)

    Pauwels, Valentijn; Balenzano, Anna; Satalino, Giuseppe;

    2009-01-01

    been focused on the retrieval of land and biogeophysical parameters (e.g., soil moisture contents). One relatively unexplored issue consists of the optimization of soil hydraulic model parameters, such as, for example, hydraulic conductivity values, through remote sensing. This is due to the fact...... through a combination of remote sensing and land surface modeling. Spatially distributed and multitemporal SAR-based soil moisture maps are the basis of the study. The surface soil moisture values are used in a parameter estimation procedure based on the Extended Kalman Filter equations. In fact, the...... that no direct relationships between the remote-sensing observations, more specifically radar backscatter values, and the parameter values can be derived. However, land surface models can provide these relationships. The objective of this paper is to retrieve a number of soil physical model parameters...

  12. Estimating stellar effective temperatures and detected angular parameters using stochastic particle swarm optimization

    Science.gov (United States)

    Zhang, Chuan-Xin; Yuan, Yuan; Zhang, Hao-Wei; Shuai, Yong; Tan, He-Ping

    2016-09-01

    Considering features of stellar spectral radiation and sky surveys, we established a computational model for stellar effective temperatures, detected angular parameters and gray rates. Using known stellar flux data in some bands, we estimated stellar effective temperatures and detected angular parameters using stochastic particle swarm optimization (SPSO). We first verified the reliability of SPSO, and then determined reasonable parameters that produced highly accurate estimates under certain gray deviation levels. Finally, we calculated 177 860 stellar effective temperatures and detected angular parameters using data from the Midcourse Space Experiment (MSX) catalog. These derived stellar effective temperatures were accurate when we compared them to known values from the literature. This research makes full use of catalog data and presents an original technique for studying stellar characteristics. It proposes a novel method for calculating stellar effective temperatures and detecting angular parameters, and provides theoretical and practical data for finding information about radiation in any band.

  13. Connecting parameters optimization on unsymmetrical twin-tower structure linked by sky-bridge

    Institute of Scientific and Technical Information of China (English)

    孙黄胜; 刘默涵; 朱宏平

    2014-01-01

    Based on a simplified 3-DOF model of a twin-tower structure linked by a sky-bridge, the frequency response functions, the displacement power spectral density (PSD) functions, and the time-averaged total vibration energy were derived, assuming white noise as the earthquake excitation. The effects of connecting parameters, such as the linking stiffness ratio and linking damping ratio, on the structural vibration responses were then studied, and the optimal connecting parameters were obtained to minimize the vibration energy of either the independent monomer tower or the integral structure. The influence of the sky-bridge elevation position on the optimal connecting parameters was also discussed. Finally, the distribution characteristics of the top displacement PSD and the structural responses, excited by the El Centro, Taft and artificial waves, were compared in both the frequency and time domains. It is found that the connecting parameters at either end of the connection interactively affect the responses of the towers. The optimal connecting parameters can greatly improve the seismic reduction effectiveness of the damping connections, but are unable to reduce the seismic responses of both towers to the greatest extent simultaneously. It is also indicated that the optimal connecting parameters derived from the simplified 3-DOF model are applicable to two multi-story structures linked by a sky-bridge with dampers. The seismic reduction effectiveness obtained varies from 0.3 to 1.0 with different sky-bridge mass ratios. The displacement responses of the example structures are reduced by approximately 22% with sky-bridge connections.

  14. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, and under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and aquifer recharge are considered as uncertain parameters. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells for hydraulic control of...

  15. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Vasanthan Maruthapillai

    Full Text Available In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance from each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.

  16. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Science.gov (United States)

    Maruthapillai, Vasanthan; Murugappan, Murugappan

    2016-01-01

    In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance between each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network. PMID:26859884

  17. Optimization of Nd:YAG laser welding parameters for sealing small titanium tube ends

    International Nuclear Information System (INIS)

    The purpose of the present study is to optimize Nd:YAG laser welding parameters to seal an iodine-125 radioisotope seed into a titanium capsule. If the end of a small titanium tube is irradiated with a Nd:YAG laser beam and melted to an adequate length, it coalesces and seals. Accurate control of the melted length of the tube end was most important for obtaining a soundly sealed state. The effects of the laser welding parameters on the melted length were analyzed and optimized by the Taguchi and regression analysis methods. Among the welding parameters, the laser pulse width and focal position had the greatest effects on the S/N ratio of the melted length. Optimal welding conditions were obtained at a pulse width of 0.86 ms and a focal position of 3.18-3.35 mm within the scope of the experiments. Confirmation experiments conducted at the optimal welding conditions showed that both of the titanium tube ends were sealed soundly.

  18. A Parallel Genetic Algorithm Based Feature Selection and Parameter Optimization for Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Zhi Chen

    2016-01-01

    Full Text Available The extensive applications of support vector machines (SVMs) require an efficient method of constructing an SVM classifier with high classification ability. The performance of an SVM crucially depends on whether the optimal feature subset and SVM parameters can be efficiently obtained. In this paper, a coarse-grained parallel genetic algorithm (CGPGA) is used to simultaneously optimize the feature subset and parameters for the SVM. The distributed topology and migration policy of the CGPGA help find the optimal feature subset and parameters for the SVM in significantly shorter time, thereby increasing the quality of the solution found. In addition, a new fitness function, which combines the classification accuracy obtained from the bootstrap method, the number of chosen features, and the number of support vectors, is proposed to lead the search of the CGPGA toward the optimal generalization error. Experimental results on 12 benchmark datasets show that our proposed approach outperforms the genetic algorithm (GA) based method and the grid search method in terms of classification accuracy, number of chosen features, number of support vectors, and running time.
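    The feature-selection half of this scheme can be sketched as a GA over bitmasks whose fitness combines accuracy with a per-feature penalty, as in the paper's combined fitness. The sketch below is a serial (not coarse-grained parallel) GA, and it substitutes a nearest-centroid classifier on synthetic data for the SVM plus bootstrap accuracy; the data, penalty weight, and GA settings are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: 10 features, only the first 3 carry class signal
n, d = 200, 10
y = rng.integers(0, 2, n)
X = rng.normal(0, 1, (n, d))
X[:, :3] += 2.0 * y[:, None]

def fitness(mask):
    """Accuracy of a nearest-centroid classifier on the selected features,
    minus a small penalty per selected feature."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = (np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean() - 0.01 * mask.sum()

pop = rng.random((30, d)) < 0.5            # initial population of bitmasks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]     # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(0, 10, 2)]
        cut = rng.integers(1, d)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        children.append(child ^ (rng.random(d) < 0.05))  # low mutation rate
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
```

    In the CGPGA of the paper, the population would be split into demes evolving in parallel with periodic migration of elites, and each chromosome would additionally encode the SVM's C and kernel parameters.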

  19. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    Science.gov (United States)

    Mohanty, Sankhya; Hattel, Jesper H.

    2015-03-01

    Selective laser melting is yet to become a standardized industrial manufacturing technique. The process continues to suffer from defects such as distortions, residual stresses, localized deformations and warpage caused primarily due to the localized heating, rapid cooling and high temperature gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths and process parameters for selective laser melting of a standard sample is introduced. The processing of the sample is simulated by sequentially coupling a calibrated 3D pseudo-analytical thermal model with a 3D finite element mechanical model. The optimized processing parameters are subjected to a Monte Carlo method based uncertainty and reliability analysis. The reliability of the scanning paths are established using cumulative probability distribution functions for process output criteria such as sample density, thermal homogeneity, etc. A customized genetic algorithm is used along with the simulation model to generate optimized cellular scanning strategies and processing parameters, with an objective of reducing thermal asymmetries and mechanical deformations. The optimized scanning strategies are used for selective laser melting of the standard samples, and experimental and numerical results are compared.

  20. Optimization of the Effective Parameters on Hydraulic Fracturing Designing in an Iranian Sand Stone Reservoir

    Directory of Open Access Journals (Sweden)

    Reza Masoomi

    2015-06-01

    Full Text Available Hydraulic fracturing is one of the key technologies for stimulating oil and gas wells in sandstone reservoirs. Field data relating to hydraulic fracturing operations are mostly available as pressure-time curves, and optimization of the hydraulic fracturing parameters is not possible with this information alone. Designing and controlling the development of hydraulic fractures is therefore possible only by relying on complex mathematical and numerical models. The aim of this study is to optimize the parameters affecting the design of the hydraulic fracturing process in an Iranian oil reservoir with sandstone reservoir rock. For this purpose, the pump flow rate and the hydraulic fracture half-length have been optimized. First, scenarios with variable pump flow rates were investigated; scenarios to determine the optimum hydraulic fracture half-length were then designed after the optimal pump flow rate was determined. The calculation results are provided for the pseudo-three-dimensional (P3D) hydraulic fracturing model as well as for two-dimensional models, including the PKN, KGD and radial models.

  1. Development of optimization model for sputtering process parameter based on gravitational search algorithm

    Science.gov (United States)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In the RF magnetron sputtering process, the desirable layer properties are largely influenced by the process parameters and conditions. If the quality of the thin film has not reached its intended level, the experiments have to be repeated until the desired quality is met. This research proposes the Gravitational Search Algorithm (GSA) as the optimization model to reduce the time and cost of thin film fabrication. The optimization model's engine has been developed in Java. The model is based on the GSA concept, which is inspired by the Newtonian laws of gravity and motion. In this research, the model is expected to optimize four deposition parameters: RF power, deposition time, oxygen flow rate and substrate temperature. The results are promising, and it can be concluded that the performance of the model is satisfactory for this parameter optimization problem. Future work could compare GSA with other nature-based algorithms and test them with various sets of data.
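    The GSA update rule, agents attracting one another with fitness-derived masses under a decaying gravitational constant, can be sketched as below (in Python rather than the Java engine the record mentions). A smooth four-parameter quadratic stands in for the sputtering response, and all constants (G₀, decay rate, population size) are assumed, typical choices.

```python
import numpy as np

rng = np.random.default_rng(5)

def gsa(f, lo, hi, n_agents=30, n_iter=200, g0=100.0, decay=20.0):
    """Minimal gravitational search: better agents get larger mass and
    pull the swarm toward themselves; G decays to shift from exploration
    to exploitation."""
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_agents, dim))
    v = np.zeros_like(x)
    gbest, gbest_f = None, np.inf
    for t in range(n_iter):
        fit = np.array([f(xi) for xi in x])
        if fit.min() < gbest_f:
            gbest, gbest_f = x[np.argmin(fit)].copy(), fit.min()
        worst, best = fit.max(), fit.min()
        m = (worst - fit) / (worst - best + 1e-12)   # better fitness -> larger mass
        m = m / (m.sum() + 1e-12)
        g = g0 * np.exp(-decay * t / n_iter)         # decaying gravitational constant
        acc = np.zeros_like(x)
        for i in range(n_agents):
            diff = x - x[i]
            dist = np.linalg.norm(diff, axis=1) + 1e-12
            # Randomly weighted sum of attractions toward every other agent
            acc[i] = (rng.random((n_agents, 1)) * g * m[:, None] * diff
                      / dist[:, None]).sum(axis=0)
        v = rng.random(x.shape) * v + acc
        x = np.clip(x + v, lo, hi)
    return gbest, gbest_f

# Hypothetical smooth stand-in for the sputtering response over 4 coded parameters
target = np.array([0.3, -0.2, 0.5, 0.1])
best, err = gsa(lambda p: np.sum((p - target) ** 2),
                lo=np.full(4, -1.0), hi=np.full(4, 1.0))
```

    In the actual application, each fitness evaluation would map a candidate (RF power, deposition time, oxygen flow rate, substrate temperature) to a predicted or measured film quality.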

  2. Electrochemical model parameter identification of a lithium-ion battery using particle swarm optimization method

    Science.gov (United States)

    Rahman, Md Ashiqur; Anwar, Sohel; Izadian, Afshin

    2016-03-01

    In this paper, a gradient-free optimization technique, namely particle swarm optimization (PSO) algorithm, is utilized to identify specific parameters of the electrochemical model of a Lithium-Ion battery with LiCoO2 cathode chemistry. Battery electrochemical model parameters are subject to change under severe or abusive operating conditions resulting in, for example, over-discharged battery, over-charged battery, etc. It is important for a battery management system to have these parameter changes fully captured in a bank of battery models that can be used to monitor battery conditions in real time. Here the PSO methodology has been successfully applied to identify four electrochemical model parameters that exhibit significant variations under severe operating conditions: solid phase diffusion coefficient at the positive electrode (cathode), solid phase diffusion coefficient at the negative electrode (anode), intercalation/de-intercalation reaction rate at the cathode, and intercalation/de-intercalation reaction rate at the anode. The identified model parameters were used to generate the respective battery models for both healthy and degraded batteries. These models were then validated by comparing the model output voltage with the experimental output voltage for the stated operating conditions. The identified Li-Ion battery electrochemical model parameters are within reasonable accuracy as evidenced by the experimental validation results.

  3. Parameter estimation with bio-inspired meta-heuristic optimization: modeling the dynamics of endocytosis

    Directory of Open Access Journals (Sweden)

    Tashkova Katerina

    2011-10-01

    Full Text Available Abstract Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., the differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717), to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of...
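    The DE variant that comes out on top in such comparisons is typically the classic DE/rand/1/bin scheme, which can be sketched as follows. A Rosenbrock-like two-parameter misfit with a curved valley stands in for the ODE-fitting objective (an assumption for illustration; the real objective would integrate the ODE system and score the fit to the measured concentrations), and the population size, F, and CR values are conventional defaults.

```python
import numpy as np

rng = np.random.default_rng(6)

def differential_evolution(f, lo, hi, n_pop=40, n_gen=150, F=0.7, CR=0.9):
    dim = len(lo)
    pop = rng.uniform(lo, hi, (n_pop, dim))
    fit = np.array([f(p) for p in pop])
    for _ in range(n_gen):
        for i in range(n_pop):
            # DE/rand/1 mutation from three distinct other members
            a, b, c = pop[rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True            # at least one gene from the mutant
            trial = np.where(cross, mutant, pop[i])    # binomial crossover
            ft = f(trial)
            if ft <= fit[i]:                           # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    i = np.argmin(fit)
    return pop[i], fit[i]

# Hypothetical two-parameter misfit with a narrow curved valley (minimum at (1, 1))
def misfit(p):
    return (1 - p[0]) ** 2 + 100 * (p[1] - p[0] ** 2) ** 2

best, err = differential_evolution(misfit, lo=np.array([-2.0, -2.0]), hi=np.array([2.0, 2.0]))
```

    The difference-vector mutation automatically adapts step sizes to the population's spread along the valley, which is one reason DE handles the correlated parameters typical of ODE models well.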

  4. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  5. Optimization of thread partitioning parameters in speculative multithreading based on artificial immune algorithm

    Institute of Scientific and Technical Information of China (English)

    Yu-xiang LI; Yin-liang ZHAO‡; Bin LIU; Shuo JI

    2015-01-01

    Thread partition plays an important role in speculative multithreading (SpMT) for automatic parallelization of irregular programs. Using unified values of partition parameters to partition different applications leads to the fact that every application cannot own its optimal partition scheme. In this paper, five parameters affecting thread partition are extracted from heuristic rules. They are the dependence threshold (DT), lower limit of thread size (TSL), upper limit of thread size (TSU), lower limit of spawning distance (SDL), and upper limit of spawning distance (SDU). Their ranges are determined in accordance with heuristic rules, and their step-sizes are set empirically. Under the condition of setting speedup as an objective function, all combinations of five threshold values form the solution space, and our aim is to search for the best combination to obtain the best thread granularity, thread dependence, and spawning distance, so that every application has its best partition scheme. The issue can be attributed to a single objective optimization problem. We use the artificial immune algorithm (AIA) to search for the optimal solution. On Prophet, which is a generic SpMT processor to evaluate the performance of multithreaded programs, Olden benchmarks are used to implement the process. Experiments show that we can obtain the optimal parameter values for every benchmark, and Olden benchmarks partitioned with the optimized parameter values deliver a performance improvement of 3.00% on a 4-core platform compared with a machine learning based approach, and 8.92% compared with a heuristics-based approach.

  6. Extracellular voltage threshold settings can be tuned for optimal encoding of movement and stimulus parameters

    Science.gov (United States)

    Oby, Emily R.; Perel, Sagi; Sadtler, Patrick T.; Ruff, Douglas A.; Mischel, Jessica L.; Montez, David F.; Cohen, Marlene R.; Batista, Aaron P.; Chase, Steven M.

    2016-06-01

    Objective. A traditional goal of neural recording with extracellular electrodes is to isolate action potential waveforms of an individual neuron. Recently, in brain–computer interfaces (BCIs), it has been recognized that threshold crossing events of the voltage waveform also convey rich information. To date, the threshold for detecting threshold crossings has been selected to preserve single-neuron isolation. However, the optimal threshold for single-neuron identification is not necessarily the optimal threshold for information extraction. Here we introduce a procedure to determine the best threshold for extracting information from extracellular recordings. We apply this procedure in two distinct contexts: the encoding of kinematic parameters from neural activity in primary motor cortex (M1), and visual stimulus parameters from neural activity in primary visual cortex (V1). Approach. We record extracellularly from multi-electrode arrays implanted in M1 or V1 in monkeys. Then, we systematically sweep the voltage detection threshold and quantify the information conveyed by the corresponding threshold crossings. Main Results. The optimal threshold depends on the desired information. In M1, velocity is optimally encoded at higher thresholds than speed; in both cases the optimal thresholds are lower than are typically used in BCI applications. In V1, information about the orientation of a visual stimulus is optimally encoded at higher thresholds than is visual contrast. A conceptual model explains these results as a consequence of cortical topography. Significance. How neural signals are processed impacts the information that can be extracted from them. Both the type and quality of information contained in threshold crossings depend on the threshold setting. There is more information available in these signals than is typically extracted. Adjusting the detection threshold to the parameter of interest in a BCI context should improve our ability to decode motor intent
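The threshold-sweep procedure can be illustrated with synthetic data: for each candidate threshold, count threshold crossings per trial in two stimulus conditions and keep the threshold whose counts best discriminate the conditions (here via a d-prime statistic; the paper quantifies information more generally). The voltage model below is a hypothetical stand-in:

```python
import random
import statistics

random.seed(1)

def trial_minima(mean_amp, n_events=40, noise=0.3):
    # Per-trial negative peak amplitudes of candidate spike events (arbitrary units).
    return [random.gauss(mean_amp, noise) for _ in range(n_events)]

# Condition A drives deeper (more negative) events than condition B.
trials_a = [trial_minima(-1.5) for _ in range(50)]
trials_b = [trial_minima(-1.0) for _ in range(50)]

def dprime(xs, ys):
    # Discriminability of two count distributions; 0 if both are degenerate.
    sp = ((statistics.pvariance(xs) + statistics.pvariance(ys)) / 2) ** 0.5
    return abs(statistics.mean(xs) - statistics.mean(ys)) / sp if sp > 0 else 0.0

best_thr, best_d = None, -1.0
for thr in [x / 10 for x in range(-30, 1)]:          # sweep thresholds -3.0 .. 0.0
    counts_a = [sum(v < thr for v in t) for t in trials_a]
    counts_b = [sum(v < thr for v in t) for t in trials_b]
    d = dprime(counts_a, counts_b)
    if d > best_d:
        best_thr, best_d = thr, d
```

As in the paper, the best threshold sits between the two event-amplitude distributions rather than at a single-unit isolation setting.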

  7. EXPERIMENTAL TWO-FACTOR OPTIMIZATION OF PARAMETERS OF THE UNIVERSAL SOIL-PROCESSING INSTRUMENT

    Directory of Open Access Journals (Sweden)

    Popov I. V.

    2015-10-01

    Full Text Available The article considers the problem of reforestation at processing stations such as coupes, gullies and mountain slopes. To improve the efficiency of planting work, a construction of a universal soil-processing instrument (USPI) is proposed that is able to form a discrete planting spot in the form of a mound under temporarily humid soil conditions, or a cleared spot (with removal of the top layer) on drained soils, with simultaneous formation of a planting cup for planting of forest plantations. To assess the effectiveness of its work, an experimental sample of the USPI was developed and field trials were conducted. For the two-factor optimization of the performance of the USPI, optimization criteria were selected, namely the performance, quality and economic feasibility of the instrument's work, as well as the varied parameters exerting the most influence. To detect the analytical dependences between these parameters, we performed a series of nine experiments and approximated the functions by polynomials of second order. The result was analytical formulas characterizing the influence of the varied parameters of the USPI on the quality of its work. We also obtained graphical response surfaces and performed a visual analysis, which allowed determining the optimal values of the varied parameters of the USPI

  8. Forecasting Annual Power Generation Using a Harmony Search Algorithm-Based Joint Parameters Optimization Combination Model

    Directory of Open Access Journals (Sweden)

    Hong Chang

    2012-10-01

    Full Text Available Accurate power generation forecasting provides the basis of decision making for electric power industry development plans, energy conservation and environmental protection. Since power generation time series are rarely purely linear or nonlinear, no single forecasting model can identify the true data trends exactly in all situations. Combining forecasts from different models can reduce the model selection risk and effectively improve accuracy. In this paper, we propose a novel technique called the Harmony Search (HS) algorithm-based joint parameters optimization combination model. In this model, each single forecasting model adopts a power function form with unfixed exponential parameters. The exponential parameters of the single models and the combination weights, together called the joint parameters, are jointly optimized by the HS algorithm via the objective function. Real power generation time series data sets of China, Japan, the Russian Federation and India were used as samples to examine the forecasting accuracy of the presented model. The forecasting performance was compared with four single models and four combination models, respectively. The MAPE of our presented model is the lowest, which shows that the proposed model outperforms the comparative ones. In particular, the proposed combination model could better fit significant turning points of power generation time series. We conclude that the proposed model can obviously improve forecasting accuracy and can treat nonlinear time series with fluctuations better than single models or other combination models.
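A minimal sketch of the HS loop, here fitting the scale and exponent of a single power-function model to a synthetic series by minimizing MAPE; the harmony memory size, HMCR/PAR rates and bounds are illustrative choices, not the paper's settings:

```python
import random

random.seed(2)
ts = [100 * (t ** 1.3) for t in range(1, 21)]        # synthetic "generation" series

def mape(params):
    a, b = params
    return sum(abs(ts[t - 1] - a * t ** b) / ts[t - 1] for t in range(1, 21)) / 20

BOUNDS = [(1.0, 200.0), (0.5, 2.0)]                  # bounds for (a, b)
HMS, HMCR, PAR, ITERS = 10, 0.9, 0.3, 2000

memory = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(HMS)]
for _ in range(ITERS):
    new = []
    for j, (lo, hi) in enumerate(BOUNDS):
        if random.random() < HMCR:                   # memory consideration
            x = random.choice(memory)[j]
            if random.random() < PAR:                # pitch adjustment
                x = min(hi, max(lo, x + random.uniform(-1, 1) * 0.01 * (hi - lo)))
        else:                                        # random consideration
            x = random.uniform(lo, hi)
        new.append(x)
    worst = max(range(HMS), key=lambda i: mape(memory[i]))
    if mape(new) < mape(memory[worst]):              # replace worst harmony
        memory[worst] = new

best = min(memory, key=mape)
```

In the paper's joint formulation the harmony vector would also carry one weight per component model, optimized simultaneously with the exponents.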

  9. Semivariogram Estimation Using Ant Colony Optimization and Ensemble Kriging Accounting for Parameter Uncertainty

    Science.gov (United States)

    Cardiff, M. A.; Kitanidis, P. K.

    2005-12-01

    In this presentation we revisit the problem of semivariogram estimation and present a modular, reusable, and encapsulated set of MATLAB programs that use a hybrid Ant Colony Optimization (ACO) heuristic to solve the "optimal fit" problem. Though the ACO heuristic involves a stochastic component, advantages of the heuristic over traditional gradient-search methods, like the Gauss-Newton method, include the ability to estimate model semivariogram parameters accurately without initial guesses input by the user. The ACO heuristic is also superiorly suited for strongly nonlinear optimization over spaces that may contain several local minima. The presentation will focus on the application of ACO to existing weighted least squares and restricted maximum likelihood estimation methods with a comparison of results. The presentation will also discuss parameter uncertainty, particularly in the context of restricted maximum likelihood and Bayesian methods. We compare the local linearized parameter estimates (or Cramer-Rao lower bounds) with modern Monte Carlo methods, such as acceptance-rejection. Finally, we present ensemble kriging in which conditional realizations are generated in a way that uncertainty in semi-variogram parameters is fully accounted for. Results for a variety of sample problems will be presented along with a discussion of solution accuracy and computational efficiency.
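The "optimal fit" objective can be sketched as weighted least squares between an empirical semivariogram and an exponential model; a plain random search stands in here for the ACO heuristic, and the empirical points are synthetic:

```python
import math
import random

random.seed(3)
lags = [1, 2, 4, 8, 16, 32]
# Synthetic empirical semivariogram from an exponential model (sill 2.0, range 10) plus noise.
emp = [2.0 * (1 - math.exp(-h / 10.0)) + random.gauss(0, 0.02) for h in lags]

def model(h, sill, rng):
    return sill * (1 - math.exp(-h / rng))

def wls(sill, rng):
    # Cressie-style weighting: residuals scaled by the model value,
    # so short lags (small semivariance) are fitted more tightly.
    return sum((g - model(h, sill, rng)) ** 2 / max(model(h, sill, rng), 1e-9) ** 2
               for h, g in zip(lags, emp))

# Random search over (sill, range) as a stand-in for the ACO heuristic:
# no initial guess is required, matching the motivation given in the abstract.
best = min(((random.uniform(0.1, 5), random.uniform(1, 50)) for _ in range(20000)),
           key=lambda p: wls(*p))
```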

  10. Prediction and optimization of friction welding parameters for super duplex stainless steel (UNS S32760) joints

    International Nuclear Information System (INIS)

    Highlights: • Corrosion resistance and impact strength – predicted by response surface methodology. • Burn off length has highest significance on corrosion resistance. • Friction force is a strong determinant in changing impact strength. • Pareto front points generated by genetic algorithm aid to fix input control variables. • Pareto front will be a trade-off between corrosion resistance and impact strength. - Abstract: Friction welding finds widespread industrial use as a mass production process for joining materials. The friction welding process allows welding of several materials that are extremely difficult to fusion weld. Friction welding process parameters play a significant role in making good quality joints. To produce a good quality joint it is important to set up proper welding process parameters. This can be done by employing optimization techniques. This paper presents a multi objective optimization method for optimizing the process parameters during the friction welding process. The proposed method combines the response surface methodology (RSM) with an intelligent optimization algorithm, i.e. the genetic algorithm (GA). Corrosion resistance and impact strength of friction welded super duplex stainless steel (SDSS) (UNS S32760) joints were investigated considering three process parameters: friction force (F), upset force (U) and burn off length (B). Mathematical models were developed and the responses were adequately predicted. Direct and interaction effects of process parameters on responses were studied by plotting graphs. Burn off length has the highest significance on corrosion current, followed by upset force and friction force. In the case of impact strength, friction force has the highest significance, followed by upset force and burn off length. Multi objective optimization for maximizing the impact strength and minimizing the corrosion current (maximizing corrosion resistance) was carried out using GA with the RSM model. The optimization procedure resulted in a Pareto front of optimal solutions representing the trade-off between corrosion resistance and impact strength.

  11. The same number of optimized parameters scheme for determining intermolecular interaction energies

    DEFF Research Database (Denmark)

    Kristensen, Kasper; Ettenhuber, Patrick; Eriksen, Janus Juul; Jensen, Frank; Jørgensen, Poul

    2015-01-01

    We propose the Same Number Of Optimized Parameters (SNOOP) scheme as an alternative to the counterpoise method for treating basis set superposition errors in calculations of intermolecular interaction energies. The key point of the SNOOP scheme is to enforce that the number of optimized wave function parameters...... numerically. Numerical results for second-order Møller-Plesset perturbation theory (MP2) and coupled-cluster with single, double, and approximate triple excitations (CCSD(T)) show that the SNOOP scheme in general outperforms the uncorrected and counterpoise approaches. Furthermore, we show that SNOOP...

  12. Multi-parameter optimization of a nanomagnetic system for spintronic applications

    Energy Technology Data Exchange (ETDEWEB)

    Morales Meza, Mishel [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Zubieta Rico, Pablo F. [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Centro de Investigación y de Estudios Avanzados del IPN (CINVESTAV) Querétaro, Libramiento Norponiente 2000, Fracc. Real de Juriquilla, 76230 Querétaro (Mexico); Horley, Paul P., E-mail: paul.horley@cimav.edu.mx [Centro de Investigación en Materiales Avanzados, S.C. (CIMAV), Chihuahua/Monterrey, 120 Avenida Miguel de Cervantes, 31109 Chihuahua (Mexico); Sukhov, Alexander [Institut für Physik, Martin-Luther Universität Halle-Wittenberg, 06120 Halle (Saale) (Germany); Vieira, Vítor R. [Centro de Física das Interacções Fundamentais (CFIF), Instituto Superior Técnico, Universidade Técnica de Lisboa, Avenida Rovisco Pais, 1049-001 Lisbon (Portugal)

    2014-11-15

    Magnetic properties of nano-particles feature many interesting physical phenomena that are essentially important for the creation of a new generation of spin-electronic devices. The magnetic stability of the nano-particles can be improved by the formation of ordered particle arrays, which should be optimized over several parameters. Here we report successful optimization with respect to inter-particle distance and applied field frequency, allowing an approximately three-fold reduction in the coercivity of a particle array compared to that of a single particle, which opens new perspectives for the development of new spintronic devices.

  13. Setting Up PID DC Motor Speed Control Alteration Parameters Using Particle Swarm Optimization Strategy

    Directory of Open Access Journals (Sweden)

    Boumediène ALLAOUA

    2009-07-01

    Full Text Available In this paper, an intelligent controller for a DC motor drive is designed using the particle swarm optimization (PSO) method to determine the optimal proportional-integral-derivative (PID) controller tuning parameters. The proposed approach has superior features, including easy implementation, stable convergence characteristics and very good computational efficiency. The DC motor scheduling PID-PSO controller is modeled in the MATLAB environment. Compared with a fuzzy logic controller using PSO intelligent algorithms, the proposed method is more proficient in improving the stability of the speed loop response: the steady-state error is reduced, the rise time is improved and disturbances do not affect the motor's performance, with no overshoot.
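A compact sketch of the PSO-PID idea: particles carry (Kp, Ki, Kd), and fitness is an ITAE-style cost from simulating a crude first-order motor model under a unit step. The plant constants and PSO settings are illustrative, not the paper's MATLAB model:

```python
import random

random.seed(4)

def itae_cost(gains, steps=200, dt=0.01):
    # Simulate a first-order plant (tau = 1, gain = 0.5) under discrete PID control
    # and accumulate a time-weighted absolute error (ITAE-style) cost.
    kp, ki, kd = gains
    y, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for k in range(steps):
        e = 1.0 - y                               # unit step setpoint
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + 0.5 * u)                  # Euler step of the plant
        cost += (k * dt) * abs(e)
    return cost

DIM, N = 3, 15
pos = [[random.uniform(0, 20) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = min(pos, key=itae_cost)[:]
for _ in range(40):
    for i in range(N):
        for d in range(DIM):
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                         + 1.5 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] = min(20.0, max(0.0, pos[i][d] + vel[i][d]))
        if itae_cost(pos[i]) < itae_cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=itae_cost)[:]
```

The inertia, cognitive and social coefficients (0.7, 1.5, 1.5) are common textbook defaults, not values taken from the paper.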

  14. Optimal Design of Measurement Programs for the Parameter Identification of Dynamic Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    1993-01-01

    The design of a measurement program devoted to parameter identification of structural dynamic systems is considered. The design problem is formulated as an optimization problem to minimize the total expected cost, that is, the cost of failure and the cost of the measurement program. All the...... calculations are based on a priori knowledge and engineering judgement. One of the contributions of the approach is that the optimal number of sensors can be estimated. This is shown in a numerical example where the proposed approach is demonstrated. The example is concerned with design of a measurement...

  15. Optimal Input Design for Aircraft Parameter Estimation using Dynamic Programming Principles

    Science.gov (United States)

    Morelli, Eugene A.; Klein, Vladislav

    1990-01-01

    A new technique was developed for designing optimal flight test inputs for aircraft parameter estimation experiments. The principles of dynamic programming were used for the design in the time domain. This approach made it possible to include realistic practical constraints on the input and output variables. A description of the new approach is presented, followed by an example for a multiple input linear model describing the lateral dynamics of a fighter aircraft. The optimal input designs produced by the new technique demonstrated improved quality and expanded capability relative to the conventional multiple input design method.

  16. Multi-parameter optimization of a nanomagnetic system for spintronic applications

    International Nuclear Information System (INIS)

    Magnetic properties of nano-particles feature many interesting physical phenomena that are essentially important for the creation of a new generation of spin-electronic devices. The magnetic stability of the nano-particles can be improved by the formation of ordered particle arrays, which should be optimized over several parameters. Here we report successful optimization with respect to inter-particle distance and applied field frequency, allowing an approximately three-fold reduction in the coercivity of a particle array compared to that of a single particle, which opens new perspectives for the development of new spintronic devices

  17. Parameter estimation and uncertainty quantification in a biogeochemical model using optimal experimental design methods

    Science.gov (United States)

    Reimer, Joscha; Piwonski, Jaroslaw; Slawig, Thomas

    2016-04-01

    The statistical significance of any model-data comparison strongly depends on the quality of the data used and the criterion used to measure the model-to-data misfit. The statistical properties (such as mean values, variances and covariances) of the data should be taken into account by choosing a criterion such as ordinary, weighted or generalized least squares. Moreover, the criterion can be restricted to regions or model quantities which are of special interest. This choice influences the quality of the model output (also for quantities that were not measured) and the results of a parameter estimation or optimization process. We have estimated the parameters of a three-dimensional and time-dependent marine biogeochemical model describing the phosphorus cycle in the ocean. For this purpose, we have developed a statistical model for measurements of phosphate and dissolved organic phosphorus. This statistical model includes variances and correlations varying with the time and location of the measurements. We compared the obtained estimates of model output and parameters for different criteria. Another question is whether (and which) further measurements would increase the model's quality at all. Using experimental design criteria, the information content of measurements can be quantified. This may refer to the uncertainty in unknown model parameters as well as the uncertainty regarding which model is closer to reality. By (another) optimization, optimal measurement properties such as locations, time instants and quantities to be measured can be identified. We have optimized such properties for additional measurements for the parameter estimation of the marine biogeochemical model. For this purpose, we have quantified the uncertainty in the optimal model parameters and the model output itself regarding the uncertainty in the measurement data using the (Fisher) information matrix. Furthermore, we have calculated the uncertainty reduction by additional measurements depending on time
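The Fisher-information step described above can be sketched for a toy model y = a·exp(−b·t) with i.i.d. Gaussian noise: the information matrix is built from the sensitivity (Jacobian) matrix, and its inverse approximates the parameter covariance. The model and numbers are illustrative, not the biogeochemical model itself:

```python
import math

a, b, sigma = 2.0, 0.3, 0.1          # true parameters and measurement noise std
times = [0.0, 1.0, 2.0, 4.0, 8.0]    # measurement time instants

# Analytic sensitivities dy/da and dy/db at each measurement time.
J = [[math.exp(-b * t), -a * t * math.exp(-b * t)] for t in times]

# Fisher information F = J^T J / sigma^2 (Gaussian, uncorrelated errors).
F = [[sum(r[i] * r[j] for r in J) / sigma ** 2 for j in range(2)] for i in range(2)]

# Parameter covariance approx F^{-1} (Cramer-Rao bound); 2x2 inverse by hand.
det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
cov = [[F[1][1] / det, -F[0][1] / det], [-F[1][0] / det, F[0][0] / det]]
std_a, std_b = cov[0][0] ** 0.5, cov[1][1] ** 0.5
```

Appending a candidate time to `times` and recomputing shows the uncertainty reduction from an additional measurement, which is the quantity optimized in the experimental design step.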

  18. GPRS and Bluetooth Based Devices/Mobile Connectivity Shifting From Manual To Automation For Performance Optimization

    Directory of Open Access Journals (Sweden)

    Nazia Bibi

    2011-09-01

    Full Text Available Many companies and organizations are trying to move towards automation and provide their workers with internet facilities on their mobiles in order to carry out routine tasks while saving time and resources. The proposed system, based on GPRS technology, aims to solve the problems faced in carrying out routine tasks under mobility. The system is designed so that workers and field staff get updates on their mobile phones regarding the tasks at hand. The system is beneficial in the sense that it saves resources in terms of time and human resources and cuts down on paperwork. The proposed system has been developed in view of a research study conducted in the software development and telecom industry, and provides a high-end solution for customers and field workers who use GPRS technology for transactional updates of databases.

  19. Extraction of Cole parameters from the electrical bioimpedance spectrum using stochastic optimization algorithms.

    Science.gov (United States)

    Gholami-Boroujeny, Shiva; Bolic, Miodrag

    2016-04-01

    Fitting the measured bioimpedance spectroscopy (BIS) data to the Cole model and then extracting the Cole parameters is a common practice in BIS applications. The extracted Cole parameters then can be analysed as descriptors of tissue electrical properties. To have a better evaluation of physiological or pathological properties of biological tissue, accurate extraction of Cole parameters is of great importance. This paper proposes an improved Cole parameter extraction based on bacterial foraging optimization (BFO) algorithm. We employed simulated datasets to test the performance of the BFO fitting method regarding parameter extraction accuracy and noise sensitivity, and we compared the results with those of a least squares (LS) fitting method. The BFO method showed better robustness to the noise and higher accuracy in terms of extracted parameters. In addition, we applied our method to experimental data where bioimpedance measurements were obtained from forearm in three different positions of the arm. The goal of the experiment was to explore how robust Cole parameters are in classifying position of the arm for different people, and measured at different times. The extracted Cole parameters obtained by LS and BFO methods were applied to different classifiers. Two other evolutionary algorithms, GA and PSO were also used for comparison purpose. We showed that when the classifiers are fed with the extracted feature sets by BFO fitting method, higher accuracy is obtained both when applying on training data and test data. PMID:26215520
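The fitting problem can be sketched directly from the Cole impedance model Z(ω) = R∞ + (R0 − R∞)/(1 + (jωτ)^α); a plain random search stands in here for the BFO optimizer, and the synthetic data are noise-free for clarity:

```python
import cmath
import random

random.seed(5)
TRUE = (50.0, 20.0, 1e-5, 0.8)                      # R0, Rinf, tau, alpha (illustrative)

def cole(w, r0, rinf, tau, alpha):
    # Cole impedance model evaluated at angular frequency w.
    return rinf + (r0 - rinf) / (1 + (1j * w * tau) ** alpha)

freqs = [2 * cmath.pi * f for f in (1e3, 5e3, 1e4, 5e4, 1e5, 5e5)]
data = [cole(w, *TRUE) for w in freqs]              # synthetic BIS measurements

def sse(p):
    # Sum of squared complex residuals between model and measured spectrum.
    return sum(abs(z - cole(w, *p)) ** 2 for w, z in zip(freqs, data))

def rand_params():
    # Uniform draws over plausible ranges; tau sampled log-uniformly.
    return (random.uniform(30, 80), random.uniform(5, 40),
            10 ** random.uniform(-6, -4), random.uniform(0.5, 1.0))

best = min((rand_params() for _ in range(50000)), key=sse)
```

A BFO (or GA/PSO) optimizer would replace the random draws with guided moves over the same four-parameter space and the same objective.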

  20. OPTIMIZATION OF PROCESS PARAMETERS TO MINIMIZE ANGULAR DISTORTION IN GAS TUNGSTEN ARC WELDED STAINLESS STEEL 202 GRADE PLATES USING PARTICLE SWARM OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    R. SUDHAKARAN

    2012-04-01

    Full Text Available This paper presents a study on optimization of process parameters using particle swarm optimization to minimize angular distortion in 202 grade stainless steel gas tungsten arc welded plates. Angular distortion is a major problem and most pronounced among different types of distortion in butt welded plates. The process control parameters chosen for the study are welding gun angle, welding speed, plate length, welding current and gas flow rate. The experiments were conducted using design of experiments technique with five factor five level central composite rotatable design with full replication technique. A mathematical model was developed correlating the process parameters with angular distortion. A source code was developed in MATLAB 7.6 to do the optimization. The optimal process parameters gave a value of 0.0305° for angular distortion which demonstrates the accuracy of the model developed. The results indicate that the optimized values for the process parameters are capable of producing weld with minimum distortion.

  1. IDENTIFICATION OF OPTIMAL PARAMETERS OF REINFORCED CONCRETE STRUCTURES WITH ACCOUNT FOR THE PROBABILITY OF FAILURE

    Directory of Open Access Journals (Sweden)

    Filimonova Ekaterina Aleksandrovna

    2012-10-01

    The author suggests splitting the aforementioned parameters into two groups, namely, natural parameters and value-related parameters that are introduced to assess the costs of development, transportation, construction and operation of a structure, as well as the costs of its potential failure. The author proposes a new improved methodology for the identification of the above parameters that ensures optimal solutions to non-linear objective functions accompanied by non-linear restrictions that are critical to the design of reinforced concrete structures. Any structural failure may be interpreted as the bounce of a random process associated with the surplus bearing capacity into the negative domain. Monte Carlo numerical methods make it possible to assess these bounces into the unacceptable domain.

  2. Statistical Learning in Automated Troubleshooting: Application to LTE Interference Mitigation

    CERN Document Server

    Tiwana, Moazzam Islam; Altman, Zwi

    2010-01-01

    This paper presents a method for automated healing as part of off-line automated troubleshooting. The method combines statistical learning with constraint optimization. The automated healing aims at locally optimizing radio resource management (RRM) or system parameters of cells with poor performance in an iterative manner. The statistical learning processes the data using Logistic Regression (LR) to extract closed form (functional) relations between Key Performance Indicators (KPIs) and Radio Resource Management (RRM) parameters. These functional relations are then processed by an optimization engine which proposes new parameter values. The advantage of the proposed formulation is the small number of iterations required by the automated healing method to converge, making it suitable for off-line implementation. The proposed method is applied to heal an Inter-Cell Interference Coordination (ICIC) process in a 3G Long Term Evolution (LTE) network which is based on soft-frequency reuse scheme. Numerical simulat...

  3. Optimizing parameters of a technical system using quality function deployment method

    Science.gov (United States)

    Baczkowicz, M.; Gwiazda, A.

    2015-11-01

    The article shows the practical use of Quality Function Deployment (QFD) on the example of a mechanized mining support. Firstly it gives a short description of this method and shows how the designing process, from the constructor point of view, looks like. The proposed method allows optimizing construction parameters and comparing them as well as adapting to customer requirements. QFD helps to determine the full set of crucial construction parameters and then their importance and difficulty of their execution. Secondly it shows chosen technical system and presents its construction with figures of the existing and future optimized model. The construction parameters were selected from the designer point of view. The method helps to specify a complete set of construction parameters, from the point of view, of the designed technical system and customer requirements. The QFD matrix can be adjusted depending on designing needs and not every part of it has to be considered. Designers can choose which parts are the most important. Due to this QFD can be a very flexible tool. The most important is to define relationships occurring between parameters and that part cannot be eliminated from the analysis.

  4. Novel Gauss-Hermite integration based Bayesian inference on optimal wavelet parameters for bearing fault diagnosis

    Science.gov (United States)

    Wang, Dong; Tsui, Kwok-Leung; Zhou, Qiang

    2016-05-01

    Rolling element bearings are commonly used in machines to provide support for rotating shafts. Bearing failures may cause unexpected machine breakdowns and increase economic cost. To prevent machine breakdowns and reduce unnecessary economic loss, bearing faults should be detected as early as possible. Because wavelet transform can be used to highlight impulses caused by localized bearing faults, wavelet transform has been widely investigated and proven to be one of the most effective and efficient methods for bearing fault diagnosis. In this paper, a new Gauss-Hermite integration based Bayesian inference method is proposed to estimate the posterior distribution of wavelet parameters. The innovations of this paper are illustrated as follows. Firstly, a non-linear state space model of wavelet parameters is constructed to describe the relationship between wavelet parameters and hypothetical measurements. Secondly, the joint posterior probability density function of wavelet parameters and hypothetical measurements is assumed to follow a joint Gaussian distribution so as to generate Gaussian perturbations for the state space model. Thirdly, Gauss-Hermite integration is introduced to analytically predict and update moments of the joint Gaussian distribution, from which optimal wavelet parameters are derived. At last, an optimal wavelet filtering is conducted to extract bearing fault features and thus identify localized bearing faults. Two instances are investigated to illustrate how the proposed method works. Two comparisons with the fast kurtogram are used to demonstrate that the proposed method can achieve better visual inspection performances than the fast kurtogram.
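The Gauss-Hermite building block can be shown in isolation: a 3-point rule that computes Gaussian expectations E[g(X)] exactly for polynomials up to degree 5, which is what lets the method predict and update moments analytically. Nodes and weights are the standard 3-point physicists' Hermite values:

```python
import math

# 3-point Gauss-Hermite nodes and weights for integrals against exp(-x^2).
NODES = [-math.sqrt(1.5), 0.0, math.sqrt(1.5)]
WEIGHTS = [math.sqrt(math.pi) / 6, 2 * math.sqrt(math.pi) / 3, math.sqrt(math.pi) / 6]

def gauss_hermite_expect(g, mu, sigma):
    # E[g(X)] for X ~ N(mu, sigma^2), via the change of variable x -> mu + sqrt(2)*sigma*x:
    # E[g(X)] = (1/sqrt(pi)) * sum_i w_i * g(mu + sqrt(2)*sigma*x_i)
    return sum(w * g(mu + math.sqrt(2) * sigma * x)
               for x, w in zip(NODES, WEIGHTS)) / math.sqrt(math.pi)

# Exact for polynomials up to degree 5: E[X^2] = mu^2 + sigma^2.
m2 = gauss_hermite_expect(lambda x: x * x, 1.0, 0.5)   # = 1.25
```

In the paper this quadrature is applied to the joint Gaussian approximation of wavelet parameters and hypothetical measurements; here only the one-dimensional moment computation is illustrated.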

  5. Optimization of milling parameters using artificial neural network and artificial immune system

    International Nuclear Information System (INIS)

    The present paper is an attempt to predict the effect of milling parameters on the final surface roughness of a work piece made of Ti-6Al-4V using a multilayer perceptron artificial neural network. The required data were collected during experiments conducted on the mentioned material. These parameters include cutting speed, feed per tooth and depth of cut. A relatively newly discovered optimization algorithm, the artificial immune system, is used to find the best cutting conditions resulting in minimum surface roughness. Finally, the process of validation of the optimum condition is presented

  6. A review paper on Optimization of process parameter of EDM for air hardening tool steel

    Directory of Open Access Journals (Sweden)

    Tarun Modi

    2015-01-01

    Full Text Available EDM is now a more economical non-conventional machining process, widely used in small-scale as well as major industries. The EDM process is affected by many process parameters, both electrical and non-electrical. In this work a rotating tool is used to improve the metal removal rate (MRR) and to observe its effect on surface finish. Taguchi's method is used as the design of experiments and response surface methodology for analysis and optimization. The machining parameters selected as variables are pulse off time, pulse on time and servo voltage. The output measurements include MRR and surface roughness.

  7. Data set of optimal parameters for colorimetric red assay of epoxide hydrolase activity.

    Science.gov (United States)

    de Oliveira, Gabriel Stephani; Adriani, Patricia Pereira; Borges, Flavia Garcia; Lopes, Adriana Rios; Campana, Patricia T; Chambergo, Felipe S

    2016-09-01

    The data presented in this article are related to the research article entitled "Epoxide hydrolase of Trichoderma reesei: Biochemical properties and conformational characterization" [1]. Epoxide hydrolases (EHs) are enzymes that catalyze the hydrolysis of epoxides to the corresponding vicinal diols. This article describes the optimal parameters for the colorimetric red assay used to determine the enzymatic activity, with an emphasis on the characterization of the kinetic parameters, pH optimum and thermal stability of this enzyme. Reagents that are not resistant to oxidation by sodium periodate can generate false positives and interfere with the final results of the red assay. PMID:27366781

  8. Parameter Optimization of a 9 × 9 Polymer Arrayed Waveguide Grating Multiplexer

    Institute of Scientific and Technical Information of China (English)

    郭文滨; 马春生; 陈维友; 张大明; 陈开鑫; 崔战臣; 赵禹; 刘式墉

    2002-01-01

    Some important parameters are optimized for a 9 × 9 polymer arrayed waveguide grating multiplexer around the central wavelength of 1.55μm with the wavelength spacing of 1.6nm. These parameters include the thickness and width of the guide core, diffraction order, pitch of adjacent waveguides, path length difference of adjacent arrayed waveguides, focal length of slab waveguides, free spectral range, the number of input/output channels and the number of arrayed waveguides. Finally, a schematic waveguide layout of this device is presented, which contains 2 slabs, 9 input and 9 output channels, and 91 arrayed waveguides.

  9. Optimization of Squeeze Casting Parameters for 2017 A Wrought Al Alloy Using Taguchi Method

    Directory of Open Access Journals (Sweden)

    Najib Souissi

    2014-04-01

    Full Text Available This study applies the Taguchi method to investigate the relationship between the ultimate tensile strength, hardness and process variables in squeeze casting of a 2017 A wrought aluminium alloy. The effects of various casting parameters including squeeze pressure, melt temperature and die temperature were studied. The objectives of the Taguchi method for the squeeze casting process are thus to establish the optimal combination of process parameters and to reduce variation in quality using only a few experiments. The experimental results show that the squeeze pressure significantly affects the microstructure and the mechanical properties of the 2017 A Al alloy.
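The Taguchi workflow in this record can be sketched with an L9 orthogonal array over the three casting factors, a larger-the-better S/N ratio on a hypothetical UTS response, and per-level S/N means to pick the optimal setting of each factor; all levels and the response function are illustrative, not the paper's data:

```python
import math

# Standard L9 orthogonal array: 9 runs covering 3 factors at 3 levels.
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2), (1, 0, 1), (1, 1, 2),
      (1, 2, 0), (2, 0, 2), (2, 1, 0), (2, 2, 1)]
LEVELS = {"pressure": [50, 100, 150], "melt_T": [680, 700, 720], "die_T": [200, 250, 300]}
FACTORS = list(LEVELS)

def uts(pressure, melt_T, die_T):
    # Hypothetical UTS response peaking at 100 MPa pressure, low melt_T, 250 C die.
    return 300 - 0.004 * (pressure - 100) ** 2 - 0.05 * (melt_T - 680) \
               - 0.002 * (die_T - 250) ** 2

def sn_larger_better(y):
    # Larger-the-better S/N ratio for a single observation: -10*log10(1/y^2).
    return -10 * math.log10(1 / y ** 2)

runs = [(combo, sn_larger_better(uts(*(LEVELS[f][l] for f, l in zip(FACTORS, combo)))))
        for combo in L9]

# Average S/N per level of each factor; the best level maximizes the mean S/N.
best_levels = {}
for i, f in enumerate(FACTORS):
    means = [sum(sn for c, sn in runs if c[i] == lvl) / 3 for lvl in range(3)]
    best_levels[f] = LEVELS[f][max(range(3), key=lambda l: means[l])]
```

The orthogonal array lets each factor's level means be compared after only 9 of the 27 possible runs, which is the economy the abstract refers to.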

  10. The Optimal Extraction Parameters and Anti-Diabetic Activity of Flavonoids from Ipomoea Batatas Leaf

    OpenAIRE

    Li, Fenglin; Li, Qingwang; Gao, Dawei; Peng, Yong

    2009-01-01

    Extraction parameters of flavonoids from Ipomoea batatas leaf (FIBL) and the anti-diabetic activity of FIBL on alloxan-induced diabetic mice were studied. The optimal extraction parameters of FIBL were obtained by single factor test and orthogonal test, as follows: ethanol concentration 60%, ratio of solvent to raw material 30, extraction temperature 75 °C and extraction time 1.5 h, while the extraction yield of FIBL was 5.94%. FIBL treatment (50, 100, and 150 mg/kg body weight) for 28 days resulte...

  11. Image Segmentation Parameter Optimization Considering Within- and Between-Segment Heterogeneity at Multiple Scale Levels: Test Case for Mapping Residential Areas Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    Brian A. Johnson

    2015-10-01

    Full Text Available Multi-scale/multi-level geographic object-based image analysis (MS-GEOBIA) methods are becoming widely-used in remote sensing because single-scale/single-level (SS-GEOBIA) methods are often unable to obtain an accurate segmentation and classification of all land use/land cover (LULC) types in an image. However, there have been few comparisons between SS-GEOBIA and MS-GEOBIA approaches for the purpose of mapping a specific LULC type, so it is not well understood which is more appropriate for this task. In addition, there are few methods for automating the selection of segmentation parameters for MS-GEOBIA, while manual selection (i.e., trial-and-error selection of parameters) can be quite challenging and time-consuming. In this study, we examined SS-GEOBIA and MS-GEOBIA approaches for extracting residential areas in Landsat 8 imagery, and compared naïve and parameter-optimized segmentation approaches to assess whether unsupervised segmentation parameter optimization (USPO) could improve the extraction of residential areas. Our main findings were: (i) the MS-GEOBIA approaches achieved higher classification accuracies than the SS-GEOBIA approach, and (ii) USPO resulted in more accurate MS-GEOBIA classification results while reducing the number of segmentation levels and classification variables considerably.
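    The USPO idea in the abstract above, scoring candidate segmentation parameters by trading within-segment homogeneity against between-segment separability, can be sketched on a 1-D toy signal. The segmenter, the scoring formulation and the data below are all invented for illustration; real USPO tools work on 2-D imagery with measures such as weighted variance and Moran's I:

```python
# Candidate "scale" parameters each yield a segmentation of a 1-D signal;
# we keep the parameter with the lowest combined score: within-segment
# variance (undersegmentation) plus a reward for dissimilar adjacent
# segment means (penalizing oversegmentation).
signal = [1, 1, 2, 1, 9, 9, 8, 9, 3, 3, 2, 3]

def segment(values, scale):
    """Hypothetical segmenter: start a new segment when the jump exceeds 'scale'."""
    segs, cur = [], [values[0]]
    for v in values[1:]:
        if abs(v - cur[-1]) > scale:
            segs.append(cur)
            cur = []
        cur.append(v)
    segs.append(cur)
    return segs

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def score(segs):
    n = sum(len(s) for s in segs)
    within = sum(len(s) * variance(s) for s in segs) / n  # intra-segment heterogeneity
    means = [sum(s) / len(s) for s in segs]
    if len(means) > 1:
        # negative mean difference of adjacent segment means: lower is better
        between = -sum(abs(a - b) for a, b in zip(means, means[1:])) / (len(means) - 1)
    else:
        between = 0.0  # a single segment: maximally undersegmented
    return within + between

best_scale = min([0.5, 2, 5, 20], key=lambda s: score(segment(signal, s)))
```

On this toy signal the intermediate scale recovers the three natural plateaus, while very small scales oversegment and very large scales merge everything.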

  12. Optimization of Bending Process Parameters for Seamless Tubes Using Taguchi Method and Finite Element Method

    Directory of Open Access Journals (Sweden)

    Jui-Chang Lin

    2015-01-01

    Full Text Available Three-dimensional tubes (or pipes) are manufactured by CNC tube bending machines. The key techniques are determined by tube diameter, wall thickness, material, and bending radius. Techniques obtained through experience and trial and error are unreliable. Finite element method (FEM) simulation of the tube bending process before production can avoid wasting manpower and raw materials. The computer-aided engineering (CAE) software ABAQUS 6.12 is applied to simulate bending characteristics and to explore the maximum stress and strain conditions. The Taguchi method is used to find the optimal bending parameters. A confirmation experiment is performed according to the optimal parameters. Results indicate that the strain error between the CAE simulation and the bending experiments is within 6.39%.

  13. Parameter Optimization for Enhancement of Ethanol Yield by Atmospheric Pressure DBD-Treated Saccharomyces cerevisiae

    International Nuclear Information System (INIS)

    In this study, Saccharomyces cerevisiae (S. cerevisiae) was exposed to dielectric barrier discharge plasma (DBD) to improve its ethanol production capacity during fermentation. Response surface methodology (RSM) was used to optimize the discharge-associated parameters of DBD for the purpose of maximizing the ethanol yield achieved by DBD-treated S. cerevisiae. According to single factor experiments, a mathematical model was established using Box-Behnken central composite experiment design, with plasma exposure time, power supply voltage, and exposed-sample volume as impact factors and ethanol yield as the response. This was followed by response surface analysis. Optimal experimental parameters for plasma discharge-induced enhancement in ethanol yield were plasma exposure time of 1 min, power voltage of 26 V, and an exposed sample volume of 9 mL. Under these conditions, the resulting yield of ethanol was 0.48 g/g, representing an increase of 33% over control. (plasma technology)
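    The response-surface step in the abstract above, fitting a quadratic model to designed experiments and reading off its stationary point, can be illustrated in one factor. The numbers below are invented (they merely echo the 1 min exposure-time optimum); a real Box-Behnken analysis fits a multi-factor quadratic by least squares:

```python
# One-factor response-surface sketch: fit a quadratic y = a + b*x + c*x^2
# through three design points and take its stationary point as the predicted
# optimum. Data are illustrative (exposure time in min vs. ethanol yield, g/g).
points = [(0.5, 0.40), (1.0, 0.48), (2.0, 0.41)]

def quadratic_through(p0, p1, p2):
    """Coefficients (a, b, c) of the quadratic through three points (Lagrange form)."""
    def basis(xi, xj, xk):
        # expands L_i(x) = (x-xj)(x-xk) / ((xi-xj)(xi-xk)) into monomials
        d = (xi - xj) * (xi - xk)
        return (xj * xk / d, -(xj + xk) / d, 1.0 / d)
    coeffs = [0.0, 0.0, 0.0]
    for (xi, yi), (xj, _), (xk, _) in [(p0, p1, p2), (p1, p0, p2), (p2, p0, p1)]:
        for idx, m in enumerate(basis(xi, xj, xk)):
            coeffs[idx] += yi * m
    return tuple(coeffs)

a, b, c = quadratic_through(*points)
x_opt = -b / (2 * c)   # stationary point of the fitted surface
assert c < 0           # concave fit: the stationary point is a maximum
```

The fitted optimum lands between the two best design points, which is exactly the kind of interpolated optimum RSM reports.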

  14. Experimental and numerical analysis for optimal design parameters of a falling film evaporator

    Indian Academy of Sciences (India)

    RAJNEESH KAUSHAL; RAJ KUMAR; GAURAV VATS

    2016-06-01

    The present study exhibits an experimental examination of the mass transfer coefficient and evaporative effectiveness of a falling film evaporator. Further, a statistical model is developed in order to obtain the optimal controlling parameters, viz. non-dimensional enthalpy potential, film Reynolds number of cooling water, Reynolds number of air and relative humidity of up-streaming air. The models not only give an optimal solution but also help in establishing a correlation among the controlling parameters. In this context, response surface methodology is employed with the aid of a design of experiments approach. Later, the response surface curves are studied using ANOVA. Finally, the relations established are confirmed experimentally to validate the models. The relations thus established are beneficial for the design of evaporators. Additionally, the present study is among the first attempts to reveal the effect of humidity on the performance of a falling film evaporator.

  15. Optimal Experiment Design for Quantum State and Process Tomography and Hamiltonian Parameter Estimation

    CERN Document Server

    Kosut, R L; Rabitz, H; Kosut, Robert; Walmsley, Ian A.; Rabitz, Herschel

    2004-01-01

    A number of problems in quantum state and system identification are addressed. Specifically, it is shown that the maximum likelihood estimation (MLE) approach, already known to apply to quantum state tomography, is also applicable to quantum process tomography (estimating the Kraus operator sum representation (OSR)), Hamiltonian parameter estimation, and the related problems of state and process (OSR) distribution estimation. Except for Hamiltonian parameter estimation, the other MLE problems are formally of the same type of convex optimization problem and therefore can be solved very efficiently to within any desired accuracy. Associated with each of these estimation problems, and the focus of the paper, is an optimal experiment design (OED) problem invoked by the Cramer-Rao Inequality: find the number of experiments to be performed in a particular system configuration to maximize estimation accuracy; a configuration being any number of combinations of sample times, hardware settings, prepared initial states...

  16. Parameters estimation online for Lorenz system by a novel quantum-behaved particle swarm optimization

    International Nuclear Information System (INIS)

    This paper proposes a novel quantum-behaved particle swarm optimization (NQPSO) for estimating the unknown parameters of chaotic systems by transforming the estimation into a nonlinear function optimization problem. By means of techniques in three aspects: contracting the search space self-adaptively; a boundary restriction strategy; and substituting a convex combination of the particles for their centre of mass, this paper achieves a quite effective search mechanism with a fine equilibrium between exploitation and exploration. Details of applying the proposed method and other methods to Lorenz systems are given, and the experiments show that NQPSO has better adaptability, dependability and robustness. It is a successful approach to online unknown parameter estimation, especially in cases with white noise.
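    The recasting of parameter estimation as function optimization described above can be sketched with a plain PSO (not the NQPSO variant) on a logistic map instead of the Lorenz system; the map, swarm constants and true parameter below are all illustrative:

```python
import random

random.seed(0)

def simulate(r, x0=0.2, n=30):
    """Trajectory of the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

TRUE_R = 2.9
observed = simulate(TRUE_R)

def cost(r):
    # squared trajectory mismatch: the nonlinear function the swarm minimizes
    return sum((a - b) ** 2 for a, b in zip(simulate(r), observed))

# A minimal particle swarm over the scalar parameter r in [2.5, 4.0].
lo, hi = 2.5, 4.0
pos = [random.uniform(lo, hi) for _ in range(20)]
vel = [0.0] * 20
pbest = pos[:]
gbest = min(pos, key=cost)
for _ in range(60):
    for i in range(20):
        r1, r2 = random.random(), random.random()
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
        pos[i] = min(hi, max(lo, pos[i] + vel[i]))
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i]
    gbest = min(pbest, key=cost)
# gbest now sits close to TRUE_R on this toy problem
```

The cost is zero exactly at the true parameter, which is what makes trajectory mismatch a usable objective for chaotic-system identification.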

  17. High strength Al–Al2O3p composites: Optimization of extrusion parameters

    DEFF Research Database (Denmark)

    Luan, B.F.; Hansen, Niels; Godfrey, A.;

    2011-01-01

    Composite aluminium alloys reinforced with Al2O3p particles have been produced by squeeze casting followed by hot extrusion and a precipitation hardening treatment. Good mechanical properties can be achieved, and in this paper we describe an optimization of the key processing parameters. The...... investigation of their mechanical properties and microstructure, as well as on the surface quality of the extruded samples. The evaluation shows that material with good strength, though with limited ductility, can be reliably obtained using a production route of squeeze casting, followed by hot extrusion and a...... precipitation hardening treatment. For the extrusion step optimized processing parameters have been determined as: (i) extrusion temperature=500°C–560°C; (ii) extrusion rate=5mm/s; (iii) extrusion ratio=10:1....

  18. Physiochemical parameters optimization for enhanced nisin production by Lactococcus lactis (MTCC 440

    Directory of Open Access Journals (Sweden)

    Puspadhwaja Mall

    2010-02-01

    Full Text Available The influence of various physiochemical parameters on the growth of Lactococcus lactis subsp. lactis MTCC 440 was studied at shake flask level for 20 h. Media optimization (MRS broth) was studied to achieve enhanced growth of the organism and also nisin production. Bioassay of nisin was done with the agar diffusion method using Streptococcus agalactae NCIM 2401 as the indicator strain. MRS broth (6%, w/v) with 0.15 μg/ml of nisin supplemented with 0.5% (v/v) skimmed milk was found to be the best for nisin production as well as for growth of L. lactis. The production of nisin was strongly influenced by the presence of skimmed milk and nisin in MRS broth. The production of nisin was affected by the physical parameters; maximum nisin production was at 30 °C while the optimal temperature for biomass production was 37 °C.

  19. Slot Parameter Optimization for Multiband Antenna Performance Improvement Using Intelligent Systems

    Directory of Open Access Journals (Sweden)

    Erdem Demircioglu

    2015-01-01

    Full Text Available This paper discusses bandwidth enhancement for multiband microstrip patch antennas (MMPAs) using symmetrical rectangular/square slots etched on the patch and the substrate properties. The slot parameters on the MMPA are modeled using the soft computing technique of artificial neural networks (ANN). To achieve the best ANN performance, Particle Swarm Optimization (PSO) and Differential Evolution (DE) are applied alongside the ANN's conventional training algorithm to optimize the modeling performance. In this study, the slot parameters are taken as slot distance to the radiating patch edge, slot width, and slot length. Bandwidth enhancement is applied to a formerly designed MMPA fed by a microstrip transmission line attached to the center pin of a 50 ohm SMA connector. The simulated antennas are fabricated and measured. Measurement results are utilized for training the artificial intelligence models. The ANN provides 98% model accuracy for rectangular slots and 97% for square slots; however, ANFIS offers 90% accuracy and lacks resonance frequency tracking.

  20. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    Selective laser melting is yet to become a standardized industrial manufacturing technique. The process continues to suffer from defects such as distortions, residual stresses, localized deformations and warpage caused primarily due to the localized heating, rapid cooling and high temperature...... gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths and...... process parameters for selective laser melting of a standard sample is introduced. The processing of the sample is simulated by sequentially coupling a calibrated 3D pseudo-analytical thermal model with a 3D finite element mechanical model. The optimized processing parameters are subjected to a Monte Carlo...

  1. Optimization of the Geometrical Parameters of a Solar Bubble Pump for Absorption-Diffusion Cooling Systems

    Directory of Open Access Journals (Sweden)

    N. Dammak

    2010-01-01

    Full Text Available Problem statement: The objective of this study was to optimize the geometrical parameters of a bubble pump integrated in a solar flat plate collector. Approach: This solar bubble pump was part of an ammonia/water/helium (NH3/H2O/He) absorption-diffusion cooling system. Results: An empirical model was developed on the basis of momentum, mass, material equations and energy balances. The mathematical model was solved using the simulation tool "Engineering Equation Solver (EES)". Conclusion/Recommendations: Using meteorological data from Gabes (Tunisia), various parameters were geometrically optimized for maximum bubble pump efficiency, which was best for a bubble pump tube diameter of 6 mm, a tube length of 1.5 m, an inclination to the horizontal between 30 and 50° of the solar flat plate collector and a submergence ratio between 0.2 and 0.3.

  2. Performance Evaluation and Parameter Optimization of SoftCast Wireless Video Broadcast

    Directory of Open Access Journals (Sweden)

    Dongxue Yang

    2015-08-01

    Full Text Available Wireless video broadcast plays an important role in multimedia communication with the emergence of mobile video applications. However, conventional video broadcast designs suffer from a cliff effect due to separated source and channel encoding. The newly proposed SoftCast scheme employs a cross-layer design, whose reconstructed video quality is proportional to the channel condition. In this paper, we provide the performance evaluation and the parameter optimization of the SoftCast system. Optimization principles on parameter selection are suggested to obtain a better video quality, occupy less bandwidth and/or utilize lower complexity. In addition, we compare SoftCast with H.264 in the LTE EPA scenario. The simulation results show that SoftCast provides a better performance in the scalability to channel conditions and the robustness to packet losses.

  3. Fully automated molecular biology routines on a plasmid-based functional proteomic workcell: Evaluation and Characterization of Yeast Strains Optimized for Growth on Xylose Expressing "Stealth" Insecticidal Peptides.

    Science.gov (United States)

    Optimization of genes important to production of fuel ethanol from hemicellulosic biomass for use in developing improved commercial yeast strains is necessary to meet the rapidly expanding need for ethanol. The United States Department of Agriculture has developed a fully automated platform for mol...

  4. Fully automated molecular biology: Plasmid-Based Functional Proteomic Workcell Evaluation and Characterization of Yeast Strains with Optimized "Trojan Horse" Amino Acid Scanning Mutational Inserts.

    Science.gov (United States)

    The optimization of various genes is important in cellulosic fuel ethanol production from S. cerevisiae to meet the rapidly expanding need for ethanol derived from hemicellulosic materials. The United States Department of Agriculture has developed a fully automated platform for molecular biology ro...

  5. Model Predictive Optimal Control of a Time-Delay Distributed-Parameter Systems

    Science.gov (United States)

    Nguyen, Nhan

    2006-01-01

    This paper presents an optimal control method for a class of distributed-parameter systems governed by first order, quasilinear hyperbolic partial differential equations that arise in many physical systems. Such systems are characterized by time delays since information is transported from one state to another by wave propagation. A general closed-loop hyperbolic transport model is controlled by a boundary control embedded in a periodic boundary condition. The boundary control is subject to a nonlinear differential equation constraint that models actuator dynamics of the system. The hyperbolic equation is thus coupled with the ordinary differential equation via the boundary condition. Optimality of this coupled system is investigated using variational principles to seek an adjoint formulation of the optimal control problem. The results are then applied to implement a model predictive control design for a wind tunnel to eliminate a transport delay effect that causes a poor Mach number regulation.

  6. Prediction Model of Battery State of Charge and Control Parameter Optimization for Electric Vehicle

    OpenAIRE

    Bambang Wahono; Kristian Ismail; Harutoshi Ogai

    2015-01-01

    This paper presents the construction of a battery state of charge (SOC) prediction model and the optimization method of the said model to appropriately control the number of parameters in compliance with the SOC as the battery output objectives. Research Centre for Electrical Power and Mechatronics, Indonesian Institute of Sciences has tested its electric vehicle research prototype on the road, monitoring its voltage, current, temperature, time, vehicle velocity, motor speed, and SOC during t...

  7. Optimal parameters for laccase-mediated destaining of Coomassie Brilliant Blue R-250-stained polyacrylamide gels

    OpenAIRE

    Yang, Jie; Yang, Xiaodan; Ye, Xiuyun; Lin, Juan

    2016-01-01

    The data presented in this article are related to the research article entitled “Destaining of Coomassie Brilliant Blue R-250-stained polyacrylamide gels with fungal laccase” [1]. Laccase is a class of multicopper oxidases that can catalyze oxidation of recalcitrant dyestuffs. This article describes optimal parameters for destaining of polyacrylamide gels, stained with Coomassie Brilliant Blue R-250, with laccase from basidiomycete Cerrena sp. strain HYB07. Effects of laccase activity, mediat...

  8. A review paper on Optimization of process parameter of EDM for air hardening tool steel

    OpenAIRE

    Tarun Modi; Shaileshbhai sanawada

    2015-01-01

    EDM is now a more economical non-conventional machining process. It is widely used in small-scale as well as major industries. The EDM process is affected by many process parameters, both electrical and non-electrical. In this project work the rotating tool is used to improve the metal removal rate (MRR) and to observe its effect on surface finish. I am using Taguchi's method as the design of experiments and response surface methodology for analysis and optimization. The machining p...

  9. HIGH HYDROSTATIC PRESSURE EXTRACTION OF ANTIOXIDANTS FROM MORINDA CITRIFOLIA FRUIT – PROCESS PARAMETERS OPTIMIZATION

    OpenAIRE

    PRAVEEN KUMAR; CHRIS CHU; DUDUKU KRISHNAIAH; AWANG BONO

    2006-01-01

    A modified version of high hydrostatic pressure extraction has been performed for extraction of antioxidants from M. citrifolia fruit at 5, 15 and 25 bar and temperatures from 30 °C to 70 °C for durations of 1, 2, 4 and 6 hours. The antioxidant activity of the extracts was determined by the di-phenylpicrylhydrazyl radical scavenging method. The process parameters were optimized for antioxidant activity by the central composite design method of response surface methodology using the statistical package, design e...

  10. Optimization of ridge parameters in multivariate generalized ridge regression by plug-in methods

    OpenAIRE

    Nagai, Isamu; Yanagihara, Hirokazu; Satoh, Kenichi

    2012-01-01

    Generalized ridge (GR) regression for a univariate linear model was proposed simultaneously with ridge regression by Hoerl and Kennard (1970). In this paper, we deal with a GR regression for a multivariate linear model, referred to as a multivariate GR (MGR) regression. From the viewpoint of reducing the mean squared error (MSE) of a predicted value, many authors have proposed several GR estimators consisting of ridge parameters optimized by non-iterative methods. By expanding...
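    The non-iterative plug-in optimization of ridge parameters referred to above is easiest to see in the canonical (orthogonalized) setting, where each coefficient is shrunk separately. The numbers below are invented and the setting is the simplest univariate one, not the multivariate MGR case of the paper:

```python
# Generalized ridge in canonical coordinates: with an orthonormal design,
# each coefficient is shrunk as beta_i = z_i / (1 + k_i), where z_i is the
# OLS estimate. The MSE-optimal ridge parameter is k_i = sigma^2 / beta_i^2,
# estimated non-iteratively by plugging in the OLS values for beta_i.
z = [4.0, 0.5, 2.0]   # OLS coefficients in the canonical coordinates (toy)
sigma2 = 1.0          # assumed known noise variance

k = [sigma2 / zi**2 for zi in z]                    # plug-in ridge parameters
beta_gr = [zi / (1 + ki) for zi, ki in zip(z, k)]   # generalized ridge estimates
```

Note how the small OLS coefficient (0.5) is shrunk hardest, which is where generalized ridge gains its MSE advantage over a single shared ridge parameter.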

  11. Effect of calibration data length on performance and optimal parameters of hydrological model

    OpenAIRE

    Chuan-Zhe LI; Hao Wang; Liu, Jia; Deng-hua YAN; Fu-liang YU; Zhang, Lu

    2010-01-01

    In order to assess the effects of calibration data length on the performance and optimal parameter values of hydrological model in ungauged or data limited catchments (actually, data are non-continuous and fragmental in some catchments), we choose to use non-continuous calibration periods to have more independent streamflow data for SIMHYD model calibration. Nash-Sutcliffe efficiency (NSE) and percentage water balance error (WBE) are used as performance measures. The Particle Swarm Optimizati...

  12. Optimization of operational parameters and bath control for electrodeposition of Ni-Mo-B amorphous alloys

    OpenAIRE

    Marinho Fabiano A.; Santana François S. M.; Vasconcelos André L. S.; Santana Renato A. C.; Prasad Shiva

    2002-01-01

    Optimization of the operational parameters of an electrodeposition process for deposition of a boron-containing amorphous metallic layer of nickel-molybdenum alloy onto a cathode, from an electrolytic bath containing nickel sulfate, sodium molybdate, boron phosphate, sodium citrate, sodium-1-dodecylsulfate and ammonia for pH adjustment to 9.5, has been studied. Detailed studies of the effects of bath temperature, mechanical agitation, cathode current density and anode format have led to optimum operation...

  13. Optimization Of Blasting Design Parameters On Open Pit Bench A Case Study Of Nchanga Open Pits

    OpenAIRE

    Victor Mwango Bowa

    2015-01-01

    Abstract In hard rock mining, blasting is the most productive excavation technique applied to fragment in-situ rock to the required size for efficient loading and crushing. In order to blast the in-situ rock to the desired fragment size, blast design parameters such as bench height, hole diameter, spacing, burden, hole length, bottom charge, specific charge and rock factor are considered. The research was carried out as a practical method on the Nchanga Open Pits (NOP) ore bench to optimize the blasting desig...

  14. Limit sets and switching strategies in parameter-optimal iterative learning control

    OpenAIRE

    Owens, D.H.; Tomas-Rodriguez, M; Daley, S

    2007-01-01

    This paper characterizes the existence and form of the possible limit error signals in typical parameter-optimal Iterative Learning Control. The set of limit errors has attracting and repelling components, and the behaviour of the algorithm in the vicinity of these sets can be associated with the undesirable properties of apparent (but in fact temporary) convergence or permanently slow convergence in practice. The avoidance of these behaviours in practice is investigated using nove...

  15. Structural Damage Detection Based on Modal Parameters Using Continuous Ant Colony Optimization

    OpenAIRE

    Aditi Majumdar; Bharadwaj Nanda; Dipak Kumar Maiti; Damodar Maity

    2014-01-01

    A method is presented to detect and quantify structural damage from changes in modal parameters (such as natural frequencies and mode shapes). An inverse problem is formulated to minimize an objective function, defined in terms of the discrepancy between the vibration data identified by modal testing and those computed from the analytical model, which is then solved to locate and assess the structural damage using a continuous ant colony optimization algorithm. The damage is formulated as stiffness redu...
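    The inverse formulation above can be sketched on a two-degree-of-freedom spring-mass chain, with a plain random search standing in for the continuous ant colony optimizer; the stiffnesses, unit masses and "measured" damage are all invented:

```python
import math
import random

random.seed(1)

# Damage is modelled as stiffness reduction factors d1, d2 in [0, 0.9); the
# objective is the discrepancy between "measured" and model natural frequencies.
def frequencies(d1, d2, k1=100.0, k2=100.0):
    """Natural frequencies of a 2-DOF chain with unit masses and damaged springs."""
    a, b = k1 * (1 - d1), k2 * (1 - d2)   # damaged stiffnesses
    tr, det = a + 2 * b, a * b            # trace/determinant of K (mass matrix = I)
    disc = math.sqrt(tr * tr - 4 * det)
    return (math.sqrt((tr - disc) / 2), math.sqrt((tr + disc) / 2))

measured = frequencies(0.30, 0.00)        # "true" damage: 30% loss in spring 1

def objective(d):
    return sum((f - g) ** 2 for f, g in zip(frequencies(*d), measured))

# Random search over the damage factors (an ACO/PSO would search this space
# more cleverly, but the objective is the same).
best = min(
    ([random.uniform(0, 0.9), random.uniform(0, 0.9)] for _ in range(20000)),
    key=objective,
)
```

With two measured frequencies and two unknown reduction factors the inverse problem is (generically) identifiable, so the search recovers both the location and the extent of the damage.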

  16. Parameters Optimization of Curtain Grouting Reinforcement Cycle in Yonglian Tunnel and Its Application

    OpenAIRE

    Qingsong Zhang; Peng Li; Gang Wang; Shucai Li; Xiao Zhang; Qianqing Zhang; Qian Wang; Jianguo Liu

    2015-01-01

    For practical purposes, the curtain grouting method is an effective method to treat geological disasters and can be used to improve the strength and permeability resistance of surrounding rock. Selection of the optimal parameters of the grouting reinforcement cycle, especially the reinforcement cycle thickness, is one of the most interesting areas of research in curtain grouting design. Based on fluid-structure interaction theory and the orthogonal analysis method, the influence of reinforcement cycle...

  17. Numerical Methods for Parameter Estimation and Optimal Control of the Red River Network

    OpenAIRE

    Thai, Tran Hong

    2005-01-01

    In this thesis efficient numerical methods for the simulation, the parameter estimation, and the optimal control of the Red River system are presented. The model of the Red River system is based on the Saint-Venant equation system, which consists of two nonlinear first-order hyperbolic Partial Differential Equations (PDE) in space and in time. In general a system of equations of this type can not be solved analytically. Therefore I choose a numerical approach, namely the Method Of Lines (...

  18. Optimization of the Effective Parameters on Hydraulic Fracturing Designing in an Iranian Sand Stone Reservoir

    OpenAIRE

    Reza Masoomi; Iniko Bassey; Dolgow Sergie Viktorovich

    2015-01-01

    Hydraulic fracturing is one of the key technologies for stimulating oil and gas wells in sandstone reservoirs. Field data relating to hydraulic fracturing operations are mostly available as pressure-time curves, and the optimization of the hydraulic fracturing parameters is not possible with only this information. So the design and control of the hydraulic fracturing development process are possible only by relying on complex mathematical and numerical models. The aim of...

  19. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    OpenAIRE

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; van Etten, Wim; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques. As an application to situations that are analytically intractable and numerically intensive, the influence...
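    The importance-sampling idea behind the abstract above (rare-event, BER-style estimation) can be sketched by estimating a Gaussian tail probability with a mean-shifted sampling density; the adaptive recursions of the paper tune this shift, which is simply fixed here:

```python
import math
import random

random.seed(7)

# Estimate p = P(N > t) for standard Gaussian noise by sampling from N(shift, 1)
# and reweighting each hit by the likelihood ratio f(x)/g(x). Toy example with
# the shift fixed at t (close to the variance-minimizing choice for this tail).
t = 4.0
n = 20000
shift = t

def likelihood_ratio(x):
    # density of N(0,1) over density of N(shift,1): exp(-x*shift + shift^2/2)
    return math.exp(-x * shift + shift * shift / 2.0)

samples = [random.gauss(shift, 1.0) for _ in range(n)]
p_is = sum(likelihood_ratio(x) for x in samples if x > t) / n

# exact tail probability for comparison
p_exact = 0.5 * math.erfc(t / math.sqrt(2.0))
```

A crude Monte Carlo estimate of this probability (about 3e-5) would need millions of samples; the shifted sampler lands most draws in the rare region, which is why importance sampling is the standard tool for BER estimation.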

  20. The Distribution Population-based Genetic Algorithm for Parameter Optimization PID Controller

    Institute of Scientific and Technical Information of China (English)

    CHEN Qing-Geng; WANG Ning; HUANG Shao-Feng

    2005-01-01

    Enlightened by the distribution of creatures in the natural ecological environment, the distribution population-based genetic algorithm (DPGA) is presented in this paper. The searching capability of the algorithm is improved by competition between distribution populations to reduce the search zone. This method is applied to the design of optimal parameters of PID controllers with examples, and the simulation results show that satisfactory performances are obtained.
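    The GA-based PID tuning above can be sketched with a bare-bones genetic algorithm (not the DPGA itself) evolving PI gains against a first-order plant; the plant model, GA constants and cost function are all invented for illustration:

```python
import random

random.seed(3)

def iae(kp, ki, dt=0.05, steps=200):
    """Integrated absolute error of a unit-step response under PI control
    of the toy plant dy/dt = -y + u (forward-Euler discretization)."""
    y, integ, err_sum = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-y + u)
        err_sum += abs(e) * dt
    return err_sum

# Minimal GA: elitist selection, blend crossover, Gaussian mutation.
pop = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(30)]
for _ in range(40):
    pop.sort(key=lambda g: iae(*g))
    elite = pop[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(elite, 2)
        child = [(x + y) / 2 + random.gauss(0, 0.3) for x, y in zip(a, b)]
        children.append([max(0.0, v) for v in child])
    pop = elite + children

best_kp, best_ki = min(pop, key=lambda g: iae(*g))
```

Selection pressure on the IAE cost pushes the population toward high but stable gains, the same mechanism the DPGA exploits with its competing sub-populations.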